AI May Soon Put You in Your Favorite Games

Entry Date
April 19, 2019

AI researchers at Facebook recently announced a method for tailoring video game characters more closely to real-life people.  The approach relies on video of a person going through basic motions; AI learns the nuances of that person's movement and then translates them to a 3D character in your game.  While many modern video games offer highly customizable characters, none let players customize how a character moves.

Video footage has been used before to establish character movement - Mortal Kombat, for example, digitized actors filmed on a sound stage.  It hasn't, however, been used to tailor characters to an individual player.

The AI system uses two different neural networks that each analyze a five- to eight-minute video of a person going through the motions involved in a game, say, playing tennis.  The first network analyzes the person's movement for the rendering engine, while the second analyzes shadows and reflections so the character can be composited onto a gameplay background.  As the technology is still new, the result isn't as smooth as current 3D game characters, but continued work will hopefully improve that in time.
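
To make the two-network split more concrete, here is a minimal, purely illustrative sketch in PyTorch: one toy network stands in for the motion analysis and another for compositing the character (with its shadows and reflections) onto a background.  The layer sizes, class names, and overall structure are assumptions for illustration only, not Facebook's actual architecture.

```python
# Illustrative sketch only: a toy two-network pipeline loosely mirroring the
# article's description (one network for the person's motion, one for
# compositing onto a game background). All sizes and names are assumptions.
import torch
import torch.nn as nn


class MotionNet(nn.Module):
    """Maps a video frame to coarse pose/motion maps for the rendering engine."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # One output channel per hypothetical pose-keypoint heatmap.
        self.head = nn.Conv2d(64, 17, kernel_size=1)

    def forward(self, frame):
        return self.head(self.encoder(frame))


class CompositeNet(nn.Module):
    """Blends the generated character with a background, shadows included."""
    def __init__(self):
        super().__init__()
        # Input: 17 pose heatmaps concatenated with a 3-channel background image.
        self.net = nn.Sequential(
            nn.Conv2d(17 + 3, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 4, kernel_size=1),  # RGB + alpha matte for blending
        )

    def forward(self, pose_maps, background):
        rgba = self.net(torch.cat([pose_maps, background], dim=1))
        rgb, alpha = rgba[:, :3], torch.sigmoid(rgba[:, 3:])
        # Alpha-blend the generated character (and its shading) over the background.
        return alpha * rgb + (1 - alpha) * background


# Toy forward pass on a single 256x256 frame.
frame = torch.randn(1, 3, 256, 256)
background = torch.randn(1, 3, 256, 256)
pose = MotionNet()(frame)
out = CompositeNet()(pose, background)
print(out.shape)  # torch.Size([1, 3, 256, 256])
```

In practice the research system is trained on the player's own footage rather than random tensors; the sketch just shows how a motion network and a compositing network could hand off to each other.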

The tech isn't ultimately limited to video games, either - once the kinks are ironed out, lifelike renderings of real people could find uses in marketing, education, media, and beyond.

This article was based on an April 19, 2019 Gizmodo article by Andrew Liszewski.

File To
Archived
Current News