Nvidia uses AI to recreate Pac-Man on its 40th anniversary • Eurogamer.net

Nvidia revealed today that it has created a neural network that fully simulates the classic Pac-Man, marking the Namco coin-op’s 40th anniversary. On the face of it, this might not sound like a big deal – Pac-Man is a relatively straightforward game that takes place in a simple, static environment, so using an AI to study its rules and perfectly replicate its game logic doesn’t sound outlandishly complex. Except that isn’t what’s happening. There is no engine here, no game logic and no traditional rasteriser in the AI recreation of the game. Instead, every pixel of every frame comes directly from the neural network, based on what it ‘knows’ about how Pac-Man works. This rendition of Pac-Man essentially plays out as an AI ‘thinks’ it should – and remarkably, it works.

Nvidia is working on something it refers to as GameGAN (GAN meaning ‘generative adversarial network’). It pits two neural networks against one another – a generator that produces candidate output, and a discriminator that judges whether that output looks authentic. It’s the same kind of AI that has been used extensively for a range of applications, including the creation of AI-generated high-resolution texture packs for retro games.
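To get a feel for the generator/discriminator dynamic, here’s a deliberately tiny sketch – a toy illustration of the adversarial roles, not Nvidia’s implementation. The distributions, threshold and names are all invented; a real GAN trains both networks against each other rather than using a fixed classifier.

```python
import random

random.seed(42)

def real_sample():
    # 'Real' data for the toy problem: numbers drawn around 1.0.
    return random.gauss(1.0, 0.05)

def generator(theta):
    # Generator: produces data meant to pass for real; theta is its
    # single learnable parameter (the mean of its output).
    return random.gauss(theta, 0.05)

def discriminator(x):
    # Discriminator: a fixed toy classifier that labels anything above
    # 0.5 as 'real'. In a true GAN this is itself a trained network,
    # updated in lockstep with the generator.
    return x > 0.5

def evaluate(theta, n=1000):
    reals = [real_sample() for _ in range(n)]
    fakes = [generator(theta) for _ in range(n)]
    # Discriminator accuracy: reals called real, fakes called fake.
    d_accuracy = (sum(discriminator(x) for x in reals)
                  + sum(not discriminator(x) for x in fakes)) / (2 * n)
    # Fool rate: fraction of fakes the discriminator accepts as real.
    fool_rate = sum(discriminator(x) for x in fakes) / n
    return d_accuracy, fool_rate

# An untrained generator (theta = 0.0) is caught every time...
acc_before, fool_before = evaluate(0.0)
# ...while one whose output matches the real data (theta = 1.0) fools
# the discriminator, dragging its accuracy down towards chance.
acc_after, fool_after = evaluate(1.0)
print(acc_before, fool_before, acc_after, fool_after)
```

The generator ‘wins’ when the discriminator can no longer do better than a coin flip – which is the point at which generated output is indistinguishable from the real thing.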

In the case of Nvidia’s GameGAN, the AI studied 50,000 games of Pac-Man before ‘learning’ how the game works generally, and how user input affects what happens on-screen. “This is the first research to emulate a game engine using GAN-based neural networks,” says Seung-Wook Kim, an Nvidia researcher and lead author on the project. “We wanted to see whether the AI could learn the rules of an environment just by looking at the screenplay of an agent moving through the game. And it did.”


Nvidia’s trailer for its AI recreation of the classic Pac-Man.

Rather than having a human player indulge in 50,000 rounds of Pac-Man, Nvidia trained up a second AI to play through the game instead, giving GameGAN the data it needed to create its own rendition. This did present some issues, however. The player AI quickly learned the rules of Pac-Man and could consistently beat the game without dying, meaning that GameGAN was missing some crucial data points – an issue that had to be corrected. By the end of the process, though, GameGAN possessed a neural network that knew how Pac-Man operated, what it looked like, how it responded to user inputs and how the various ghosts exhibited different behaviour patterns – all of which it could replicate in its own version.
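For a sense of what such an agent run might record, here’s a minimal, hypothetical sketch. The (frame, action) pairing follows the article’s description of the training data; every name below is invented for illustration, and the real system would record actual pixel buffers rather than strings.

```python
import random

random.seed(0)

# Actions available to the stand-in 'player AI'.
ACTIONS = ["up", "down", "left", "right"]

def play_episode(policy, steps=10):
    # Log one (frame, action) pair per step of play.
    trajectory = []
    frame = "frame_0"                  # stand-in for a pixel buffer
    for t in range(steps):
        action = policy(frame)
        trajectory.append((frame, action))
        frame = f"frame_{t + 1}"       # stand-in for the next frame
    return trajectory

# A random policy as the stand-in player; a competent agent that never
# dies would leave failure states out of the log entirely.
log = play_episode(lambda frame: random.choice(ACTIONS))
print(len(log))  # 10 recorded (frame, action) pairs
```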

Running this neural network sets the game in motion, with each frame generated from the AI’s knowledge of the game – down to each individual pixel. Although some small rendering errors can creep in owing to erroneous inference, the AI recreates a Pac-Man that allegedly runs just like the original game. The neural network plays out in real time: in a conference call with the engineers, we were told that new frames are generated every 20ms, which translates to 50fps.
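As a quick sanity check on that conversion, the arithmetic is simply:

```python
# One new frame every 20 ms, per the figure given on the call.
frame_time_ms = 20
fps = 1000 / frame_time_ms  # 1000 ms in a second / 20 ms per frame
print(fps)  # 50.0
```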

Beyond recreating Pac-Man, Nvidia’s research wing obviously has big plans for AI. It says that GameGAN can study the same game running across different levels and then start to produce its own stages – potentially saving valuable time for developers. “We could eventually have an AI that can learn to mimic the rules of driving, the laws of physics, just by watching videos and seeing agents take actions in an environment. GameGAN is the first step toward that,” says Sanja Fidler, director of Nvidia’s Toronto research lab. The applications beyond gaming are virtually limitless, and Nvidia’s investment in AI for fully autonomous self-driving vehicles is well known.

So just how good is the GameGAN AI? To what extent is Pac-Man properly recreated via Nvidia’s neural network? The proof of the pudding is obviously in the tasting, with the firm planning to release its Pac-Man AI later this year as part of its AI playground showcase. I’m really looking forward to testing that out.
