Sony’s Racing AI Just Beat the World’s Best Gran Turismo Drivers – Singularity Hub

Posted: February 15, 2022 at 5:06 am

Over the last several years, AIs have learned to best humans at progressively more complex games, from board games like chess and Go to computer games like Pac-Man and StarCraft II (and let's not forget poker!). Now an AI created by Sony has overtaken humans in another popular and complex game: Gran Turismo. Besides being a feat in itself, the accomplishment could have real-world implications for training self-driving cars.

For those unfamiliar, Gran Turismo is a series of racing simulation games made for Sony's PlayStation consoles. The games' creators aimed to bring as much real-world accuracy to their cars and driving as possible, from employing principles of physics to using actual recordings of car engines. "The realism of Gran Turismo comes from the detail that we put into the game," said Charles Ferreira, an engineer at Polyphony Digital, the creative studio behind Gran Turismo. "All the details about the engine, the tires, the suspension, the tracks, the car model."

Sony launched its AI division in April 2020 to do research in AI and robotics as they relate to entertainment. The division partnered with Polyphony Digital and the makers of PlayStation to develop Gran Turismo Sophy (GT Sophy), the AI that ended up beating the game's best human players. A paper detailing how the system was trained and how its technique could be applied to real-world driving was published yesterday in Nature.

Putting the pedal to the metal is one skill you need to be good at Gran Turismo (or at racing cars in real life), but speed alone doesn't separate champs from runners-up. Strategy and etiquette are important too, from knowing when to pass another car versus waiting it out, to avoiding collisions while staying as close to other vehicles as possible, to knowing where to go wide or cut in. As the paper's authors put it, drivers must "execute complex tactical maneuvers to pass or block opponents while operating their vehicles at their traction limits."

So how did an AI manage to tie these different skills together in a way that led to a winning streak?

GT Sophy was trained using deep reinforcement learning, a subfield of machine learning in which an AI system, or "agent," receives rewards for taking certain actions and is penalized for others, similar to the way humans learn through trial and error, with the goal of maximizing its cumulative rewards.
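To make the idea concrete, here is a toy reinforcement learning loop, a minimal tabular Q-learning sketch, not GT Sophy's actual deep-learning setup: an agent on a tiny one-dimensional "track" learns, through rewards and penalties alone, that moving forward beats standing still.

```python
import random

random.seed(0)  # make the toy run reproducible

# States 0..4 along a 1-D "track"; state 4 is the finish line.
# Action 0 = stay put, action 1 = move forward.
TRACK_LENGTH = 5
ACTIONS = [0, 1]
q = {(s, a): 0.0 for s in range(TRACK_LENGTH) for a in ACTIONS}

alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration rate

for episode in range(500):
    state = 0
    while state < TRACK_LENGTH - 1:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        next_state = min(state + action, TRACK_LENGTH - 1)
        # Reward reaching the finish; small penalty otherwise (time pressure).
        reward = 1.0 if next_state == TRACK_LENGTH - 1 else -0.1
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        # Q-learning update: nudge the value estimate toward reward + discounted future value.
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state

# The learned policy: the highest-valued action in each non-terminal state.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(TRACK_LENGTH - 1)]
print(policy)  # -> [1, 1, 1, 1]
```

GT Sophy replaces this lookup table with a deep neural network and a vastly richer state (speed, track geometry, nearby cars), but the trial-and-error principle is the same.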

GT Sophy's creators focused on three areas in training the agent: car control (including understanding car dynamics and racing lines), racing tactics (making quick decisions around actions like slipstream passing, crossover passes, or blocking), and racing etiquette (following sportsmanship rules like avoiding at-fault collisions and respecting opponents' driving lines).

Sony AI's engineers had to walk a fine line when creating GT Sophy's reward function; the AI had to be aggressive without being reckless, so it received rewards for fast lap times and for passing other cars while being penalized for cutting corners, colliding with a wall or another car, or skidding.
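The balance described above can be sketched as a reward function. The weights and signal names below are purely illustrative assumptions, not Sony AI's actual values: the point is just that desirable behaviors add to the reward while reckless ones subtract from it.

```python
# Hypothetical reward shaping in the spirit the article describes
# (illustrative weights only; not Sony AI's published reward function).
def lap_reward(progress_gain, cars_passed, cut_corner, hit_wall, hit_car, skidded):
    reward = 10.0 * progress_gain   # fast laps: reward progress along the track
    reward += 5.0 * cars_passed     # reward clean overtakes
    if cut_corner:
        reward -= 8.0               # penalize off-track shortcuts
    if hit_wall:
        reward -= 10.0              # penalize wall collisions
    if hit_car:
        reward -= 15.0              # penalize at-fault contact with opponents
    if skidded:
        reward -= 3.0               # penalize losing traction
    return reward

# A clean, fast stint with one overtake scores well...
print(lap_reward(1.0, 1, False, False, False, False))  # -> 15.0
# ...while a reckless lap is punished despite the pass.
print(lap_reward(1.0, 1, True, False, True, True))     # -> -11.0
```

Tuning these relative weights is exactly the "fine line" the engineers faced: make collisions too cheap and the agent rams opponents; make them too expensive and it never attempts a pass.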

Researchers fed the system data from previous Gran Turismo games, then set it loose to play, randomizing factors like starting speed, track position, and other players' skill levels for each run. GT Sophy was reportedly able to get around the track after just a few hours of training, though it took 45,000 total training hours for the AI to become a champ and beat the best human players.

"Outracing human drivers so skillfully in a head-to-head competition represents a landmark achievement for AI," said Stanford automotive professor J. Christian Gerdes, who was not involved in the research, in a Nature editorial published alongside Sony AI's paper. "GT Sophy's success on the track suggests that neural networks might one day have a larger role in the software of automated vehicles than they do today."

Though GT Sophy's racing abilities wouldn't necessarily transfer well to real cars (particularly on regular roads or highways rather than a closed racing circuit), the system's success can be seen as a step toward building AIs that understand the physics of the real world and can interact with humans. Sony's research could be especially applicable to etiquette for self-driving cars, given that these boundaries are important despite being loosely defined (for example, it's less egregious to cut someone off in a highway lane if you immediately speed up after doing so, as opposed to slowing down or maintaining your speed).

Given that self-driving cars have turned out to be a far more complex and slow-moving endeavor than initially anticipated, incorporating etiquette into their software may be low on the priority list, but it will ultimately be important for cars run by algorithms to avoid becoming the target of road rage from human drivers.

In the meantime, GT Sophy will continue refining its racing abilities, as it has plenty of room for improvement; for example, the AI consistently passes cars carrying an impending time penalty, when it would often make more sense to wait for the penalized cars to slow down instead.

Sony also says it plans to integrate GT Sophy into future Gran Turismo games, but it hasn't yet disclosed a timeline.

Image Credit: Sony AI
