
Race the Sony Gran Turismo AI That Beat the Best Human Players

GT Sophy, trained on 20 PS5 consoles running for two weeks straight, arrives on Monday.

A human racer chases two AI-driven race cars, Sophy Violette and Sophy Verte, in Sony's Gran Turismo 7 video game. Four AI cars with varying abilities challenge a human racer in the game.
Sony; GIF by Stephen Shankland/CNET

In a few hours, Sony will let all Gran Turismo players take on GT Sophy, the AI opponent that defeated the world's best players of the car racing video game.

A test period of what Sony calls GT Sophy Race Together mode begins at 10 p.m. PT on Monday (1 a.m. ET Tuesday) with a free update to Gran Turismo 7, the most recent version of the venerable PlayStation game. It'll let you race a quartet of GT Sophy AI opponents on four of GT7's dozens of courses through the end of March.

In short, Sony is leveling up its computer opponent, a move that should give human players a much better, more challenging game, said Peter Wurman, director of Sony AI America and leader of the GT Sophy project. The four tracks span a range of difficulty levels, and you can play one-on-one against GT Sophy in identical cars for the top challenge.

"We hope that this gives them a much more realistic driving racing experience so they have competitors that feel more human-like all the way up the skill levels," Wurman said. The standard GT7 computer opponent tops out at mid-level skills, but GT Sophy goes further without requiring players to enter the "wild West" of online play to find good human opponents, he said. 

Even though it's only a one-month test, it's a big deal in the world of gaming -- and maybe beyond. Sony ultimately hopes to add AI opponents not just to Gran Turismo but to other video games. And the same qualities that make the AI's skills work in a car racing video game could mean you'll encounter the technology in real life.

"It's an example of how some of these technologies can help empower humans," said Lareina Yee, a senior partner at consulting firm McKinsey. So perhaps forklift operators or farmers will learn from the bots. "These technologies can help accelerate training, especially where specific skills and expertise are required," she said.

AI is hot right now, with OpenAI's ChatGPT showing a new level of understanding and creativity, Microsoft building the chatbot technology into its Bing search engine and Google cooking up its own competitor, Bard. Those employ a technology called a large language model -- one that's produced embarrassing gaffes as well as impressive utility -- but Sony's AI uses a different approach called reinforcement learning.

A different AI technique than ChatGPT's

Reinforcement learning, like most AI techniques today, uses a foundation called a neural network inspired by the human brain. A training phase "teaches" the neural net to recognize patterns, then an inference phase uses that network to make decisions, like how fast a car should go around a corner.
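The train-then-infer split can be sketched in miniature. This is an illustrative toy, not Sony's code: a tiny linear "policy" is fit to example corner data in a training phase, then the inference phase uses the fitted model to pick a speed. The sample numbers and function names are invented for the example; GT Sophy's real network is vastly larger.

```python
# Toy illustration of the training/inference split described above.
# All values and names here are hypothetical.

def train(samples, lr=0.1, epochs=500):
    """Training phase: fit speed = w * curvature + b to example corners."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for curvature, speed in samples:
            pred = w * curvature + b
            err = pred - speed
            w -= lr * err * curvature  # nudge weights toward the target
            b -= lr * err
    return w, b

def infer(model, curvature):
    """Inference phase: decide how fast to take a corner."""
    w, b = model
    return w * curvature + b

# Example data: sharper corners (higher curvature) demand lower speed.
laps = [(0.1, 180.0), (0.5, 120.0), (0.9, 60.0)]
model = train(laps)
print(round(infer(model, 0.5)))  # speed chosen for a medium corner
```

The point of the split is that the expensive part (training) happens once, offline, while the cheap part (inference) runs every fraction of a second during a race.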

An AI-driven race car, Sophy Violette, chases another bot, Sophy Verte, in Sony's Gran Turismo 7 video game. Four AI cars with varying abilities challenge a human racer in the game.

Sony; Screenshot by Stephen Shankland/CNET

Sony trained its GT Sophy by pitting its AIs against each other on 20 PlayStations running around the clock, Wurman said. The bots had control over acceleration, braking, and steering, just like a human player. Instead of using the handheld controller or steering wheel accessory that human racers hold, the bots used a computer interface that fed control data into the GT7 game 10 times a second.
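A fixed-rate control loop of the kind described can be sketched as follows. The class and function names are illustrative assumptions, not Sony's internal interface: the agent reads the game state, the policy picks throttle, brake and steering values, and a new action is emitted ten times a second.

```python
import time

# Hypothetical sketch of a 10 Hz control loop like the one described.
# ControlStep, run_agent and their signatures are invented for illustration.

class ControlStep:
    def __init__(self, throttle, brake, steering):
        self.throttle = throttle  # 0.0 to 1.0
        self.brake = brake        # 0.0 to 1.0
        self.steering = steering  # -1.0 (full left) to 1.0 (full right)

def run_agent(policy, read_game_state, steps, hz=10):
    """Emit one control action per tick at the given rate."""
    interval = 1.0 / hz
    for _ in range(steps):
        state = read_game_state()   # observe the car's situation
        yield policy(state)         # the trained network picks controls
        time.sleep(interval)        # wait for the next tick

# Drive straight ahead for three ticks with a trivial policy.
actions = list(run_agent(lambda state: ControlStep(1.0, 0.0, 0.0),
                         lambda: {}, steps=3))
print(len(actions))  # 3
```

Note the contrast with a human player: the bot gets the same three controls, just delivered through code instead of a wheel or gamepad.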

Reinforcement learning then handed out rewards to a bot for doing the right thing, like completing a lap or passing an opponent, Wurman said. Punishments discouraged other actions, like running into walls or colliding with other cars.
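The reward-and-punishment scheme amounts to a scoring function over in-game events. The event names and point values below are illustrative guesses in the spirit of the description, not Sony's actual reward design:

```python
# Hypothetical reward signal: positive points for progress,
# penalties for contact. Values are invented for illustration.

REWARDS = {
    "lap_completed": 100.0,
    "opponent_passed": 25.0,
    "wall_contact": -50.0,
    "car_collision": -75.0,
}

def score_step(events):
    """Sum the reward signal for one control step."""
    return sum(REWARDS.get(event, 0.0) for event in events)

# A step where the bot passes a rival but clips a wall nets 25 - 50,
# teaching it that a clean pass beats a risky one.
print(score_step(["opponent_passed", "wall_contact"]))  # -25.0
```

Over millions of such steps, the training process adjusts the network so that high-scoring behavior becomes more likely and penalized behavior fades.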

This reinforcement learning technique is what let Google subsidiary DeepMind conquer all 57 of Atari's classic video games and, later, outplay humans in the more challenging StarCraft II real-time strategy game.

Not just academic

DeepMind researchers, who also built AIs that beat humans at the game of Go and tackled the famously difficult computing problem of predicting how strings of molecules fold into proteins, have produced impressive academic research. But the GT Sophy technology not only claimed a prestigious place on the cover of the scientific journal Nature in 2022 -- it's also showing up on the game console in your living room.

For Sony, reinforcement learning means GT Sophy can learn the subtleties of the game's physics, like how aerodynamics change when a car is following another car or leading the pack, said Michael Spranger, Sony AI's chief operating officer and author of several academic papers.

"Why it's interesting as an AI breakthrough is there are layers of complexity on top. You have to engage with other drivers, and so you have to learn to overtake," he said. "If you're behind a car, your top speed gets higher, but also it takes longer to brake."

But there's a higher level still: the unwritten rules of racing. "It's a very loosely defined thing, but you will get punished if you kind of don't adhere to etiquette," Spranger said. Human players would be irritated if GT Sophy violated the norms that evolved in real-world racing, and they might simply avoid playing against it.

After a single day of training, the GT Sophy bots were better than 95% of human players. With a further 10 or 12 days of training, the bots could beat the best human GT7 players.

An AI-driven race car, Sophy Violette, passes a human-driven competitor in Sony's Gran Turismo 7 video game.

Sony; Screenshot by Stephen Shankland/CNET

GT Sophy's appearance will last only a few weeks this round, but expect the AI opponent to return at some point.

"This gives us a chance to let hundreds of thousands of players experience it and get feedback and find out how to revise it for the next time," Wurman said. "We of course have an interest in releasing this permanently to the game."

Watching a human race against GT Sophy

To show me, Wurman -- a computer science researcher who, like Spranger, holds a Ph.D. -- sat down behind a Thrustmaster steering wheel and pedal controllers, took off his shoes for better pedal sensitivity, and fired up GT7. He drove a white Acura Integra, starting in fourth place behind three differently colored GT Sophy cars and looking warily in the rear-view mirror at the best of the bots, Sophy Violette.

The cars all have the same Sophy AI driver, but Sony puts them in cars with varying abilities to offer a spectrum of challengers, Spranger said. The company is considering different driving personalities, something you'd encounter when racing against humans, but isn't willing to detail its plans there.

Wurman passed the lesser Sophy cars. But sure enough, Sophy Violette passed him. As I watched, though, Wurman passed her again and, for the first time, beat Sophy Violette. His exultant look of triumph was as real as if he'd vanquished a longtime human rival.

Editors' note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.