Hartford Courant

AI straightens learning curve

Researchers hope some of the technology's lessons could eventually play a role in the real world

- By Matt O’Brien

Speed around a French village in the video game Gran Turismo and you might spot a Corvette behind you trying to catch your slipstream.

The technique of using the draft of an opponent’s race car to speed up and overtake them is one favored by skilled players of PlayStation’s realistic racing game.

But this Corvette driver is not being controlled by a human — it’s GT Sophy, a powerful artificial intelligence agent built by PlayStation-maker Sony.

Gran Turismo players have been competing against computer-generated race cars since the franchise launched in the 1990s, but the new AI driver that was unleashed last week on Gran Turismo 7 is smarter and faster because it’s been trained using the latest AI methods.

“Gran Turismo had a built-in AI existing from the beginning of the game, but it has a very narrow band of performance, and it isn’t very good,” said Michael Spranger, chief operating officer of Sony AI. “It’s very predictable. Once you get past a certain level, it doesn’t really entice you anymore.”

But now, he said, “this AI is going to put up a fight.”

Visit an artificial intelligence laboratory at universities and companies like Sony, Google, Meta, Microsoft and ChatGPT-maker OpenAI, and it’s not unusual to find AI agents like Sophy racing cars, slinging angry birds at pigs, fighting epic interstellar battles or helping human gamers build new Minecraft worlds — all part of the job description for computer systems trying to learn how to get smarter in games.

But in some instances, they are also trying to learn how to get smarter in the real world. In a January paper, a University of Cambridge researcher who built an AI agent to control Pokemon characters argued it could “inspire all sorts of applications that require team management under conditions of extreme uncertainty, including managing a team of doctors, robots or employees in an ever-changing environment, like a pandemic-stricken region or a war zone.”

Initially, AI was applied to games like checkers and chess as a way to test strategies for winning.

Now a new branch of research is more focused on performing open-ended tasks in complex worlds and interacting with humans, not just for the purpose of beating them.

Microsoft, which owns the popular Minecraft game franchise as well as the Xbox game system, has tasked AI agents with a variety of activities — from steering clear of lava to chopping trees and making furnaces.

Researchers hope some of their learnings could eventually play a role in real-world technology, such as how to get a home robot to take on certain chores without having to program it to do so.

Amy Hoover, an assistant professor of informatics at the New Jersey Institute of Technology who’s built algorithms for the digital card game Hearthstone, said “there really is a reason for studying games” but it is not always easy to explain.

“People aren’t always understanding that the point is about the optimization method rather than the game,” she said.

The technology behind Sophy is based on an algorithmic method known as reinforcement learning, which trains the system by rewarding it when it gets something right as it runs virtual races thousands of times.

“The reward is going to tell you that, ‘You’re making progress. This is good,’ or, ‘You’re off the track. Well, that’s not good,’ ” Spranger said.
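Sony has not published Sophy’s training code, but the reward-driven loop Spranger describes can be sketched with a toy example. Everything below — the one-dimensional “track,” the two steering actions, the +1/-1 rewards and the Q-table — is an illustrative assumption, not Sophy’s actual setup, which uses far more sophisticated deep reinforcement learning.

```python
import random

# A toy 1-D "track": the agent starts at position 0 and tries to reach
# position 5. Actions: 0 = steer safely (advance), 1 = steer off the track.
TRACK_END = 5
ACTIONS = [0, 1]

def step(pos, action):
    """Advance the toy race one tick; return (new_pos, reward, done)."""
    if action == 0:
        pos += 1
        if pos >= TRACK_END:
            return pos, 1.0, True   # reached the finish line
        return pos, 1.0, False      # "You're making progress. This is good."
    return pos, -1.0, True          # "You're off the track. That's not good."

# Tabular Q-learning: run the "race" many times, nudging each value estimate
# toward the reward received plus the discounted value of what follows.
q = {(p, a): 0.0 for p in range(TRACK_END) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(1000):
    pos, done = 0, False
    while not done:
        if random.random() < epsilon:                      # explore occasionally
            action = random.choice(ACTIONS)
        else:                                              # otherwise exploit
            action = max(ACTIONS, key=lambda a: q[(pos, a)])
        new_pos, reward, done = step(pos, action)
        future = 0.0 if done else max(q[(new_pos, a)] for a in ACTIONS)
        q[(pos, action)] += alpha * (reward + gamma * future - q[(pos, action)])
        pos = new_pos

# After training, the learned policy prefers staying on the track.
best = max(ACTIONS, key=lambda a: q[(0, a)])
print(best)  # 0: the agent has learned that staying on track pays off
```

The same principle scales up: Sophy’s rewards come from a realistic racing simulation rather than a lookup table, but the loop of act, observe reward, and adjust is the core of reinforcement learning.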

PlayStation players will only be able to race against Sophy until March 31, and on a limited number of circuits, so Sony can gather feedback and take the agent back into testing.

SONY INTERACTIVE ENTERTAINMENT: The new AI driver that was unleashed last week on Gran Turismo 7 is smarter and faster.
