The Hamilton Spectator

Plans for self-driving cars have pitfall: the human brain

- JOAN LOWY

WASHINGTON — Experts say the development of self-driving cars over the coming decade depends on an unreliable assumption by many automakers: that the humans in them will be ready to step in and take control if the car’s systems fail.

Instead, experience with automation in other modes of transportation like aviation and rail suggests that the strategy will lead to more deaths like that of a Florida Tesla driver in May.

Decades of research shows that people have a difficult time keeping their minds on boring tasks like monitoring systems that rarely fail and hardly ever require them to take action. The human brain continually seeks stimulation. If the mind isn’t engaged, it will wander until it finds something more interesting to think about. The more reliable the system, the more likely it is that attention will wane.

Automakers are in the process of adding increasingly automated systems that effectively drive cars in some or most circumstances, but still require the driver as a backup in case the vehicle encounters a situation unanticipated by its engineers.

Tesla’s Autopilot, for example, can steer itself within a lane and speed up or slow down based on surrounding traffic or on the driver’s set speed. It can change lanes with a flip of the turn signal, automatically apply the brakes, or scan for parking spaces and parallel park on command.

Joshua Brown, a 40-year-old tech company owner from Canton, Ohio, who was an enthusiastic fan of the technology, was killed when neither he nor his Tesla Model S sedan’s Autopilot braked for a truck making a left turn on a highway near Gainesville, according to federal investigators and the automaker.

Tesla warns drivers to keep their hands on the wheel even while Autopilot is driving; if they don’t, the vehicle will automatically slow to a stop. A self-driving system Audi plans to introduce in its 2018 A7 monitors drivers’ head and eye movements, and automatically slows the car if the driver’s attention is diverted.

But Brown’s failure to brake means he either didn’t see the truck in his path or saw it too late to respond — an indication he was relying on the automation and his mind was elsewhere, said Missy Cummings, director of Duke University’s Humans and Autonomy Laboratory. The truck driver said he had heard a Harry Potter video playing in the car after the crash.

“Drivers in these quasi- and partial modes of automation are a disaster in the making,” Cummings said. “If you have to rely on the human to see something and take action in anything less than several seconds, you are going to have an accident like we saw.”

Part of the problem is that overconfidence in the technology causes people to think they can check out. Not long after Tesla introduced Autopilot, people were posting videos of cars cruising down tree-lined roads or even highways with the self-driving mode engaged and no one in the driver’s seat.

Tesla’s Autopilot can steer itself within a lane and speed up or slow down based on surrounding traffic. (SPENCER PLATT, GETTY IMAGES)
