Los Angeles Times

Tesla crash puts accountability in the spotlight

When cars are partly self-driving, some drivers may tune out.

- By Russ Mitchell

SAN FRANCISCO — Was Autopilot on when a Tesla Model S smashed into the back of a parked Culver City firetruck on the 405 Freeway on Monday in broad daylight?

That’s what the driver told police. Tesla Inc. — which would have such information because it monitors car and driver behavior over wireless networks — has not yet said yes or no.

The crash highlights a big problem facing the auto industry as it evolves toward completely driverless cars: Many new cars are being equipped with robot systems that take over functions such as cruise control and steering but still require drivers to pay full attention.

Many drivers — perhaps most drivers — don’t always focus solely on the road. Distracted by smartphones and other devices, drivers are crashing, killing themselves and others, at increasing rates.

Fully driverless cars, with no need for steering wheels, are likely to prove safer than human drivers, in large part because they are paying strict and constant attention to driving. Such vehicles are beginning to appear on public highways in places such as Phoenix and Las Vegas.

But those systems are expensive and experimental. Semiautonomous systems such as Autopilot are being installed in cars from Tesla, Audi, Volvo, Cadillac, Mercedes-Benz and others. They require humans and robots to share the driving duties.

Researchers with deep experience in human-machine interaction say it’s folly to think that won’t cause problems. Even if the human-robot team-up leads to safer roads on average, plenty of drivers will abuse the relationship, intentionally or not, and events like Monday’s crash will make the news.

“There’s something we used to call split responsibility,” said Hod Lipson, director of Columbia University’s Creative Machines Lab. “If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that’s a dangerous thing.”

That’s also true for humans sharing tasks with robots, he said.

Engineering researchers in the psychology department at the University of Utah are studying whether semiautonomous driving technology will make things better or worse.

During the experiments, people are put in semiautonomous driving simulators to measure their reaction times when something goes wrong. When subjects were distracted, average reaction time in the simulator almost doubled, researcher Kelly Funkhouser said.

The longer the subjects remained “cognitively disengaged,” the longer their reaction times got. Some fell asleep.

Some automakers are using the technology to try to make shared duties safer. The driver-assist robot system available on the new Cadillac CT6 tracks driver eyeballs and sounds a warning if the driver is not watching the road. If the driver fails to respond properly, the system gradually slows the car and pulls it over.

Tesla cars rely on steering wheel sensors to track driver awareness. In other words, the car monitors what the driver’s hands are doing to determine the driver’s level of attention.

That can be undermined: Third parties sell warning-defeat devices that attach to the steering wheel to allow hands-free, alarm-free Autopilot driving. Tesla warns drivers not to use such devices. It also makes clear to drivers that they’re expected to pay full attention to the road when using Autopilot.

The Tesla Autopilot system, like systems from Cadillac, Volvo, Mercedes-Benz, Audi, Lexus and others, fits into the Level 2 or Level 3 categories for semiautonomous and autonomous cars set by the Society of Automotive Engineers. At Level 2, where most driver-assist technologies stand now, the driver is expected to pay full attention. With Level 3, the robot drives most of the time but not all the time. There is no clear line of demarcation between those two levels.

Tesla — which is based in Palo Alto and led by Elon Musk — is equipping its new Model 3 with hardware it claims will support full autonomy, and it’s charging $8,000 for the suite. The company has offered no information on how long buyers will have to wait for software to support Level 4 robot driving, in which no human driver is required at all.

Some companies are afraid that semiautonomous driving and shared duties are causing accidents that draw media attention and turn the public against robot cars. Ford, for one, has said it will skip shared duties and, when the technology is ready, go straight to Level 4.

Waymo, the robot-car arm of Google parent company Alphabet Inc., likewise eschews semiautonomous systems.

Already, it is running a ride-hailing service in and around Phoenix that is completely driverless. The passengers can sit back and watch the steering wheel turn all by itself, with no human in the driver’s seat.

THE DRIVER of a Model S said Autopilot was on when the car smashed into a firetruck Monday. Tesla Inc. has not yet confirmed or disputed this. (Associated Press)
