Sun Sentinel Palm Beach Edition

Speed bumps on road to future

Driver-aid woes, crashes sapping clients’ trust

By Drew Harwell

When Mangesh Gururaj’s wife left home to pick up their child from math lessons one Sunday this month, she turned on her Tesla Model S and hit “Summon,” a self-parking feature that the electric automaker has promoted as a central step toward driverless cars.

But as the $65,000 sedan reversed itself out of the garage, Gururaj said, the car abruptly bashed into a wall, ripping its front end off with a loud crack. The maimed Tesla looked like it would have kept driving, Gururaj said, if his wife hadn’t hit the brakes.

No one was hurt, but Gururaj was rattled: The car had failed disastrously, during the simplest of maneuvers, using one of the most basic features from the self-driving technology he and his family had trusted countless times at higher speeds.

“This is just a crash in the garage. You can fix this. But what if we were summoning and there was a child it didn’t see?” said Gururaj, an IT consultant in North Carolina, who bought the car last year. “I had a lot of trust in Tesla, as a car, but that’s gone . . . . You’re talking about a big liability, and your life is at stake.”

The crash is an embarrassing mishap for a technology Tesla chief Elon Musk unveiled in 2016 to great fanfare, saying it would soon allow owners to hit a button and have their cars drive across the country to meet them, recharging along the way.

But the crash also highlights the growing confidence problem facing driver-assistance technology and self-driving cars. The promise of auto-driving, robot-assisted, quasi-magical wonder cars has given way to a more nuanced reality: Cars that also crap out, get confused or crash, often with little warning or explanation.

It’s not the first time the “Summon” feature’s safety and abilities have been called into question. In 2016, a Tesla owner in Utah said his Model S went rogue after he’d parked it, lurching ahead and impaling itself beneath a parked trailer. Tesla said the car’s logs showed the owner was at fault, but later updated “Summon” with a new feature that could have prevented the crash.

When asked for details on the Gururaj crash, a Tesla spokesperson pointed only to the car's owner's manual, which calls Summon a "beta feature" and says the car can't detect a range of common objects, including anything lower than the bumper or as narrow as a bicycle.

Driver-assistance systems such as Tesla's "Autopilot" have been involved in a tiny fraction of the nation's car crashes, and the companies developing the technologies say in the long term they will boost traffic safety and save lives. Scrutiny of the rare crashes, they add, is misguided in a country where more than 40,000 people died on the road last year.

But the causes of the collisions are often a mystery, leaving drivers deeply unnerved by the possibility they could happen again. Companies enforce restricted access to the cars' internal computer logs and typically reveal little about what went wrong, saying information on how cars' sensors and computers interact is proprietary in a competitive industry.

That uncertainty has contributed to apprehension among drivers about a technology not yet proven for public use. Two public surveys released in July, by the Brookings Institution think tank and the nonprofit Advocates for Highway and Auto Safety, found more than 60 percent of surveyed Americans said they were unwilling to ride in a self-driving car and were concerned about sharing the road.

Tesla says car owners must continually monitor their vehicle's movement and surroundings and be prepared to stop at any time. But Tesla at the same time pitches its self-driving technology as more capable than human drivers: Tesla's website promises "full self-driving hardware on all cars," saying they operate "at a safety level substantially greater than that of a human driver."

Cathy Chase, president of the Advocates for Highway and Auto Safety, said Tesla's strategy of beta-testing technologies with normal drivers on public roads is "incredibly dangerous."

“People get lulled into a false sense of security” about how safe or capable the cars really are, Chase said.

Tesla's Autopilot has been involved in high-profile crashes. In 2016, a Tesla owner in Florida was killed when his Model S, driving on Autopilot, smashed into a tractor trailer crossing ahead of him on the highway. The car did not slow down or stop to prevent the crash, but federal traffic-safety investigators did not cite the company for any safety defects, saying Autopilot needed a driver's "continual and full attention."

In California this year, Tesla vehicles have smashed into the backs of a police cruiser and a parked fire truck while on Autopilot. The National Transportation Safety Board is investigating another Autopilot crash in March, during which a California driver was killed after his Model X automatically accelerated up to 70 mph before smashing into a highway barrier.

Tesla has blamed some Autopilot crashes on human error, suggesting the people in the driver's seat had inadvertently hit the pedal or were not paying attention. The company has also designed the cars to repeatedly warn drivers to stay alert, flashing notifications when, for instance, the driver's hands can't be sensed on the wheel.

Gururaj said Tesla remotely pulled computer logs from the car to investigate the crash at his home garage. But the company told him it would not share any information about what happened, adding in an email, "You are responsible for the operation of your vehicle even during summon mode."

Gururaj's family, he said, had used "Summon" hundreds of times over the last year. But he said he will stop using such features for fear of them malfunctioning while driving. He also said he was unnerved by Tesla's response, which questioned why the human didn't intervene quickly enough rather than why the car drove itself into a wall in the first place.

“They want us to rely on the technology because its response time is faster than humans. This is the whole concept of automation,” he said. “For them to completely say it’s up to the customer to stop it, that’s really concerning. If the car can’t sense something on the front or on the side, then they shouldn’t put that as a feature. You’re putting your life at stake.”

JHAAN ELKER/THE WASHINGTON POST — Tesla bills its "Summon" tech, accessible via phone, as a step to driverless cars. But a car crashed backing out of a garage.
