National Post

Tesla garage crash raises more safety questions

Self-park ‘a beta feature,’ company says

- Drew Harwell

When Mangesh Gururaj’s wife left home to pick up their child from math lessons one Sunday earlier this month, she turned on her Tesla Inc. Model S and hit “Summon,” a self-parking feature that the electric automaker has promoted as a central step toward driverless cars.

But as the US$65,000 sedan reversed itself out of the garage, Gururaj said, the car abruptly bashed into a wall, ripping its front end off with a loud crack. The maimed Tesla looked like it would have kept driving, Gururaj said, if his wife hadn’t hit the brakes.

No one was hurt, but Gururaj was rattled: The car had failed disastrously, during the simplest of manoeuvres, using one of the most basic features from the self-driving technology he and his family had trusted countless times at higher speeds.

“This is just a crash in the garage. You can fix this. But what if we were summoning and there was a child it didn’t see?” said Gururaj, an IT consultant in North Carolina, who bought the car last year. “I had a lot of trust in Tesla, as a car, but that’s gone ... You’re talking about a big liability, and your life is at stake.”

The crash is an embarrassing mishap for a technology Tesla chief Elon Musk unveiled in 2016 to great fanfare, saying it would soon allow owners to hit a button and have their cars drive across the country to meet them, recharging on the way.

But the crash also highlights the growing confidence problem facing driver-assistance technology and self-driving cars. The promise of auto-driving, robot-assisted, quasi-magical wondercars has given way to a more nuanced reality: Cars that also crap out, get confused or crash, often with little warning or explanation.

It’s not the first time the “Summon” feature’s safety and abilities have been called into question. In 2016, a Tesla owner in Utah said his Model S went rogue after he’d parked it, lurching ahead and impaling itself beneath a parked trailer. Tesla said the car’s logs showed the owner was at fault, but later updated “Summon” with a new feature that could have prevented the crash.

When asked for details on the Gururaj crash, a Tesla spokesperson pointed only to the car’s owner’s manual, which calls Summon a “beta feature” and says the car can’t detect a range of common objects, including anything lower than the bumper or as narrow as a bicycle.

Driver-assistance systems such as Tesla’s “Autopilot” have been involved in a tiny fraction of the nation’s car crashes, and the companies developing the technologies say in the long term they will boost traffic safety and save lives. Scrutiny of the rare crashes, they add, is misguided in a country where more than 40,000 people died on the road last year.

But the causes of the collisions are often a mystery, leaving drivers like Gururaj deeply unnerved by the possibility they could happen again. Companies tightly restrict access to the cars’ internal computer logs and typically reveal little about what went wrong, saying information on how the cars’ sensors and computers interact is proprietary and should be kept secret in a competitive industry.

That uncertainty has contributed to apprehension among drivers about a technology not yet proven for public use. Two public surveys released in July, by the Brookings Institution think tank and the non-profit Advocates for Highway and Auto Safety, found that more than 60 per cent of surveyed Americans said they were unwilling to ride in a self-driving car and were concerned about sharing the road with driverless vehicles.

Tesla says car owners must continually monitor their vehicle’s movement and surroundings and be prepared to stop at any time. But at the same time, Tesla pitches its self-driving technology as more capable than human drivers: Tesla’s website promises “full self-driving hardware on all cars,” saying they operate “at a safety level substantially greater than that of a human driver.”

Cathy Chase, president of the Advocates for Highway and Auto Safety, said Tesla’s strategy of beta-testing technologies with normal drivers on public roads is “incredibly dangerous.”

“People get lulled into a false sense of security” about how safe or capable the cars really are, Chase said. “The Tesla approach is risky at best and deadly at worst.”

Tesla’s Autopilot has been involved in high-profile crashes. In 2016, a Tesla owner in Florida was killed when his Model S, driving on Autopilot, smashed into a tractor trailer crossing ahead of him on the highway. The car did not slow down or stop to prevent the crash, but federal traffic-safety investigators did not cite the company for any safety defects, saying Autopilot needed a driver’s “continual and full attention.”

DAVID ZALUBOWSKI / THE ASSOCIATED PRESS FILES A 2018 Model X at a Tesla showroom in Littleton, Colo. A US$65,000 Model S smashed into a garage wall when the Summon self-parking feature was activated.
