The Commercial Appeal

Tesla crash shows Autopilot isn’t there yet

Auto expert: ‘There are no self-driving cars’

- Nathan Bomey

The perception that self-driving cars can really operate themselves without driver involvement is worrying automotive watchdogs, who say that some Americans have grown dangerously confident in the capabilities of semi-autonomous vehicles.

Their comments come as electric vehicle maker Tesla’s so-called Autopilot system is under scrutiny once again following a crash that killed two passengers in the Houston area late Saturday.

“I would start by saying there are no self-driving cars despite what you may read about or what you’ve seen advertised,” said Jake Fisher, senior director of auto testing for Consumer Reports. “And there’s certainly nothing anywhere close to self-driving that is in production right now.”

Tesla has been the most common target of critics for marketing that its vehicles are capable of “full self-driving” with an upgrade. They are not capable of full self-driving – and, in fact, Tesla says on its website that drivers are supposed to keep their hands on the wheel at all times, ready to take over when the system is not able to steer, accelerate or brake on its own.

In general, the most advanced available technology in new cars from Tesla, General Motors and Mercedes-Benz is capable of steering, accelerating and braking in certain circumstances, but its abilities remain limited and drivers are supposed to continuously pay attention to the road.

Some drivers have found ways around Autopilot’s restrictions, including the use of “Autopilot Buddy,” a now-illegal aftermarket device that tricked the vehicle into thinking the driver’s hands were on the wheel. The National Highway Traffic Safety Administration issued a cease-and-desist order to that device’s manufacturer in June 2018.

It was not immediately clear whether Autopilot was engaged in the latest Tesla crash. But Harris County Precinct 4 Constable Mark Herman told The Wall Street Journal that investigators are “99.9% sure” that “there was no one at the wheel” when the crash happened.

“Autopilot is an intentionally deceptive name being used for a set of features that are essentially an advanced cruise control system,” said Jason Levine, director of the Washington, D.C.-based nonprofit Center for Auto Safety. “There really is no longer a question that Tesla’s marketing is leading consumers to foreseeably misuse the technology in a dangerous way.”

NHTSA said Monday that it is investigating the incident.

“NHTSA has immediately launched a Special Crash Investigation team to investigate the crash,” the agency said in a statement. “We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate steps when we have more information.”

Tesla did not respond to an emailed request seeking comment for this story.

On Saturday, before news of the crash broke, CEO Elon Musk’s Twitter account cited a report claiming that cars with Tesla’s Autopilot system engaged “are now approaching 10 times lower chance of accident than average vehicle.”

On Monday, after reports about the crash circulated, Musk said on Twitter that “data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD,” referring to “full self-driving” capability. That could not be independently verified.

“Moreover, standard Autopilot would require lane lines to turn on, which this street did not have,” he said.

Tesla is among many tech companies and automakers that are developing self-driving cars, including Waymo, General Motors, Volkswagen and Ford. Apple is reportedly doing the same.

Automotive watchdogs acknowledge that the development of self-driving car technology will likely reduce crashes and deaths on the road. More than 36,000 people were killed in the U.S. in crashes in 2019, according to NHTSA, which has endorsed the development of self-driving vehicles.

But self-driving cars currently under development are being tested in limited scenarios, such as fully mapped roads in Phoenix, San Francisco and Detroit.

“Getting it right to the point where the driver doesn’t need to be engaged is extremely challenging, and we’re just not there yet,” said Greg Brannon, director of automotive engineering for AAA.

Paradoxically, the better the system gets, the more misplaced faith drivers have in its capabilities, Brannon said.

Fisher of Consumer Reports said part of the problem is that Tesla’s system “can be engaged in areas where it is absolutely beyond its capability.”

In effect, that means Autopilot can be activated on roads that are too complicate­d for it to safely maneuver, he said.

Other automakers have taken a different approach with systems that partially automate driving in limited circumstan­ces. GM has deployed a system called Super Cruise on Cadillac models, providing hands-free driving on fully mapped highways and freeways.

The system uses a camera to track the driver’s eye movement to ensure drivers are keeping their eyes on the road. If drivers take their eyes off the road for more than a few moments, the system alerts them to pay attention – and if they don’t look at the road, it will bring the car to a stop.

“There are systems that automate steering and automate speed that need constant monitoring from the driver,” Fisher said. “The driver is still driving the car even if some of the controls are automated.”

Tesla instructs drivers to keep hands on the steering wheel when Autopilot is engaged, but not all drivers do. (DREAMSTIME.COM VIA TNS)
