Business Day - Motor News

Senator wants ‘misleading’ Tesla Autopilot rebranded

NEWS / Carmaker asked to change the name of a feature that leads some to believe the car can drive itself

- Reuters

A US senator last week urged Tesla to rebrand its driver-assistance system, Autopilot, saying it has “an inherently misleading name” and is subject to potentially dangerous misuse.

But Tesla said in a letter it had taken steps to ensure driver engagement with the system and enhance its safety features.

The electric carmaker introduced new warnings for red lights and stop signs last year “to minimise the potential risk of red light- or stop sign-running as a result of temporary driver inattention,” Tesla said in the letter.

Senator Edward Markey said he believes the potential dangers of Autopilot can be overcome. But he called for “rebranding and remarketing the system to reduce misuse, as well as building back-up driver monitoring tools that will make sure no-one falls asleep at the wheel”.

Autopilot has been engaged in at least three Tesla vehicles involved in fatal US crashes since 2016.

Crashes involving Autopilot have raised questions about the driver-assistance system’s ability to detect hazards, especially stationary objects.

There are mounting safety concerns globally about systems that can perform driving tasks for extended stretches of time with little or no human intervention, but which cannot completely replace human drivers.

Markey cited videos of Tesla drivers who appeared to fall asleep behind the wheel while using Autopilot, and others in which drivers said they could defeat safeguards by sticking a banana or water bottle in the steering wheel to make it appear they were in control of the car.

Tesla, in its letter, said its revisions to steering wheel monitoring meant that in most situations “a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver”.

The company also said devices “marketed to trick Autopilot may be able to trick the system for a short time, but generally not for an entire trip before Autopilot disengages.”

Tesla said while videos such as those cited by Markey showed “a few bad actors who are grossly abusing Autopilot”, they represente­d only “a very small percentage of our customer base”.

Earlier this month, the US National Highway Traffic Safety Administration (NHTSA) said it was launching an investigation into a 14th crash involving a Tesla in which it suspects Autopilot was in use. NHTSA is probing a fatal December 29 crash of a Tesla Model S in California. In that incident, the car exited the 91 Freeway, ran a red light and struck a 2006 Honda Civic, killing its two occupants.


