In fatal 2018 crash, Tesla’s Autopilot followed lane lines
In Tesla’s marketing materials, the company’s Autopilot driver-assistance system is cast as a technological marvel that uses “advanced cameras, sensors and computing power” to steer, accelerate and brake automatically — even change lanes so “you don’t get stuck behind slow cars or trucks.”
Under oath, however, Tesla engineer Akshay Phatak last year described the software as fairly basic in at least one respect: the way it steers on its own.
“If there are clearly marked lane lines, the system will follow the lane lines,” Phatak said under questioning in July. Tesla’s groundbreaking system, he said, was simply “designed” to follow painted lane lines.
Phatak’s testimony, which was obtained by The Washington Post, came in a deposition for a wrongful-death lawsuit set for trial Tuesday. The case involves a fatal crash in March 2018, when a Tesla in Autopilot mode struck a highway barrier near Mountain View, Calif., after getting confused by what the company’s lawyers described in court documents as a “faded and nearly obliterated” lane line.
The driver, Walter Huang, 38, was killed. An investigation by the National Transportation Safety Board cited Tesla’s failure to limit the use of Autopilot in such conditions as a contributing factor: The company has acknowledged to NTSB officials that Autopilot is designed for areas with “clear lane markings.”
Phatak’s testimony marks the first time Tesla has publicly explained those design decisions, peeling back the curtain on a system shrouded in secrecy by the company and CEO Elon Musk. Musk, Phatak and Tesla did not respond to requests for comment.
Following lane lines is not unique to Tesla: Many modern cars use technology to alert drivers when they’re drifting. But by marketing the technology as “Autopilot,” Tesla may be misleading drivers about the cars’ capabilities — a central allegation in numerous lawsuits headed for trial this year and a key concern of federal safety officials.