Santa Fe New Mexican

Investigation looks into Tesla’s Autopilot system after multiple collisions

By Cade Metz and Neal E. Boudette

Elon Musk built his electric car company, Tesla, around the promise that it represented the future of driving — a phrase emblazoned on the automaker’s website.

Much of that promise was centered on Autopilot, a system of features that could steer, brake and accelerate the company’s sleek electric vehicles on highways. Over and over, Musk declared that truly autonomous driving was nearly at hand — the day when a Tesla could drive itself — and that the capability would be whisked to drivers over the air in software updates.

Unlike technologists at almost every other company working on self-driving vehicles, Musk insisted that autonomy could be achieved solely with cameras tracking their surroundings. But many Tesla engineers questioned whether it was safe enough to rely on cameras without the benefit of other sensing devices — and whether Musk was promising drivers too much about Autopilot’s capabilities.

Now those questions are at the heart of an investigation by the National Highway Traffic Safety Administration after at least 12 accidents in which Teslas using Autopilot drove into parked firetrucks, police cars and other emergency vehicles, killing one person and injuring 17 others.

Families are suing Tesla over fatal crashes, and Tesla customers are suing the company for misrepresenting Autopilot and a set of sister services called Full Self Driving, or FSD.

As the guiding force behind Autopilot, Musk pushed it in directions other automakers were unwilling to take with this kind of technology, interviews with 19 people who worked on the project over the past decade show. Musk repeatedly misled buyers about the services’ abilities, many of those people say. All spoke on the condition of anonymity, fearing retaliation from Musk and Tesla.

Musk and a top Tesla lawyer did not respond to multiple email requests for comment for this article over several weeks, including a detailed list of questions. But the company has consistently said that the onus is on drivers to stay alert and take control of their cars should Autopilot malfunction.

Regulators have warned that Tesla and Musk have exaggerated the sophistication of Autopilot, encouraging some people to misuse it.

“Where I get concerned is the language that’s used to describe the capabilities of the vehicle,” said Jennifer Homendy, chairwoman of the National Transportation Safety Board, which has investigated accidents involving Autopilot and criticized the system’s design. “It can be very dangerous.”

Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions. For several years, Autopilot incorporated radar, and for a time, Tesla worked on developing its own radar technology. But three people who worked on the project said Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.

In May, Musk said on Twitter that Tesla was no longer putting radar on new cars. He said the company had tested the safety implications of not using radar but provided no details.

Some people have applauded Musk, saying a certain amount of compromise and risk was justified as he strove to reach mass production and ultimately change the automobile industry.

But recently, even Musk has expressed some doubts about Tesla’s technology. After repeatedly describing FSD in speeches, in interviews and on social media as a system on the verge of full autonomy, Musk in August called it “not great.” The team working on it, he said on Twitter, “is rallying to improve as fast as possible.”

Tesla began developing Autopilot more than seven years ago as an effort to meet new safety standards in Europe, which required technology such as automatic braking, according to three people familiar with the origins of the project.

The company originally called this an “advanced driver assistance” project but was soon exploring a new name. Executives led by Musk decided on “Autopilot,” although some Tesla engineers objected to the name as misleading, favoring “Copilot” and other options, these three people said.

The name was borrowed from the aviation systems that allow planes to fly themselves in ideal conditions with limited pilot input.

At Autopilot’s official announcement in October 2014, Tesla said that the system would brake automatically and keep the car in a lane but added that “the driver is still responsible for, and ultimately in control of, the car.” It said that self-driving cars were “still years away from becoming a reality.”

At the beginning, Autopilot used cameras, radar and sound-wave sensors. But Musk told engineers that the system should eventually be able to drive autonomously from door to door — and it should do so solely with cameras, according to three people who worked on the project.

They said the Autopilot team continued to develop the system using radar and even planned to expand the number of radar sensors on each car, as well as to explore lidar — “light detection and ranging” devices that measure distances using laser pulses.

But Musk insisted that his two-eyes metaphor was the way forward and questioned whether radar was ultimately worth the headache and expense of buying and integrating the technology from third parties, four people who worked on the Autopilot team said.

Over time, the company and the team moved closer to his way of thinking, placing more emphasis on camera technology, these people said.

In early November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of new FSD features, after deploying a software update that the company said might cause crashes because of unexpected activation of the cars’ emergency braking system.
