Inquiry focuses on Tesla system
Crashes involving Autopilot at issue
DETROIT — The U.S. government has opened a formal investigation into Tesla’s partially automated driving system Autopilot after a series of crashes with parked emergency vehicles.
The investigation covers 765,000 vehicles, comprising almost everything Tesla has sold in the U.S. since the start of the 2014 model year. In the crashes identified by the National Highway Traffic Safety Administration as part of the investigation, 17 people were injured and one was killed.
The agency said it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, illuminated arrow boards or cones warning of hazards. The agency announced the action Monday in a post on its website.
The investigation is another sign that the agency under President Joe Biden is taking a tougher stance on automated-vehicle safety than it has under previous administrations. Previously, the agency was reluctant to regulate the new technology for fear of hampering adoption of the potentially life-saving systems.
The investigation covers Tesla’s entire model lineup, the Models Y, X, S and 3 from the 2014 through 2021 model years.
The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that the traffic safety agency and Tesla limit Autopilot’s use to areas where it can safely operate. The board also recommended that the agency require Tesla to have a better system to make sure drivers are paying attention.
The agency has not taken action on any of the recommendations. The transportation board has no enforcement powers and can only make recommendations to other federal agencies.
Last year, the board blamed Tesla, drivers and lax regulation by the safety agency for two collisions in which Teslas ran beneath crossing tractor-trailers. The transportation board took the unusual step of accusing the safety agency of contributing to the crashes by failing to make sure automakers put safeguards in place to limit use of electronic driving systems.
The board made the determinations after investigating a 2019 crash in Delray Beach, Fla., in which the 50-year-old driver of a Tesla Model 3 was killed. The car was on Autopilot, and neither the driver nor the Autopilot system braked or tried to avoid the tractor-trailer crossing the Tesla vehicle’s path.
DRIVERS’ ROLE
“We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths,” said Jason Levine, executive director of the nonprofit Center for Auto Safety, an advocacy group. “If anything, this probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged.”
Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or, in one case, even riding in the back seat while a car rolled down a California highway.
A message was left early Monday seeking comment from Tesla, which has disbanded its media relations office.
The safety agency has sent investigative teams to 31 crashes involving partially automated driver-assist systems since June 2016. Such systems can keep a vehicle centered in its lane and a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot, and 10 deaths were reported, according to data released by the agency.
Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to crossing tractor-trailers, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.
The investigation by the safety agency is long overdue, said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles.
HANDS ON WHEEL
One critical issue for investigators is how Autopilot ensures that Tesla drivers are paying attention to the road. The company’s owner’s manuals instruct drivers to keep their hands on the steering wheel, but the system continues operating even if drivers only occasionally tap the wheel.
“It’s very easy to bypass the steering-pressure thing,” Rajkumar said. “It’s been going on since 2014. We have been discussing this for a long time now.”
General Motors has a similar system, called Super Cruise, that allows drivers to take their hands off the steering wheel but uses an infrared camera to monitor drivers’ eyes to ensure that they are looking at the road.
The crashes into emergency vehicles cited by the safety agency began Jan. 22, 2018, in Culver City, Calif., when a Tesla using Autopilot struck a parked firetruck that was partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.
Since then, the agency said, there have been crashes in Laguna Beach, Calif.; Norwalk, Conn.; Cloverdale, Ind.; West Bridgewater, Mass.; Cochise County, Ariz.; Charlotte, N.C.; Montgomery County, Texas; Lansing, Mich.; and Miami.
An investigation could lead to a recall or other enforcement action by the agency.
“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said in a statement. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”