2 federal agencies probe Tesla crash
Driver says Autopilot was on when he ran into firetruck on California freeway
DETROIT — Two federal agencies have dispatched teams to investigate the California crash of a Tesla Model S electric car that may have been operating under its semiautonomous “Autopilot” system.
It’s the second time the National Transportation Safety Board and the National Highway Traffic Safety Administration have investigated the performance of Autopilot, which keeps a vehicle centered in its lane at a set distance from cars in front of it and also can change lanes and brake automatically.
The safety board sent two investigators to Culver City on Tuesday, according to spokesman Peter Knudson, and the safety administration confirmed Wednesday that it also is dispatching a special team “to investigate the crash and assess lessons learned.”
Neither agency would comment further, but it’s likely they both will seek information about whether Autopilot was on and whether its sensors somehow failed to see a stopped firetruck Monday on Interstate 405 in Culver City near Los Angeles.
The safety board said on Twitter that investigators will focus on driver and vehicle factors.
The Tesla driver told the California Highway Patrol that he had activated Autopilot before the crash, but the Highway Patrol said in a news release that it couldn’t verify the driver’s statement at this time. The crash remains under investigation, the Highway Patrol said.
The safety board in September determined that design limitations of the Tesla Model S Autopilot played a major role in a May 2016 fatal crash in Florida involving a vehicle operating under Autopilot. But it blamed the crash on an inattentive Tesla driver’s overreliance on technology and a truck driver who made a left turn in front of the car.
The California investigation comes as Congress and federal agencies grapple with how to regulate autonomous vehicles and those with systems that are partially self-driving. The systems can significantly reduce crashes, but computer-driven vehicles also can make mistakes.
Tesla wouldn’t say whether Autopilot was working at the time of the Culver City crash, but said in a statement Monday that drivers must stay attentive when it’s in use. The company would not comment on the investigation.
Regarding Monday’s crash, the Highway Patrol said the southbound Tesla hit the rear of the Culver City firetruck, which was parked at an angle in the carpool lane while firefighters tended to a crash on the opposite side of the freeway. The truck was unoccupied at the time, and no one at the scene reported injuries, the news release said.
The Model S Autopilot is rated Level 2 on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways, which don’t have intersections. With Level 2 systems, drivers are supposed to continuously monitor vehicle performance and be ready to take control if necessary.
In the Florida crash, which killed an Ohio man driving a Tesla, National Transportation Safety Board investigators found that the sedan’s cameras and radar weren’t capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles being followed to prevent rear-end collisions. The board reissued previous recommendations that the government require all new cars and trucks to be equipped with technology that wirelessly transmits the vehicles’ location, speed, heading and other information to other vehicles in order to prevent collisions.
The safety board also recommended that automakers develop systems to make sure drivers pay attention while using semiautonomous systems, other than detecting the pressure of hands on the steering wheel. The driver in the Florida crash had his hands on the sedan’s steering wheel for only 25 seconds out of the 37.5 minutes the vehicle’s cruise control and lane-keeping systems were in use prior to the crash, investigators found.
Tesla has taken steps to prevent drivers from using Autopilot improperly, including measuring the amount of torque applied to the steering wheel and sending visual and audio warnings. If the warnings are ignored, drivers are prevented from using Autopilot, the company has said.
The National Highway Traffic Safety Administration, which regulates auto safety, declined last year to issue a recall or fine Tesla as a result of the crash, but it warned automakers not to treat semiautonomous cars as if they were fully self-driving.