Calgary Herald

Safety board investigating Tesla freeway collision

Autopilot system likely to face scrutiny as governments grapple with regulation

- TOM KRISHER AND DEE-ANN DURBIN

The U.S. National Transportation Safety Board is investigating the California crash of a Tesla Model S electric car that may have been operating under its semi-autonomous “Autopilot” system.

It’s the second time the board has looked into a Tesla crash, and likely means that it wants information about whether Autopilot was on and if its sensors somehow failed to see a stopped fire truck Monday on Interstate 405 in Culver City, Calif., near Los Angeles.

The board sent two investigators to Culver City, NTSB spokesman Peter Knudson said Wednesday. The NTSB said on Twitter that investigators will focus on driver and vehicle factors.

The NTSB in September determined that design limitations of the Tesla Model S Autopilot played a major role in a May 2016 fatal crash in Florida involving a vehicle operating under Autopilot. But it blamed the crash on an inattentive Tesla driver’s overreliance on technology and a truck driver who made a left turn in front of the car.

The California investigation comes as Congress and federal agencies grapple with how to regulate autonomous vehicles and those with systems that are partially self-driving. The systems can significantly reduce crashes, but computer-driven vehicles also can make mistakes.

Tesla wouldn’t say if Autopilot was working at the time of the Culver City crash, but said in a statement Monday that drivers must stay attentive when it’s in use. The company would not comment on the investigation.

In Monday’s crash, the California Highway Patrol and the Culver City Fire Department confirmed the southbound Tesla hit the rear of the fire truck but could not confirm if Autopilot was operating, The Mercury News of San Jose reported.

The fire truck was parked in the left emergency and carpool lane with its emergency lights flashing to block off the scene of an earlier crash Monday morning, Culver City Fire Battalion Chief Ken Powell told the newspaper. The Tesla suffered significant damage, but the driver showed no signs of significant injury, Powell said.

The Model S Autopilot is a Level 2 system on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways, which don’t have intersections. Autopilot keeps a vehicle centred in its lane at a set distance from cars in front of it. It also can change lanes and brake automatically. Drivers are supposed to continuously monitor vehicle performance and be ready to take control if necessary.

In the Florida crash, which killed an Ohio man driving the Tesla, NTSB investigators found that the sedan’s cameras and radar weren’t capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles being followed to prevent rear-end collisions. The NTSB re-issued previous recommendations that the government require all new cars and trucks to be equipped with technology that wirelessly transmits the vehicles’ location, speed, heading and other information to other vehicles in order to prevent collisions.

The NTSB also recommended that automakers develop ways to make sure drivers pay attention while using semi-autonomous systems that go beyond detecting the pressure of hands on the steering wheel. The driver in the Florida crash had his hands on the sedan’s steering wheel for only 25 seconds out of the 37.5 minutes the vehicle’s cruise control and lane-keeping systems were in use prior to the crash, investigators found.

Tesla has taken steps to prevent drivers from using Autopilot improperly, including measuring the amount of torque applied to the steering wheel and sending visual and audio warnings. If the warnings are ignored, drivers are prevented from using Autopilot, the company has said.

The National Highway Traffic Safety Administration, which regulates auto safety, declined last year to issue a recall or fine Tesla as a result of the crash, but it warned automakers not to treat semi-autonomous cars as if they were fully self-driving.

SPENCER PLATT/GETTY IMAGES The U.S. safety board in September found that design limitations of the Tesla Model S Autopilot played a major role in a May 2016 fatal crash in Florida. A similar model crashed Monday.
