The Hamilton Spectator

Autopilot data sought from Tesla in crash probe

Car’s auto driving system failed to detect a tractor-trailer

- TOM KRISHER

DETROIT — Federal investigators looking into electric carmaker Tesla Motors’ Autopilot system after a fatal crash in Florida are zeroing in on the limitations of the system and how it reacts when obstacles cross its path.

The National Highway Traffic Safety Administration on Tuesday posted a nine-page letter seeking information from Tesla about Autopilot and why it failed to detect a tractor-trailer that crossed in front of a Model S sedan May 7 in Williston, Fla.

Much of the letter seeks information on how the system works at intersections with crossing traffic, but it also asks Tesla to describe how the system detects “compromised or degraded” signals from cameras and other sensors and how such problems are communicated to drivers.

The crash in Williston killed former Navy SEAL Joshua Brown, 40, of Canton, Ohio. Tesla, which collects data from its cars via the Internet, says the cameras on Brown’s Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, and the car didn’t automatically brake.

The safety agency also asked Tesla for its reconstruction of the Brown crash, and for details of all known crashes, consumer complaints and lawsuits filed or settled because the Autopilot system didn’t brake as expected.

NHTSA said Tesla must comply with its request by Aug. 26 or face penalties of up to $21,000 per day.

Tesla’s Autopilot system uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centred in its lane. The company says that before Autopilot can be used, drivers must acknowledge that it’s an “assist feature” that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time, Tesla has said.

Tesla released Autopilot last fall. The company says the system is still in its “beta” phase, a computer industry term for software testing by customers, and some safety advocates have questioned whether Tesla and NHTSA allowed the public access to the system too soon.

“No safety-significant system should ever use consumers as test drivers on the highways,” said Clarence Ditlow, head of the nonprofit Center for Auto Safety. He said NHTSA lacks the electronic engineers and laboratories needed to keep up with advanced technology such as General Motors airbags or Tesla’s Autopilot.

Tesla says Autopilot has been safely used in over 100 million miles of customer driving, and that its data show drivers who use Autopilot are safer than those who don’t.
