Starkville Daily News

Investigators cite Tesla Autopilot limits in fatal crash

By JOAN LOWY, Associated Press

WASHINGTON (AP) — Design limitations of the Tesla Model S's Autopilot played a major role in the first known fatal crash of a highway vehicle operating under automated control systems, the National Transportation Safety Board said Tuesday.

The board said the direct cause of the crash was an inattentive Tesla driver's overreliance on technology and a truck driver who made a left-hand turn in front of the car. But the board also recommended that automakers incorporate safeguards that keep drivers' attention engaged and that limit the use of automated systems to the conditions for which they were designed.

Joshua Brown, 40, of Canton, Ohio, was traveling on a divided highway near Gainesville, Florida, using the Tesla's automated driving systems when he was killed. Tesla had told Model S owners the automated systems should only be used on limited-access highways, which are primarily interstates. But the company didn't incorporate protections against their use on other types of roads, the board found. Despite upgrades since the May 2016 crash, Tesla has still not incorporated such protections, NTSB Chairman Robert Sumwalt said.

"In this crash, Tesla's system worked as designed, but it was designed to perform limited tasks in a limited range of environments," he said. "Tesla allowed the driver to use the system outside of the environment for which it was designed."

The result, Sumwalt said, was a collision "that should never have happened."

In a statement, Tesla said "we appreciate the NTSB's analysis of last year's tragic accident and we will evaluate their recommendations as we continue to evolve our technology." The company added that overall its automated driving systems, called Autopilot, improve safety.

NTSB directed its recommendations to automakers generally, rather than just Tesla, saying the oversight is an industrywide problem. Manufacturers should be able to use GPS mapping systems to create such safeguards, Sumwalt said.

Manufacturers should also develop better ways to ensure operators remain attentive to the vehicle's performance when using semi-autonomous driving systems, rather than relying only on detecting the pressure of hands on the steering wheel, the NTSB recommended. Brown had his hands on the sedan's steering wheel for only 25 seconds out of the 37.5 minutes the vehicle's cruise control and lane-keeping systems were in use prior to the crash, investigators found.

As a consequence, Brown's attention wandered and he didn't detect the semitrailer in his path, they said.

The Model S is a level 2 on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways, which don't have intersections. Drivers are supposed to continuously monitor vehicle performance and be ready to take control if necessary.

Investigators found that the sedan's cameras and radar weren't capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles they are following to prevent rear-end collisions. The board re-issued previous recommendations that the government require all new cars and trucks to be equipped with technology that wirelessly transmits the vehicles' location, speed, heading and other information to other vehicles in order to prevent collisions.

Last December, the Obama administration proposed that new vehicles be able to wirelessly communicate with each other, with traffic lights and with other roadway infrastructure. Automakers were generally supportive of the proposal, but it hasn't been acted on by the Trump administration.

Brown's family defended his actions and Tesla in a statement released Monday. Brown was a technology geek and enthusiastic fan of the Model S who posted videos about the car and spoke to gatherings at Tesla stores. "Nobody wants tragedy to touch their family, but expecting to identify all limitations of an emerging technology and expecting perfection is not feasible either," the statement said.

The National Highway Traffic Safety Administration, which regulates auto safety, declined this year to issue a recall or fine Tesla as a result of the crash, but it warned automakers they aren't to treat semiautonomous cars as if they were fully self-driving.

While the NTSB was meeting to consider the Tesla crash, Transportation Secretary Elaine Chao was in Michigan unveiling new self-driving car safety guidelines for automakers. The guidelines encourage companies to put in place broad safety goals, such as making sure drivers are paying attention while using advanced assist systems. The systems are expected to detect and respond to people and objects both in and out of the vehicle's travel path, "including pedestrians, bicyclists, animals, and objects that could affect safe operation of the vehicle," the guidelines say.

There is a 12-point safety checklist, but the government makes it clear that the guidelines are voluntary and not regulations.
