Baltimore Sun

Carmakers’ autonomous tech warrants limits, engineer says

By Cade Metz

Last fall, Missy Cummings sent a document to her colleagues at the National Highway Traffic Safety Administration that revealed a surprising trend: When people using advanced driver-assistance systems die or are injured in a car crash, they are more likely to have been speeding than people driving cars on their own.

The two-page analysis of nearly 400 crashes involving systems such as Tesla’s Autopilot and General Motors’ Super Cruise is far from conclusive. But it raises fresh questions about the technologies that have been installed in hundreds of thousands of cars on U.S. roads. Cummings said the data indicated that drivers were becoming too confident in the systems’ abilities and that automakers and regulators should restrict when and how the technology was used.

People “are overtrusting the technology,” she said. “They are letting the cars speed. And they are getting into accidents that are seriously injuring them or killing them.”

Cummings, an engineering and computer science professor at George Mason University who specializes in autonomous systems, recently returned to academia after more than a year at the safety agency. On Wednesday, she presented some of her findings at the University of Michigan, a short drive from Detroit, the main hub of the U.S. auto industry.

Systems such as Autopilot and Super Cruise, which can steer, brake and accelerate vehicles on their own, are becoming increasingly common as automakers compete to win over car buyers with promises of superior technology. Companies sometimes market these systems as if they made cars autonomous. But their legal fine print requires drivers to stay alert and be ready to take control of the vehicle at any time.

In interviews last week, Cummings said automakers and regulators ought to prevent such systems from operating over the speed limit and require drivers using them to keep their hands on the steering wheel and eyes on the road.

“Car companies — meaning Tesla and others — are marketing this as a hands-free technology,” she said. “That is a nightmare.”

But these are not measures that NHTSA can easily put in place. Any effort to rein in how driver-assistance systems are used will probably be met with criticism and lawsuits from the auto industry, especially from Tesla and its CEO, Elon Musk, who has long chafed at rules he considers antiquated.

Nevertheless, on Thursday, Tesla said it is recalling nearly 363,000 vehicles with its “Full Self-Driving” system to fix problems with the way it behaves around intersections and follows posted speed limits.

NHTSA said in documents posted Thursday that Tesla would fix the problems with an online software update in the coming weeks. The documents say Tesla is conducting the recall but does not agree with the agency’s analysis of the problem.

Cummings acknowledged that putting the rules she was calling for into effect would be difficult.

But Cummings, 56, one of the first female fighter pilots in the Navy, said she felt compelled to speak out because “the technology is being abused by humans.”

“We need to put in regulations that deal with this,” she said.

The safety agency and Tesla did not respond to requests for comment. GM pointed to studies that it had conducted with the University of Michigan that examined the safety of its technology.

Because Autopilot and other similar systems allow drivers to relinquish active control of the car, many safety experts worry that the technology will lull people into believing the cars are driving themselves. When the technology malfunctions or cannot handle situations such as having to veer quickly to miss stalled vehicles, drivers may be unprepared to take control quickly enough.

[Photo: A driver uses GM’s Super Cruise mode, which can steer the car on its own. (The New York Times, 2019)]
