East Bay Times

No one knows how safe autonomous systems are

By Cade Metz

Every three months, Tesla publishes a safety report that provides the number of miles between crashes when drivers use the company's driver-assistance system, Autopilot, and the number of miles between crashes when they do not.

These figures always show that accidents are less frequent with Autopilot, a collection of technologies that can steer, brake and accelerate Tesla vehicles on its own.

But the numbers are misleading. Autopilot is used mainly for highway driving, which is generally twice as safe as driving on city streets, according to the Department of Transportation. Fewer crashes may occur with Autopilot merely because it is typically used in safer situations.

Tesla has not provided data that would allow a comparison of Autopilot's safety on the same kinds of roads. Neither have other carmakers that offer similar systems.

Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scant. American drivers — whether using these systems or sharing the road with them — are effectively guinea pigs in an experiment whose results have not yet been revealed.

Carmakers and tech companies are adding more vehicle features that they claim improve safety, but it is difficult to verify these claims. All the while, fatalities on the country's highways and streets have been climbing in recent years, reaching a 16-year high in 2021. It would seem that any additional safety provided by technological advances is not offsetting poor decisions by drivers behind the wheel.

“There is a lack of data that would give the public the confidence that these systems, as deployed, live up to their expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University's Center for Automotive Research, who was the first chief innovation officer for the Department of Transportation.

GM collaborated with the University of Michigan on a study that explored the potential safety benefits of Super Cruise but concluded that they did not have enough data to understand whether the system reduced crashes.

A year ago, the National Highway Traffic Safety Administration, the government's auto safety regulator, ordered companies to report potentially serious crashes involving advanced driver-assistance systems along the lines of Autopilot within a day of learning about them. The order said the agency would make the reports public, but it has not yet done so.

The safety agency declined to comment on what information it had collected but said in a statement that the data would be released “in the near future.”

Tesla and its CEO, Elon Musk, did not respond to requests for comment. GM said it had reported two incidents involving Super Cruise to NHTSA: one in 2018 and one in 2020. Ford declined to comment.

The agency's data is unlikely to provide a complete picture of the situation, but it could encourage lawmakers and drivers to take a much closer look at these technologies and ultimately change the way they are marketed and regulated.

“To solve a problem, you first have to understand it,” said Bryant Walker Smith, an associate professor at the University of South Carolina's law and engineering schools who specializes in emerging transportation technologies. “This is a way of getting more ground truth as a basis for investigations, regulations and other actions.”

Despite its abilities, Autopilot does not remove responsibility from the driver. Tesla tells drivers to stay alert and be ready to take control of the car at all times. The same is true of BlueCruise and Super Cruise.

But many experts worry that these systems, because they enable drivers to relinquish active control of the car, may lull them into thinking that their cars are driving themselves. Then, when the technology malfunctions or cannot handle a situation on its own, drivers may be unprepared to take control as quickly as needed.

Older technologies, such as automatic emergency braking and lane departure warning, have long provided safety nets for drivers by slowing or stopping the car or warning drivers when they drift out of their lane. But newer driver-assistance systems flip that arrangement by making the driver the safety net for technology.

Safety experts are particularly concerned about Autopilot because of the way it is marketed. For years, Musk has said the company's cars were on the verge of true autonomy — driving themselves in practically any situation. The system's name also implies automation that the technology has not yet achieved.

This may lead to driver complacency. Autopilot has played a role in many fatal crashes, in some cases because drivers were not prepared to take control of the car.

Musk has long promoted Autopilot as a way of improving safety, and Tesla's quarterly safety reports seem to back him up. But a recent study from the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.

“We know cars using Autopilot are crashing less often than when Autopilot is not used,” said Noah Goodall, a researcher at the council who explores safety and operational issues surrounding autonomous vehicles. “But are they being driven in the same way, on the same roads, at the same time of day, by the same drivers?”

Analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, has found that older technologies such as automatic emergency braking and lane departure warning have improved safety. But the organization says studies have not yet shown that driver-assistance systems provide similar benefits.

Part of the problem is that police and insurance data do not always indicate whether these systems were in use at the time of a crash.

The federal auto safety agency has ordered companies to provide data on crashes when driver-assistance technologies were in use within 30 seconds of impact. This could provide a broader picture of how these systems are performing.

But even with that data, safety experts said, it will be difficult to determine whether using these systems is safer than turning them off in the same situations.

The Alliance for Automotive Innovation, a trade group for car companies, has warned that the federal safety agency's data could be misconstrued or misrepresented. Some independent experts express similar concerns.

“My big worry is that we will have detailed data on crashes involving these technologies, without comparable data on crashes involving conventional cars,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies and was previously general counsel at an autonomous vehicle startup called nuTonomy. “It could potentially look like these systems are a lot less safe than they really are.”

For this and other reasons, carmakers may be reluctant to share some data with the agency. Under its order, companies can ask it to withhold certain data by claiming it would reveal business secrets.

The agency is also collecting crash data on automated driving systems — more advanced technologies that aim to completely remove drivers from cars. These systems are often referred to as “self-driving cars.”

For the most part, this technology is still being tested in a relatively small number of cars with drivers behind the wheel as a backup. Waymo, a company owned by Google's parent, Alphabet, operates a service without drivers in the suburbs of Phoenix, and similar services are planned in cities like San Francisco and Miami.

Companies are already required to report crashes involving automated driving systems in some states. The federal safety agency's data, which will cover the whole country, should provide additional insight in this area.

But the more immediate concern is the safety of Autopilot and other driver-assistance systems, which are installed on hundreds of thousands of vehicles.

“There is an open question: Is Autopilot increasing crash frequency or decreasing it?” Wansley said. “We might not get a complete answer, but we will get some useful information.”

ROGER KISBY — THE NEW YORK TIMES. Automakers and technology companies say they are making driving safer, but verifying these claims is difficult.
