Pittsburgh Post-Gazette

The problem with self-driving cars? Many don’t drive themselves

The National Highway Traffic Safety Administration released a report this month on crashes involving vehicles with automated technology. Self-driving cars may not really be the problem — the problem is cars that don’t drive themselves but manage to convince the drivers that they do.

The report includes data collected over a 10-month period following an order last summer that required automakers to report incidents involving cars with advanced driver-assistance systems. Fully autonomous vehicles, such as those operated by Google spinoff Waymo or General Motors-controlled Cruise LLC, ended up in 130 crashes, most of them occurring when the car was struck from behind; 108 resulted in no injuries and only one in a serious injury. Meanwhile, cars with partially automated systems experienced nearly 400 crashes. (NHTSA did not provide the total number of hours or miles driven.) Six people died and five were seriously injured. A previous crash in a Tesla Model S ended in a fire that took four hours and more than 30,000 gallons of water to put out.

The study is a reminder not only that the fully self-driving future many people imagine is a long way off, but also that a present in which cars can perform some functions traditionally reserved for humans can itself prove dangerous. The NHTSA also recently upgraded a probe of Tesla Autopilot to an engineering analysis; investigators are examining the feature’s responsibility for repeated collisions with parked emergency vehicles such as ambulances and police cruisers — vehicles that drivers should have been able to see about eight seconds before impact, yet took no action to avoid until two to five seconds before impact.

The issue, it appears, may not be merely that automated systems themselves have flaws but also that drivers are relying too heavily on systems that aren’t designed to do all the work without human input. After all, when something is called “full self-driving” it’s easy to expect, consciously or subconsciously, that it will fully drive itself. Even when software supposedly requires drivers to pay attention, the fact that a car can take care of some things can lull people into thinking the car will take care of all things — or into relaxing more generally, so that if something does go wrong they are unprepared to respond. This is what the NHTSA means when it says it will examine whether Tesla Autopilot “may exacerbate human factors or behavioral safety risks.”

So far, there’s no data to show whether partial automation features make driving safer or less safe. The NHTSA could certainly try to make the former more likely by imposing minimum performance standards in addition to restrictions on terminology that exaggerates a vehicle’s capabilities. But drivers themselves would do well to remember that the era of self-driving cars for the most part hasn’t yet begun — even when they’re at the wheel of a vehicle that does some of the work for them.
