Auto-navigate needs work
A series of fatal crashes highlights issues with Tesla’s most promising feature
We are all Tesla’s beta testers. Several of Tesla’s vaunted automated navigation features are labeled on its cars’ touch screens and in owner’s manuals as being in a “beta” phase, including “Autosteer,” “Navigate on Autopilot” and “Traffic Aware Cruise Control.”
In other words, Tesla — the focus of three deadly crashes in South Florida — remains a work in progress.
Despite their premium pricing, the speedy, high-tech cars are a hot commodity and growing rapidly within their home state of California and beyond. Sales tripled in Florida and nearly quadrupled in the U.S. between 2017 and 2018 as the company introduced more affordable models and ramped up production.
Reviewers rave about the cars’ power and ease of use and forgive growing pains that have earned the brand low reliability scores from Consumer Reports. Yet Teslas also consistently earn five-star safety ratings in crash tests by the National Highway Traffic Safety Administration and five-star owner loyalty scores from Consumer Reports.
In an April 2018 review of Tesla’s newest offering, the Model 3, autoblog.com writer John Beltz Snyder wrote that the car “is proof that EVs [electric vehicles] — even the relatively affordable ones — are far more than just appliances. This is a car that stirs up emotions when driving it, and the fact that it does that well is a great thing, not just for customers, but for the entire image of clean cars.”
It’s because of that enthusiasm, and the car’s status as the vanguard of an all-electric, network-driven, automated transportation future, that so much scrutiny is paid to minor glitches and spectacular failures alike.
A car unlike any other
Formed in 2003, Tesla set out to create a company and car unlike any other. Instead of gas-powered engines or gas-and-electric hybrids, Tesla’s cars would be run by pure electric propulsion, which requires a simpler mechanical architecture and costs far less per mile to run.
Tesla modeled itself more like a software company than a carmaker, charging premium prices for features that can be activated through updates downloaded over the Internet, including Autopilot and, in the base Model S, expanded battery capacity for longer driving range between charges.
Autopilot and its connected features make Teslas semi-autonomous: their sensors and cameras can read road markings and surrounding vehicles and perform functions such as changing lanes, applying the brakes, and following the straight lines and curves of roads. But driver vigilance is still required for tasks the system can’t handle, and despite Tesla’s promise to unveil “full self-driving” capability later this year, experts say a fully autonomous Tesla, or any car, that doesn’t require its occupants to pay attention to the road is still years away.
After introducing its first car in 2008, the high-priced, low-volume Roadster, the company entered the sub-$100,000 market in 2012 with the Model S sedan. The Model X SUV was introduced in 2015, followed by the compact and more affordable Model 3 in 2017.
Even as its appeal grows in Florida, Tesla’s market share here is still minuscule.
While new Tesla registrations nationwide increased from 46,001 in 2017 to 163,771 in 2018, sales in Florida grew at a lesser rate — from 2,866 in 2017 to 8,797 a year later, according to research firm IHS Markit. That’s fewer than one out of every hundred of the 1.3 million new vehicles registered in Florida in 2018, the firm’s data shows.
One thing Tesla owners like: The cars are sports-car fast, able to go from zero to 60 mph in around five seconds. That can lead to trouble: three-quarters of 9,000 Tesla owners in the Netherlands were fined for speeding in 2017, compared with 28 percent of gas-powered car drivers, the electric vehicle website Electrive.com reported.
South Florida crashes
In South Florida, authorities and presumably Tesla are still investigating the three fatal crashes, one involving excessive speed, and two involving flaming batteries. Whether Autopilot played a role in any of them remains unclear.
May 2018, Fort Lauderdale: Two high school students were killed and another was injured after the 2014 Model S they were in went out of control on a curve at more than 100 mph. It twice struck a concrete wall and then a light post. A witness said the car burst into flames after the second collision. Small portions of the car’s lithium ion battery broke apart from the vehicle, which reignited twice after the crash, investigators found.
The owners of the car had it modified so it could not travel faster than 85 mph, but the device that would limit the car’s speed was removed by an employee at a Tesla dealer without the owners’ consent, according to a lawsuit filed in January. The question remains: Why did the battery rupture and explode?
February 2019, Davie: A 2016 Model S left the road “for an unknown reason” on the afternoon of Feb. 24, swerved through three lanes of traffic, hit a median and a palm tree and burst into flames, Davie police said. The driver was trapped inside and died. Witnesses said the driver was speeding, but a police report stated the car had been traveling at the 50 mph speed limit. Whether the driver was using the car’s advanced driver-assistance system, or “Autopilot,” may not be known for months. As in the Fort Lauderdale crash, the car’s battery reignited several times, despite the company’s insistence that its cars are 10 times less likely to catch fire than gas-powered cars.
March 2019, west of Delray Beach: A 2018 Model 3 traveling south on State Road 7 in west Delray on March 1 slid under a tractor-trailer that was turning north onto the divided highway. It’s not yet known whether the Tesla’s Autopilot or automatic emergency braking system was engaged at the time, but the crash evoked comparisons to a fatal May 2016 collision in Williston, Fla., near Gainesville, between a Model S operating under Autopilot and a tractor-trailer that had pulled into the driver’s path.
The crashes in Fort Lauderdale and west Delray sparked investigations by the National Transportation Safety Board, which also investigated the 2016 Williston crash. The west Delray crash is under investigation by NHTSA as well, which has authority to create safety regulations, develop education programs and issue recalls.
The NTSB, though better known for probing crashes involving airplanes, pipelines, ships and trains, also has authority to investigate roadway crashes and issue non-binding recommendations based on its findings. According to its website, the agency “investigates select highway crashes that can advance knowledge of broad or new safety issues.”
NTSB investigations of recent Tesla crashes involve two such areas of interest — the cars’ automated vehicle control systems and battery fires.
Two of the crashes involve cars in California reportedly operating under Autopilot. One happened in March 2018 in Mountain View, when a Model X SUV left the road and crashed into a highway guardrail, killing the driver. The other involved a Model S that rear-ended a stopped fire truck in Culver City in January 2018.
The battery fire investigations, in addition to the one that followed the 2018 Fort Lauderdale crash, also stem from crashes in California. In June 2018, the driver of a 2014 Tesla Model S in West Hollywood safely escaped before the battery pack caught fire, engulfing the vehicle. The other fire occurred in August 2017 in Lake Forest when the driver lost control of his car and slammed into the garage of an elderly couple’s home.
NTSB spokesman Christopher O’Neil said the investigations are focused on issues connected to electric vehicles and not on Tesla or any specific manufacturer.
An investigative report on the challenges that electric vehicles’ battery fires pose for first responders could be released by early fall, O’Neil said. “I would assume there are going to be safety recommendations in that report,” he said.
Another report on the performance of driver-assist technologies, drawing lessons from the Mountain View and Culver City accidents, could be released by the end of the year, he said. Whether findings from the west Delray crash will be included is not yet known, he said.
The NTSB’s final report on the 2016 Williston crash focused on limitations of the Autopilot system and drivers’ responsibilities to remain alert and restrict use of the system to limited access, divided highways.
While Autopilot can recognize and brake for slow, stopped and decelerating vehicles traveling ahead of a Tesla in the same lane, it is not designed to react to vehicles, such as the tractor-trailer, crossing or making left turns across the Tesla’s path, the report said.
So while the collision did not result from a malfunction of Autopilot, Tesla
failed to implement safeguards to prevent the car’s driver from becoming disengaged and over-relying on Autopilot, the NTSB said.
Tesla’s owner’s manual, the NTSB report notes, recommends that drivers use Autopilot on limited-access roadways: typically divided roads, such as interstate highways, that have no intersections or crossroads and whose entry ramps ensure all vehicles are traveling in the same direction.
But availability of Autopilot is not restricted to such roads, so nothing stopped the driver in the Williston crash from using the system on U.S. Highway 27A, which is not a limited-access road.
After the crash, Tesla modified Autopilot to alert drivers more frequently to keep their hands on the steering wheel. After three warnings, the car must be restarted to reengage Autopilot.
That wasn’t enough for the NTSB, which recommended that makers of cars with semi-autonomous driver systems limit use of such systems to “conditions for which they were designed.”
“Joe and Suzy Public,” wrote NTSB board member Christopher A. Hart in the Williston report, “may conclude from the name ‘autopilot’ that they need not pay any attention to the driving task because the autopilot is doing everything.”
Tesla has not announced any action taken to comply with the NTSB’s recommendation.
Wireless ‘black boxes’?
The company also declined to answer any questions from the South Florida Sun Sentinel about any crashes involving its vehicles, including whether any Tesla crash has been found to result from malfunction of its Autopilot or operating software.
Among those questions was whether logs of actions by drivers and the cars stream to the company’s servers at all times, giving it instant access to critical pre-crash information even if a car’s computer is destroyed by fire.
After the May 2018 Fort Lauderdale crash, police recovered the car’s “restraint control module,” which controls airbag deployment and records the driver’s speed and behaviors leading up to the crash.
But after the Davie crash, police determined the fire rendered the car’s data storage units unreadable, meaning the only way authorities can determine whether the driver was speeding or using Autopilot when his car left the road would be if that data was streamed to Tesla’s servers before the crash.
Tesla: Safest cars ever tested
In lieu of answering questions about safety measures it has undertaken, Tesla provided links to internal blogs and information sheets asserting that its driver assistance technology helps ensure the safety of everyone on the road.
“And because every Tesla is connected, we’re able to use the more than 10 billion miles of real-world data collected by our global fleet — of which more than 1 billion have been driven with Autopilot engaged — to constantly improve our products,” one of the blogs states.
Saying that media reports of its crashes were overblown, Tesla last fall began releasing quarterly crash frequency statistics for its cars. In the fourth quarter of 2018, just one accident was recorded for every 2.91 million miles driven with Autopilot engaged compared to one accident for every 436,000 miles for all cars nationwide, its blog says.
In crash tests dating to 2013, Teslas have achieved perfect five-star ratings from NHTSA, prompting Tesla to claim on its blog that its Models S, X and 3 are the safest cars ever tested by the agency. NHTSA last fall disputed Tesla’s “safest-ever” claim, saying it went beyond the scope of its analyses.
Tesla’s claims about the safety of its battery compartments might raise an eyebrow among first responders to two of the South Florida crashes.
After the Fort Lauderdale crash, the company sent the South Florida Sun Sentinel a previously released statement saying its vehicles are 10 times less likely to experience a fire than a gas car.
It said that when fires do occur, they are “safer” because its battery packs consist of thousands of individual cells, similar to AA batteries used in consumer electronics, divided into groups placed in separate modules with firewalls between the modules and “then again between the battery pack and the passenger compartment.”
Each module is sealed to prevent gas and heat from igniting its adjacent module, so that “in the rare circumstance a fire occurs, it spreads much more slowly than in a gas fire, allowing more time for occupants to escape the vehicle,” the statement said.
Yet in the Fort Lauderdale and Davie crashes, fires quickly engulfed the vehicles. In the Davie crash, the flames prevented any attempt by rescue workers to extract the driver.
A witness to the Fort Lauderdale crash said the car immediately burst into flames after the second of two collisions with walls surrounding homes. The intensity of the fire prevented bystanders from getting within 10 feet of the vehicle to help the two teens trapped inside, the witness said.
Glitches and scares
While Tesla posts testimonials to its cars’ safety on its blog, the user forum section of the company’s website also includes anecdotes from Tesla owners recounting not-so-safe experiences while driving the car.
One described her car swerving into the left lane and back again while on Autopilot and Auto Lane Change mode. “It really scared me,” she wrote.
Another described the car’s touchscreen display, where nearly all functions are controlled, suddenly going black and rebooting while the car was in operation.
In a 2016 post, an owner described “two terrifying close calls” while using Autopilot.
In one, “we were traveling 70 mph on a straight stretch of highway and started to approach a traffic jam that was absolutely stopped.” But the car was not stopping or slowing down. “All of a sudden the alarm sounds and Autopilot disengages and [I] have to slam on the brakes to stop from crashing into the stopped cars.” The second time, the car began to turn into the left lane and almost struck another car next to it. “This thing is freaking me out,” the owner wrote.
While Tesla warns customers to be vigilant while using Autopilot, some critics question whether drivers can be realistically expected to maintain constant vigilance when giving over so much control.
In a May tweet, Benedict Evans, a partner in the venture capital firm Andreessen Horowitz, which invests in technology, said the “2/3 approach” to autonomous cars followed by Tesla, “where the human isn’t driving but might have to grab the wheel AT ANY TIME, is actively dangerous and a technical dead end.”
In early March, a Florida man on vacation in Los Angeles was shocked at what he saw inside a Tesla rolling next to him on the highway at 75 mph:
In a video the man posted on Twitter, the Tesla driver was asleep.