Bangkok Post

Tesla Autopilot probe in US looks at securities, wire fraud

- MIKE SPECTOR AND CHRIS PRENTICE

NEW YORK: US prosecutors are examining whether Tesla committed securities or wire fraud by misleading investors and consumers about its electric vehicles’ self-driving capabilities, three people familiar with the matter told Reuters.

Tesla’s Autopilot and Full Self-Driving (FSD) systems assist with steering, braking and lane changes, but are not fully autonomous. While Tesla has warned drivers to stay ready to take over driving, the Justice Department is examining other statements by Tesla and Chief Executive Elon Musk suggesting its cars can drive themselves.

US regulators have separately investigated hundreds of crashes, including fatal ones, that have occurred in Teslas with Autopilot engaged, resulting in a mass recall by the automaker.

Reuters exclusively reported the US criminal investigation into Tesla in October 2022, and is now the first to report the specific criminal liability federal prosecutors are examining.

Investigators are exploring whether Tesla committed wire fraud, which involves deception in interstate communications, by misleading consumers about its driver-assistance systems, the sources said. They are also examining whether Tesla committed securities fraud by deceiving investors, two of the sources said.

The Securities and Exchange Commission is also investigating Tesla’s representations about driver-assistance systems to investors, one of the people said. The SEC declined to comment.

Tesla did not respond to a request for comment. Last October, it disclosed in a filing that the Justice Department had asked the company for information about Autopilot and Full Self-Driving.

The Justice Department declined to comment.

The probe, which is not evidence of wrongdoing, could result in criminal charges, civil sanctions, or no action. Prosecutors are far from deciding how to proceed, one of the sources said, in part because they are sifting through voluminous documents Tesla provided in response to subpoenas.

Reuters could not determine the specific statements prosecutors are reviewing as potentially illegal. Musk has aggressively touted the prowess of Tesla’s driver-assistance technology for nearly a decade.

Tesla videos demonstrating the technology that remain archived on its website say: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

‘FAILURE IS NOT FRAUD’

A Tesla engineer testified in 2022, in a lawsuit over a fatal crash involving Autopilot, that one of the videos, posted in October 2016, was intended to show the technology’s potential and did not accurately portray its capabilities at the time. Musk nevertheless posted the video on social media, writing: “Tesla drives itself (no human input at all) thru urban streets to highway streets, then finds a parking spot.”

In a conference call with reporters in 2016, Musk described Autopilot as “probably better” than a human driver. During an October 2022 call, Musk addressed a forthcoming FSD upgrade he said would allow customers to travel “to your work, your friend’s house, to the grocery store without you touching the wheel.”

Musk is increasingly focused on self-driving technology as Tesla’s car sales and profit slump. Tesla recently slashed costs through mass layoffs and shelved plans for a long-awaited $25,000 model that had been expected to drive sales growth.

“Going balls to the wall for autonomy is a blindingly obvious move,” the billionaire executive posted on his social media platform X in mid-April. Tesla shares, down more than 28% so far this year, surged in late April when Musk visited China and made progress toward approvals to sell FSD there.

Musk has repeatedly promised self-driving Teslas for about a decade. “Mere failure to realise a long-term, aspirational goal is not fraud,” Tesla lawyers said in a 2022 court filing.

Prosecutors scrutinising Tesla’s autonomous-car claims are proceeding with caution, recognising the legal hurdles they face, the people familiar with the inquiry said.

They will need to demonstrate that Tesla’s claims crossed a line from legal salesmanship to material and knowingly false statements that unlawfully harmed consumers or investors, three legal experts uninvolved in the probe told Reuters.

US courts have previously ruled that “puffery” or “corporate optimism” regarding product claims does not amount to fraud. In 2008, a federal appeals court ruled that statements of corporate optimism alone do not demonstrate that a company official intentionally misled investors.

Justice Department officials will likely seek internal Tesla communications as evidence that Musk or others knew they were making false statements, said Daniel Richman, a Columbia Law School professor and former federal prosecutor. That is a challenge, Richman said, but the safety risk involved in overselling self-driving systems also “speaks to the seriousness with which prosecutors, a judge and jury would take the statements.”

FATAL CRASHES

Tesla’s claims about Autopilot and FSD have also drawn scrutiny in regulatory investigations and lawsuits.

Safety regulators and courts have raised concerns in recent months that corporate messaging about the technology — including the brand names Autopilot and Full Self-Driving — has imbued customers with a false sense of security.

In April, the Washington State Patrol arrested a man on suspicion of vehicular homicide after his Tesla, with Autopilot engaged, struck and killed a motorcyclist while the driver looked at his phone, police records show. In a probable-cause statement, a trooper cited the driver’s “admitted inattention to driving, while on autopilot mode ... putting trust in the machine to drive for him.”

In Washington state, a driver remains “responsible for the safe and legal operation of that vehicle” regardless of its technological capabilities, a state patrol spokesperson told Reuters.

The same month, the US National Highway Traffic Safety Administration launched an investigation into whether a Tesla recall of more than 2 million vehicles in December adequately addressed safety issues with Autopilot. NHTSA declined to comment. The recall followed a long-running probe opened by regulators after cars with Autopilot engaged repeatedly crashed into vehicles at first-responder emergency scenes.

Regulators subsequently examined hundreds of crashes where Autopilot was engaged and identified 14 deaths and 54 injuries.

Tesla disputed NHTSA’s findings but agreed to the recall, which employed over-the-air software updates intended to alert inattentive drivers.

The NHTSA investigation found “a critical safety gap between drivers’ expectations” of Tesla’s technology “and the system’s true capabilities,” according to agency records. “This gap led to foreseeable misuse and avoidable crashes.”

REUTERS: A Tesla Model 3 vehicle drives on Autopilot along a highway in Westminster, California, in 2022.
