What We’re Reading
Tesla engineer defends ‘Full Self-Driving’ name at crash trial (Reuters, October 5, 2023)
In the first U.S. trial over allegations that Tesla’s Autopilot feature led to a fatal crash, a Tesla engineer rejected opposing counsel’s suggestion during testimony last week that Tesla intentionally named its driving system “Full Self-Driving” to deceive the public into believing it had more autonomous capability than it actually did. The ongoing trial, in a California state court, arises from the alleged malfunction of a Tesla Autopilot system, which is claimed to have caused a Model 3 to veer suddenly off a highway east of Los Angeles, strike a tree, and burst into flames.
“Do I think our drivers think that our vehicles are autonomous? No,” the engineer said, according to a trial transcript reviewed by Reuters. When pressed further, however, the engineer conceded that Tesla vehicles sold in 2019 might contain latent software defects affecting the system’s ability to identify vehicles and avoid crashes. It will be interesting to see how this testimony affects the ultimate verdict.
Side Note: While coverage of this trial has been surprisingly sparse, according to The Washington Post (below), lawyers for Tesla also have argued to the California jury that Autopilot “is basically just fancy cruise control.”
The final 11 seconds of a fatal Tesla Autopilot crash (The Washington Post, October 6, 2023)
A second Tesla trial that was scheduled to begin this week in Florida has been delayed while the court considers Plaintiff Kimberly Banner’s request to seek punitive damages. Banner claims that Tesla is liable for the 2019 death of her husband, Jeremy Banner, who was killed when his Model 3 drove under the trailer of a semi-truck at 70 mph. It is undisputed that the Model 3 was operating in Autopilot at the time of the crash, and that the system failed to warn of the crossing semi or apply the brakes.
Based on an extensive review of the docket and other public information, The Washington Post reconstructed the Banner crash in this report, and found that “braking just 1.6 seconds before impact could have avoided the collision.”
As usual, Tesla’s defense turns on its user manuals and disclosures of the system’s “limitations.” But, as The Post notes, the trial will certainly focus on Tesla’s (and Musk’s) public representations about its “self-driving” technology and its failure to restrict the use of Autopilot to certain locations and conditions. “The outcome could prove critical for Tesla, which has pushed increasingly capable driver-assistance technology onto the nation’s roadways far more rapidly than any other major carmaker. If Tesla prevails, the company could continue deploying the evolving technology with few legal consequences or regulatory guardrails.”
Study: Consumer Trust In Self-Driving Vehicles Low and Sinking (Kelley Blue Book, October 5, 2023)
A new survey conducted by J.D. Power suggests American drivers remain highly skeptical of autonomous vehicles, although confidence improves with exposure to the technology. Key survey findings include:
“Consumer readiness for automated vehicles” scored a failing grade at just 37%, a two-point decline from last year.
At the same time, respondents who had ridden in a robotaxi scored 67% for consumer readiness.
Respondents overwhelmingly do not understand the differences between the five levels of automation, with researchers finding that “[t]here is no distinction in the activities that consumers are willing to do in a vehicle (e.g., talking, texting, online searching) as the level of automation increases.” Underscoring this finding, 22% of respondents incorrectly indicated that Teslas are fully automated.