What We’re Reading
June 15, 2022
New report offers insight into autonomous vehicle crashes (CNN Business, June 15, 2022)
NHTSA has just released nine months of data from crashes involving advanced driving technologies, gathered in response to its 2021 Standing Order requiring reporting of such data by OEMs and operators. Findings from the data include:
367 crashes involving vehicles using driver-assist technologies occurred over the nine-month period, 273 of which involved Tesla's Full Self-Driving (FSD) or Autopilot (AP) systems.
Of 130 crashes involving fully automated systems, Waymo vehicles accounted for 62, Transdev (a shuttle operator) accounted for 34, and GM’s Cruise accounted for 23.
Of the 497 total crashes reported, 43% occurred in California.
The data set included six fatalities and five serious injuries.
During a briefing in advance of the release, NHTSA Administrator Steven Cliff commented: “I would advise caution before attempting to draw conclusions based only on the data we’re releasing. In fact, the data alone may raise more questions than they answer.” For example, as this article notes, “the data lacks critical context like fleet size or the number of miles traveled, making it impossible to fairly compare the safety of the different technologies.”
Read more media coverage here, here, and here. Find the NHTSA release here. According to the release, NHTSA plans to provide data updates monthly going forward.
830,000 Teslas with Autopilot under NHTSA Investigation, Recall Possible (Car and Driver, June 11, 2022)
In a related story, NHTSA announced last week that it would upgrade its Preliminary Evaluation of Tesla’s Autopilot system to an Engineering Analysis. NHTSA started its investigation last August in response to multiple incidents where Tesla vehicles operating on Autopilot collided with stopped first responder vehicles. Several of these collisions were fatal. According to numerous reports, including this one, the upgraded NHTSA investigation signals a possible recall that could impact hundreds of thousands of Model S, Model X, Model 3, and Model Y vehicles of varying model years.
In announcing the upgraded investigation, NHTSA “said it has so far investigated 16 crashes and found that Autopilot only aborted its own vehicle control, on average, ‘less than one second prior to the first impact’ even though video of these events proved that the driver would have been able to see the potential incident an average of eight seconds before impact. NHTSA found most of the drivers had their hands on the wheel (as Autopilot requires) but that the drivers did not take evasive action in time. In four of the crashes, or 25 percent, the Tesla did not issue any ‘visual or chime alerts at all during the final Autopilot use cycle.’”
Myths about autonomous driving (Audi MediaCenter, June 3, 2022)
The focus of Audi’s “&Audi initiative” is building “trust in autonomous driving and the future of work in the age of AI.” The initiative recently released this report to address eight specific myths around autonomous driving. The report is derived from the initiative’s lengthier 2021 “SocAIty Study,” compiled through the collaboration of nineteen industry experts and entitled “Autonomous Driving on the Road to Social Acceptance.”
Among other things, this recent report explains that self-driving cars will not be “like normal cars, just without drivers;” instead, “design will focus on the interior in the future. Passengers’ comfort will be a priority, which is why their seats will no longer necessarily face in the direction of travel in certain use cases. This freedom of interior design will offer those on board a wide array of options for individually customizable experiences: communication or relaxation, work or retreat.” The report also notes the design changes that will be necessary in our driving environments, responding to a myth about where autonomous cars will be able to travel. Finally, the report dispels the myth that “self-driving cars will make driving less fun.”
To explore the report’s responses to all eight myths, click on the link above.