What We’re Reading

  • Tesla hit with class action lawsuit over alleged privacy intrusion (Reuters, April 10, 2023)

    Following last week’s reports that Tesla employees routinely shared highly invasive videos and images recorded by customers’ car cameras, a California owner has filed a putative class action lawsuit against Tesla in the U.S. District Court for the Northern District of California, alleging that the routine sharing violated various state common-law and statutory protections. According to the reports, Tesla employees could see customers “doing laundry and really intimate things.” The lawsuit alleges that Tesla employees violated customers’ privacy by sharing recorded images and videos for their “tasteless and tortious entertainment” and “the humiliation of those surreptitiously recorded.”

    The plaintiff’s attorney, Jack Fitzgerald, commented that “Tesla needs to be held accountable for these invasions and for misrepresenting its lax privacy practices.” Citing Tesla’s “particularly egregious” and “highly offensive” conduct, the lawsuit seeks “to enjoin Tesla from engaging in its wrongful behavior, including violating the privacy of customers and others, and to recover actual and punitive damages.” The proposed nationwide class includes individuals who owned or leased a Tesla in the four years preceding the filing; the complaint also proposes a subclass of California residents who owned or leased a Tesla during that same period.

  • GM’s Cruise recalls 300 self-driving vehicles to update software after bus crash (Financial Post, April 7, 2023)

    GM’s Cruise autonomous vehicle unit has recalled 300 vehicles to update their software after one of its robotaxis rear-ended a San Francisco Municipal Transportation Agency bus last month. In documents filed last week with NHTSA, Cruise confirmed that the collision was caused by a software error that inaccurately predicted the movement of the “articulated” two-section bus as it pulled out of a bus stop. As has become increasingly common, Cruise carried out the recall via an over-the-air update, correcting a flaw in the autonomous driving software that could cause the system to “inaccurately predict the movement of articulated vehicles such as buses and tractor trailers.”

    “Fender benders like this rarely happen to our AVs, but this incident was unique,” Cruise CEO Kyle Vogt said in a blog post. “We do not expect our vehicles to run into the back of a city bus under any conditions, so even a single incident like this was worthy of immediate and careful study.”

    As widely reported, NHTSA opened a formal safety probe into Cruise’s robotaxis late last year after receiving multiple reports of incidents involving inappropriately hard braking and vehicle immobilization. NHTSA appears to be continuing to closely scrutinize the rollout of this nascent technology.

  • Who’s liable in a ‘self-driving’ car crash? (CWRU The Daily, April 11, 2023)

    CWRU School of Law Professor Cassandra Burke Robertson has published a law review article, “Litigating Partial Autonomy,” exploring the “legal dilemma” posed by autonomous vehicles. Robertson notes, “[i]f you ask automobile manufacturers ... they’ll tell you the driver is always fully responsible—even when supervised autonomy fails—because Advanced Driver Assistance Systems require constant human oversight, even when autonomous features are active.” But Robertson counters that “there’s enough blame for everyone” when autonomous systems are involved, potentially including the OEM; she hopes the legal system will evolve to fully address this shifting liability analysis while leaving room for innovation.
