
Tesla will soon face a jury over a fatal crash involving its Autopilot system. The trial, set to begin today in Miami, could produce the first jury verdict on the increasingly normalized driver-assistance technology.
The Elon Musk-led company has weathered several lawsuits over its Autopilot driver-assist system in recent years, all of which have been dismissed or settled. The latest suit was brought by the family of Naibel Benavides, a 22-year-old college student killed in an April 2019 crash involving a Tesla Model S sedan with an allegedly defective Autopilot system, and by her boyfriend, Dillon Angulo, who was severely injured in the same south Florida wreck. The two were standing outside a parked SUV when they were struck by the Tesla, which was driven by George Brian McGee.
Tesla contends that the Autopilot feature was not fully activated at the time of the crash, which occurred when McGee dropped his cellphone and reached down to find it, smashing into the parked SUV and the pedestrians beside it, according to case documents reviewed by the New York Times. "The evidence clearly shows that this crash had nothing to do with Tesla's Autopilot technology. Instead, like so many unfortunate accidents since cellphones were invented, this was caused by a distracted driver," a Tesla spokesperson told the Times. McGee was allegedly driving nearly 62 miles per hour in a 45 m.p.h. zone and pressed the accelerator before the crash, overriding Autopilot's cruise control.
But the plaintiffs argue that the crash should have been prevented by Tesla's advertised attentiveness features and automatic emergency braking system. According to video obtained from the vehicle's computer, the Autopilot system recognized the presence of the parked car and at least one person, but did not activate the brakes or alert the driver to the obstacles. The plaintiffs' legal team plans to depose Tesla Autopilot engineer David Shoemaker and two other employees.
For years, Musk has repeatedly claimed that Tesla's Autopilot features, including those powering the company's new fleet of autonomous vehicles, are empirically safe for drivers and pedestrians. Experts who study autonomous vehicle safety, however, aren't as quick to declare the data sound or the technology ready for widespread use, citing continued issues with unexpected driving behaviors such as unannounced disengagements, roadblock confusion, and phantom braking. Popular "robotaxi" companies Waymo and Zoox are also under federal scrutiny; in 2024, the NHTSA began investigating 22 reported incidents involving Waymo vehicles.
In June, Tesla asked a judge to block the public disclosure of vehicle crash data in a case involving the Department of Transportation's National Highway Traffic Safety Administration (NHTSA), arguing that disclosure would threaten its competitive advantage in the market. Beyond repeated incidents of Tesla vehicles and robotaxis behaving unpredictably, analysts have accused the company of obscuring and de-contextualizing safety data in its Autopilot Safety Report.