Tesla found partially liable in Florida Autopilot trial; jury awards $329 million in damages
A jury in federal court in Miami found Tesla partially liable for a fatal 2019 crash involving the company's Autopilot driver-assistance system. The jury awarded the plaintiffs $329 million in punitive and compensatory damages.
Neither the driver nor the Autopilot system braked in time to stop the car from running through an intersection, where it struck an SUV, killing a pedestrian. The jury assigned two-thirds of the responsibility to the driver and one-third to Tesla. (The driver was sued separately.)
The verdict came at the end of a three-week trial over the crash, which killed 20-year-old Naibel Benavides Leon and seriously injured her boyfriend, Dillon Angulo. It is one of the first major legal decisions on driver-assistance technology to go against Tesla; the company has previously settled lawsuits involving similar claims about Autopilot.
Brett Schreiber, the lead lawyer for the plaintiffs in the case, told TechCrunch in a statement that Tesla designed Autopilot only for controlled-access highways, yet "deliberately chose not to restrict drivers from using it elsewhere."
"Tesla's lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way," Schreiber said. "Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company's trillion-dollar valuation with self-driving hype at the expense of human lives."
Tesla said in a statement provided to TechCrunch that it plans to appeal the verdict "in light of the substantial errors of law and irregularities at trial."
"Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology," the company wrote. "To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot. It was a fiction concocted by plaintiffs' lawyers blaming the car when the driver, from day one, admitted and accepted responsibility."
Tesla and Musk have spent years making claims about Autopilot's capabilities that have led drivers to place excessive confidence in the driver-assistance system, a reality that government officials, and even Musk himself, have acknowledged for years.
The National Transportation Safety Board (NTSB) came to that conclusion in 2020 after investigating a 2018 crash in which a driver died after his car struck a concrete barrier. That driver, Walter Huang, was playing a mobile game while using Autopilot. The NTSB made a number of recommendations following its investigation, which Tesla largely ignored, the safety board later argued.
During a 2018 conference call, Musk said "complacency" with driver-assistance systems like Autopilot was the problem.
"They're just too used to it. That tends to be the issue. It's not a lack of understanding of what Autopilot can do. It's [drivers] thinking they know more about Autopilot than they do," Musk said at the time.
The trial took place as Tesla is in the middle of rolling out the first version of its long-promised robotaxi network, starting in Austin, Texas. Those vehicles use an expanded version of Tesla's more capable driver-assistance system, known as Full Self-Driving.
Update: This story has been updated to include the total compensatory damages.