Tesla Driver Faces Felony Charges After Killing Two While on Autopilot
In what appears to be the first case of its kind, the driver of a Tesla is facing two felony counts of vehicular manslaughter after his electric car, operating on Autopilot, struck another car and killed two people.
Kevin George Aziz Riad, 27, was behind the wheel of a Tesla Model S and was using the car’s Autopilot driving system, a feature that allows drivers to operate the vehicle hands-free. Riad was traveling at a high rate of speed when he exited the freeway and ran a red light, striking a Honda Civic at an intersection in a Los Angeles suburb. The two occupants of the Civic, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, both died at the scene. Riad and his female passenger were hospitalized with non-life-threatening injuries.
The National Highway Traffic Safety Administration later investigated the accident. NHTSA investigators determined that the Tesla’s Autopilot feature was in use at the time of the crash, although the charging documents do not mention Autopilot being engaged.
The popular car brand defended itself against allegations that its Autopilot feature is unsafe. “The Model S meets or exceeds all of Tesla’s internal standards as well as applicable industry standards, including but not limited to those promulgated by the American National Standards Institute,” Tesla shared.
Tesla has long touted its ambitious goal of offering innovative, revolutionary self-driving technology. However, the Autopilot driving feature has come under scrutiny over the past decade as drivers have seemingly come to rely on it entirely during their commutes.
There have been countless reports of drivers falling asleep behind the wheel while Autopilot drives the vehicle, as well as drivers lounging in the backseat, ditching the driver’s seat entirely, while their car is in motion. Tesla states on its website that its Autopilot features are “designed to assist” drivers, require “active driver supervision,” and “do not make the vehicle autonomous.”
This isn’t the first time Tesla’s Autopilot technology has raised safety concerns following a fatal crash. In 2019, a Florida driver was using the Autopilot feature when he bent down to pick up his cell phone. At that moment, his Tesla sped past a stop sign and blinking red traffic lights and crashed into another vehicle, killing a 22-year-old college student.
The families of Lopez and Nieves-Lopez have since sued Riad and Tesla in separate lawsuits. The families accuse Riad of negligence and Tesla of selling vehicles that can accelerate without warning and lack effective braking systems. The Lopez family alleges in its suit that Tesla should be held responsible because the vehicle “suddenly and unintentionally accelerated to an excessive, unsafe, and uncontrollable speed.” Tesla has yet to respond to the claims made in the lawsuit.
The lawsuits will test whether, and to what extent, a driver should be held responsible when a vehicle’s Autopilot feature is engaged. If Tesla is found liable for putting a dangerous high-performance vehicle on the road, the outcome could affect regulations surrounding self-driving vehicles and how drivers operate them.