
The Road to the Cars That Drive Us Still Congested With Legal Liabilities

by Haley Larkin | Dec 14, 2021
A person interacting with the touchscreen interface of a Tesla vehicle, showcasing autonomous driving features. Photo Source: Adobe Stock Image

In early October 2021, Tesla CEO Elon Musk tweeted that “FSD Beta 10.2,” the latest version of the company’s “self-driving” software, would be rolled out to 1,000 of Tesla’s top-rated owners based on their safety scores. As more companies in the auto industry turn to fully autonomous vehicles and enhanced autonomous features, legal concerns over liability remain at both the state and federal levels.

According to the National Highway Traffic Safety Administration (NHTSA), fully autonomous vehicle technology makes up the fifth era of safety in the transportation industry. The first era began in the 1950s with the advent of antilock brakes, seat belts, and cruise control. The early 2000s then brought blind spot detection, forward collision warning, and lane departure warning. From 2010 through 2016, the industry developed rearview video systems and automatic emergency braking. Most recently, newer models on the road can park themselves, assist drivers in traffic jams, and use adaptive cruise control. The NHTSA estimates that from 2025 onward, cars will have fully automated safety features and highway autopilot.

The NHTSA estimates that 94% of all serious crashes are due to human error or intent, casting a favorable light on the arrival of automated vehicles and their potential to save lives. The agency argues that on roads “filled with automated vehicles,” traffic will actually flow more smoothly, eventually reducing congestion altogether. A 2015 study by McKinsey & Company showed that reduced traffic from automated vehicles could free up as much as 50 minutes each day per person.

While there is hope that more autonomous vehicles on the road will mean fewer car accidents, this doesn’t mean accidents won’t ever happen, nor does it mean there won’t be a learning curve as more states allow testing on their roads. When an autonomous vehicle collides with another car, pedestrian, bicyclist, etc., some experts believe liability shifts from the vehicle’s owner (usually the driver) to the car’s manufacturer.

As more companies build advanced autonomous features into their vehicles, their ability to test and deploy these designs still depends on states passing legislation to govern design, testing, and deployment. Some companies, like Tesla, still require their beta testers to keep their hands on the wheel, creating a gray area of fault should the car collide with another vehicle or a person.

As Palmdale auto accident injury lawyer Paul Kistler points out, the laws regarding driverless or self-driving vehicles are relatively new and have not been comprehensively enacted or definitively tested in court. “What we can look at,” says Kistler, “are the current laws and analogize them to the technological application of our modern world.” Kistler points to California Vehicle Code section 38750, which requires self-driving vehicles to have a safety alert system that sets off an alarm or warning when the self-driving technology is malfunctioning. “This feature would enable the human driver to act as ‘co-pilot’ and resume manual driving functions,” Kistler explains. An apt analogy is an airplane on autopilot; the human pilot and co-pilot are ultimately responsible if an aviation accident occurs. Therefore, Kistler concludes that “absent a showing that the alarm system malfunctioned, a Tesla beta tester could reasonably be liable for a vehicle accident even if it was in full automation mode at the time of the crash.”

The analysis of an auto accident would have to consider the conduct of all vehicles involved regardless of whether or not they were operated by a human. If an AV vehicle was shown to be operating in accordance with reasonable driving standards, the law should not impose a greater duty regardless of the operator.
— Paul Kistler, Personal Injury Attorney

Generally, states establish their own liability and insurance rules. In September 2016, the Department of Transportation released its “Federal Automated Vehicles Policy.” The document delivered general performance guidance for manufacturers on best practices for design and testing.

The federal policy put most of the responsibility on state governments to facilitate the safe deployment of this technology within their jurisdictions. The “Model State Policy” outlined by the Department of Transportation “confirms that States retain their traditional responsibilities for vehicle licensing and registration, traffic laws and enforcement, and motor vehicle insurance and liability regimes.” Furthermore, the policy puts the onus on states to “consider how to allocate liability among owners, operators, passengers, manufacturers, and others when a crash occurs.”

In 2016, Michigan became the first state to pass legislation addressing self-driving cars, enhanced autonomous vehicle technology, and insurance. Departing from the norm, its policy holds the manufacturer, rather than the owner of the vehicle, at fault for a collision.

Presently, 38 states and the District of Columbia have enacted some form of legislation or executive order pertaining to autonomous vehicles (AVs). Five of those states have authorized the study of AVs, 12 have authorized testing, and 16 states and the District of Columbia have authorized full deployment. Eighteen states allow testing without a human operator, and four states regulate truck platooning.

In June 2021, California approved Cruise, a subsidiary of GM, to transport riders without a human operator in a test of its autonomous vehicles. The permit is the first of the approvals a company must obtain in California before it can deploy autonomous vehicles commercially.

Supporters of autonomous vehicles claim that crashes would be virtually nonexistent if all vehicles were autonomous. Until that time, even if an AV is performing as it should, a negligent human driver could collide with it. In that situation, would an injured occupant of the AV pursue a negligence claim against the at-fault driver as in any other accident? Could the negligent human driver claim the AV driver was also partly at fault for not maintaining control of the vehicle?

Attorney Kistler points out that “comparative fault” states like California apportion liability according to each driver’s degree of negligence that contributed to the accident. Under California’s pure comparative negligence rule, for example, an injured driver found 30 percent at fault for a $100,000 loss could still recover $70,000 from the other driver. “The analysis of an auto accident would have to consider the conduct of all vehicles involved regardless of whether or not they were operated by a human,” says Kistler. He continues, “If an AV vehicle was shown to be operating in accordance with reasonable driving standards, the law should not impose a greater duty regardless of the operator.”

Haley Larkin
Haley is a freelance writer and content creator specializing in law and politics. Holding a Master's degree in International Relations from American University, she is actively involved in labor relations and advocates for collective bargaining rights.
