Autonomous Car Fatalities: A Question of Liability

A 49-year-old woman died on March 18, 2018, after an autonomous Uber vehicle, a Volvo SUV, struck her while she was crossing a road in Tempe, Arizona. The facts are:

  • A “safety” driver was behind the wheel but took no measures to avoid the crash; the driver was found to be unimpaired.
  • The woman crossed at night, but not at a crosswalk.
  • No passengers were in the Uber vehicle at the time.
  • The SUV was part of Uber’s Phoenix-area self-driving vehicle testing program. After the collision, Uber halted all such testing in Tempe, Pittsburgh, San Francisco, and Toronto.

Preliminary police reports do not appear to implicate Uber or the safety driver. It should be noted that Arizona has the friendliest laws in the U.S. when it comes to testing and using autonomous vehicles.

Nagging Questions

But the accident opens up a number of questions regarding motor vehicle crash liability. Why didn’t the safety driver attempt to intervene by taking the wheel? Why didn’t the autonomous system slow or stop the car, which was traveling at approximately 38 miles per hour? Was there something inherently wrong with the car, or with the technology, that kept it from “seeing” the pedestrian?

Numbers from the National Highway Traffic Safety Administration tell us that driver error causes a staggering 94 percent of all motor vehicle crashes. That figure comes up whenever autonomous cars are discussed: imagine being able to prevent 94 percent of all crashes. Even if the entire 94 percent couldn’t literally be prevented, imagine being able to greatly reduce the number of crash injuries and deaths.

But software is far from perfect, and some experts have argued that driving requires so many decisions involving unpredictable human behavior that no computerized system could possibly get it right all the time. When the technology is wrong, it could be deadly wrong.

In cases where the autonomous driving technology, or some combination of the technology and human error, goes wrong, what happens when someone is seriously injured or killed? If a court case is brought, who might be found at fault? Would the case be one of negligence, as in a typical at-fault collision, or would the technology be to blame, bringing product liability law to the fore? (Product liability cases can impose liability without a finding of legal fault or negligence.)

Complicated Issues

One California attorney believes that the issue of negligence would come into play with autonomous vehicles. Sergei Lemberg thinks a number of parties in the Tempe crash could be held liable, including Uber, Volvo, the makers of the autonomous driving technology, and the safety driver who appears to have done nothing to prevent the crash (according to available on-board video). Lemberg commented, “Suing all these parties would be my top thought right now.”

But others in the legal profession believe that product liability laws could best guide us in determining fault in autonomous vehicle cases. UCLA law professor John Villasenor argued in a 2014 paper published by the Brookings Institution that “[p]roducts liability has been one of the most dynamic fields of law since the middle of the 20th century.”

Likewise, Bryant Walker Smith, an assistant professor of law and engineering at the University of South Carolina, thinks that, as driverless cars become more prevalent, fault questions in crash cases will shift away from negligence and toward product liability. Smith pointed out that drivers are legally required to exercise “due care” to avoid hitting pedestrians, regardless of where they cross; therefore, autonomous vehicle software must have the ability to “see” pedestrians, like the woman in Tempe, who cross outside a crosswalk.

Still others think that any product liability issues will focus specifically on the area of defective design. A defective design claim would need to demonstrate that an inherent failing in the product’s design made it unsafe to use.

As of now, there are no answers. And autonomous vehicles deeply bother roughly three-fourths of U.S. citizens. They may nonetheless be coming to Indiana via House Bill 1341, which was introduced in early 2018. It remains to be seen whether the fallout from the Uber crash will affect the future of driverless cars in Indiana.

Hand the worry over to us and let our resources back you up.

When you are considering hiring an Indianapolis vehicle accident lawyer, look for an attorney who will provide competent, compassionate representation with a “client first” approach. The attorneys of McNeely Stephenson have been successfully litigating personal injury cases in Indiana since 1982. We know how to conduct a thorough investigation into an accident’s causes. Our decades of representing those injured in crashes have helped us build a network of medical experts, economists, and others who can assist in documenting a victim’s injuries and financial losses. We will fight for your rights when you have been harmed in a crash on Indiana highways.

You can be assured that our attorneys, Mike Stephenson and Brady Rife, are willing to go the distance on behalf of your family or in the memory of your loved one. We offer free consultations and would like to discuss how we can be of service to you. Call us today or, if you prefer, use our confidential online contact form.