One of the most complex legal issues we will face with the rollout of autonomous vehicles is assigning liability when an accident occurs. In an accident involving two human-operated vehicles, the question of fault usually has a straightforward answer: driver negligence, such as speeding, distracted driving, or intoxication. With autonomous vehicles, the question of fault will shift toward the vehicle's manufacturer, its software vendors, and its data providers. In thinking about this challenge, I anticipate a hybrid liability system in which product liability is melded with traditional negligence. Courts will have to grapple with difficult questions: Did the vehicle have a design defect? Was there an error in the software? Did a sensor malfunction? Was the vehicle misused? And how do we apportion fault when the "driver" did not actively drive for most of the trip but was still expected to respond appropriately if the vehicle encountered an emergency? A particular challenge I foresee involves access to vehicle data. Because black-box data recording what the autonomous vehicle observed, how it behaved, and whether it failed will be critical evidence, manufacturers may decline to disclose it, citing trade secrets or proprietary concerns. I expect new statutes or a new line of cases will need to address how injured persons or their representatives gain timely access to this information. Existing mechanisms, such as the discovery tools used in traditional product liability litigation, could be a useful starting point for this new avenue of injury recovery.
We're still arguing fault in basic rear-end collisions, and now we're facing accidents where the "driver" is an algorithm written by a third-party vendor that doesn't even appear on the vehicle title. One of the biggest challenges will be proving machine error in a human courtroom. These cases may fall under product liability statutes, but when an autonomous vehicle crashes, fault could rest with the manufacturer, the software developer, the fleet operator, or all three. And none of them are eager to share source code or telemetry data. They'll argue it's proprietary or irrelevant, which puts victims at a huge disadvantage. We'll need a new kind of discovery process that includes data audits, code analysis, and likely court-appointed tech experts to interpret what went wrong. Until that becomes the norm, the burden of proof will be unfairly high. For many victims, it may be too high a bar to clear.
I think insurers will adapt to autonomous vehicles by drafting policy exclusions for drivers who use them. There will most likely be separate, and expensive, policies covering autonomous vehicles that drivers and companies can opt into. I don't believe many drivers will opt in, leaving gaps in insurance coverage. As a result, I foresee a wave of personal injury claim denials, based on policy exclusions, for people involved in accidents with autonomous cars, similar to what we see with drivers who use their personal vehicles for rideshare services. It will then fall to the injury victim's uninsured motorist coverage to step in and pay for their damages.
As autonomous vehicles become more common, I think the legal system's going to have to shift from focusing on driver responsibility to software and manufacturer accountability. One big challenge I see is figuring out who's liable when an AI-driven car causes an accident—is it the owner, the software developer, or the automaker? Unlike traditional accidents, there may not be a human error to point to. That means courts will need new frameworks to assess fault, especially when decisions are made by algorithms in real time. I believe we'll start seeing more cases where liability hinges on how well the AI was trained and whether the manufacturer took reasonable steps to prevent harm. It's a whole new ballgame, and the legal system's going to need to catch up fast.
A significant legal issue that I envision is liability in autonomous vehicle accidents. Unlike a conventional car crash, where the driver is usually to blame, autonomous vehicles implicate three additional parties: the car manufacturer, the software developer, and possibly the owner. The law will have to establish guidelines for determining who is liable, the technology provider, the human driver (assuming there is one), or some combination of both. I expect new statutes and case law to develop shared liability models, along with compulsory data logging so that events can be replayed and fault apportioned accurately.