When an Autonomous Vehicle Crashes, Who Pays for Damages?
The consistent prediction is that there will be fewer total crashes as AVs proliferate. But just how many fewer crashes will occur, and what will those crashes cost insurance companies?
August 16, 2018 at 06:06 PM
7 minute read
The original version of this story was published on Texas Lawyer
Autonomous vehicles are no longer the stuff of science fiction, so the chances of them occasionally failing and crashing are very much real—and so are the liabilities. Who will pay? Who will get sued? What causes of action will be alleged?
The automation of vehicles has been going on for longer than most of us realize. As far back as 1958, brochures for Chrysler Imperials trumpeted “Auto-Pilot,” described as “an amazing new device that helps you maintain constant speed and warns you of excessive speed.” That same year, an article in Popular Science opined that, “Like it or not, the robots are slowly taking over a driver's chores.” Anti-lock brakes have been available since the 1970s. Electronic Stability Control, a system that uses data from multiple sources to selectively apply the brakes on specific wheels of a vehicle to increase control on turns and slippery roadways, has been available since the 1990s. In many vehicles today, driver-assist systems automatically apply a vehicle's brakes if the vehicle's system determines that there is an imminent risk of collision with another vehicle, and automated parallel parking systems take over control of a vehicle as it is parked.
AVs navigate with a sophisticated array of systems and sensors, including onboard computers and software, global positioning systems, radar sensors that use radio waves, and LIDAR sensors that use light beams, among others. But AV systems do not always “see” well, and they do not always “think” well. There can be problems with the ability of AVs to track the center of the road well on roads that are poorly maintained or under construction, for example. Rain, snow, and other bad weather can create problems for AVs because LIDAR beams may reflect off of particles in the air, instead of reflecting off of obstacles, such as pedestrians and bicyclists. AV systems are programmed to “think” using certain assumptions about their surroundings and the probable actions of other drivers. When those assumptions are incorrect, crashes can occur. AV systems also do not react well to stationary objects that suddenly become moving objects. The AV system may disregard those objects, and instead “concentrate” on already-moving objects and on calculating their future trajectories and paths.
Predictably, AV crashes are now happening with some frequency. These crashes demonstrate that AVs are most dangerous when they are partially autonomous. Partially autonomous AVs can lull a driver or an attendant into a false sense that human vigilance is no longer required. Today, as we transition from partially autonomous vehicles to fully autonomous vehicles, drivers will continue to have obligations to stay alert and to assume control of the partially autonomous vehicle if necessary. It is an open question whether designers of AVs are up to the challenge of keeping these drivers vigilant and engaged.
In May 2016, a Tesla Model S being operated in autopilot mode crashed into a tractor-trailer rig in Florida, killing the Tesla driver. The Tesla system “tuned out” the long trailer across the Tesla's path because of the system's mistaken assumption that the trailer was a fixed object, such as a bridge. This March, a Tesla vehicle in autopilot mode crashed into a concrete highway divider in California and burst into flames, killing the Tesla driver. The driver had ceded control of the vehicle to Tesla's autopilot system before the system inexplicably steered the Tesla directly into the barricade. That same month, an autonomous Uber vehicle struck and killed a pedestrian crossing a roadway in Arizona. Dash-cam video showed the backup driver in the Uber looking down more than 200 times during the 12 miles leading up to the crash, and the driver's cellphone records showed that she had been streaming a television show on her phone for more than 40 minutes before the crash.
Traditionally, most lawsuits arising out of car crashes have involved one operator suing another operator for negligence. As AVs proliferate, almost any crash could become an expensive, time-consuming products liability lawsuit against the AV manufacturer and designer. When an AV crashes, possible defendants will include the operator of the AV, the manufacturer of the AV or its component parts, the developer of the AV's software, or all three. Plaintiffs in AV crash cases will have all of the same products theories at their disposal that plaintiffs traditionally have had in products suits. Plaintiffs may assert that the AV was defective as a result of manufacturing defects, design defects, marketing defects, misrepresentations, and breaches of implied and express warranties.
Within the literature, there is a raging debate about whether the existing legal framework is flexible enough to accommodate AVs. The Brookings Institution maintains that products liability law has proved to be remarkably adaptive to new technologies, and that it will likewise adapt to autonomous vehicle technologies. The RAND Corp., on the other hand, believes that Congress should consider pre-empting state-court remedies to avoid multiple inconsistent state law regimes, and to speed the introduction of AV technology.
One by one, the states have passed AV laws, but the laws generally have imposed only minimal requirements on AV manufacturers and developers. In June 2017, Texas Gov. Greg Abbott signed Senate Bill 2205, which requires driverless vehicles on Texas roads to be capable of complying with all traffic laws, equipped with video recording devices, and insured to the same extent as cars with human drivers. SB 2205 provides that the owner of the automated driving system “is considered the operator of the automated motor vehicle solely for the purpose of assessing compliance with applicable traffic or motor vehicle laws, regardless of whether the person is physically present in the vehicle,” and that “a licensed human operator is not required to operate a motor vehicle if an automated driving system installed on the vehicle is engaged.”
AVs will test automobile insurers. Today, most sources confirm that over 90 percent of crashes are caused by human error. As AVs increase in number, crashes caused by human error will decrease, and the number of crashes caused by defects in software and components will increase. The consistent prediction is that there will be fewer total crashes as AVs proliferate. But just how many fewer crashes will occur, and what will those crashes cost insurance companies? And how quickly will AVs and their associated safety benefits roll out? Some, including RAND, have called for “no-fault” automobile insurance, due to the expense and complexity of lawsuits against manufacturers and designers of AV systems. Others have criticized no-fault insurance out of concern that no-fault regimes fail to punish transgressions and fail to incentivize manufacturers to avoid making dangerous products.
While questions about legal liability, insurance coverage, and governmental regulation remain unanswered, federal pre-emption and other major changes to our existing tort system should not be necessary. The existing system has been adaptable enough to accommodate the horseless carriage and countless other new products and innovations over the years. It makes no sense to scrap such a durable, time-tested system simply because there may be some inconsistencies in outcomes and some bumps along the road as AVs become more prevalent.
Quentin Brogdon is a partner with Crain Lewis Brogdon in Dallas. He is a fellow in the International Academy of Trial Lawyers. His email is [email protected].