Brave New World of (Robot) Law
If your robot hurts someone, are you liable? Legally speaking, is a robot like a pet? An employee? Should robots have rights? In a new report, the U.S. Chamber of Commerce contemplates not-so-distant questions about robot law.
April 19, 2018 at 11:43 AM
The original version of this story was published on Litigation Daily
It's the stuff of science fiction made real: robot law.
If your robot hurts someone, are you liable? Legally speaking, is a robot like a pet? An employee? Should robots one day have a form of legal status that makes them responsible for their own actions?
The U.S. Chamber Institute for Legal Reform on Wednesday released a 93-page report, Torts of the Future II: Addressing the Liability and Regulatory Implications of Emerging Technologies. It dives deep into future liability trends involving virtual and augmented reality, wearable devices and 3D printing. But the section that struck me as most interesting—and most immediate—concerns robots and artificial intelligence.
“As autonomous robots and other products with AI make their way into the workplace, provide medical care in hospitals, operate on public highways, and serve us in our homes, hotels, and stores, they will be involved in incidents that result in personal injuries and other harms,” the report states.
What's less clear is how the legal system will assign fault when compensating people who are injured.
An early—and terrifying—case shows how difficult it may be.
Last year, I wrote about Wanda Holbrook, a Michigan woman who worked in an auto parts factory. She was killed by a robot that inexplicably left its section and came into hers, where it “hit and crushed [her] head between a hitch assembly.”
Why did it do that? No one seems to know.
On behalf of her estate, her husband sued the five companies that designed, built and monitored the robot. (He did not sue her employer, Ventra Ionia—Michigan law requires plaintiffs to prove there was an intentional effort by the employer to harm the worker, which no one suggests happened here.)
More than a year later, the case in U.S. District Court for the Western District of Michigan hasn't gotten far. In motions to dismiss, the defendants all (unsurprisingly) claim that they did nothing wrong, that Holbrook was negligent, and that if there was a problem with the robot, it was someone else's fault.
There's another line of defense as well: “At the time of the alleged injuries, [auto parts maker] Flex-N-Gate did not have control over the subject products,” wrote Pepper Hamilton of counsel James VandeWyngearde.
That may increasingly be a point of contention. As the Chamber report notes, “Robots imbued with AI will have functionality far beyond that of automated equipment and machines. They will move and act autonomously, make decisions and learn from experience, and grow in capability beyond their initial programming.”
The report continues, “In the future, a key overriding issue with respect to robotics and AI will be whether a designer's or manufacturer's conduct can continue to be evaluated under product liability principles when a product is learning and changing after its sale.”
Manufacturing defects are subject to strict liability—but that's problematic if the product goes on to develop its own unique behavior.
“Whether a product has a manufacturing flaw is evaluated based on its condition at the time of sale. This would preclude a manufacturing defect claim when an AI product was manufactured to design specifications but later changed,” the report states.
But that doesn't mean robot makers would be off the hook. The key question instead could be negligence—was the product's action reasonably foreseeable?
The report also contemplates treating robots akin to employees. For example, if a drone delivering a pizza crashes into something, the pizza restaurant might be liable, just as it would be if a human driver got into an accident.
Or perhaps the law as it applies to pets would be a good fit. As in, you've got a general duty to stop your dog from biting anyone, but if the dog never before showed signs of being vicious, or was provoked, or you posted a “Beware of dog” sign, liability can be more nuanced.
Also, there are laws against animal cruelty—which means pets have some rights.
This approach, the report suggests, “might appropriately balance owner responsibility, robot unpredictability, the level of risk of the particular robot based on its function, and the conduct of the person who was injured. It also opens the door to providing legal protections for AI entities, when warranted.”
This strikes me as vaguely disturbing. Maybe it's because I've seen The Terminator movies too many times. (“Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug. Skynet fights back.”) Maybe it's Elon “AI-poses-a-fundamental-risk-to-the-existence-of-human-civilization” Musk getting to me.
But the Chamber of Commerce emerges as an unexpected champion of robot rights—though not in a Bicentennial Man/Battlestar Galactica “free the robots” kind of way.
The Chamber takes a strictly pragmatic view. After all, corporations are “persons” under the law, and that suits the group just fine. So perhaps robots should be able to enter into contracts and even own intellectual property in the software code, art, music or books they create.
As the report notes, corporate owners “benefit financially from the corporation's intellectual property and other property rights. Like corporations, which possess legal rights, it seems likely that AI entities with property and other legal rights will also be subject to ownership, and that their owners will also be the ultimate financial beneficiaries.”