The Thorny Issue of AI Use by Law Firms – Being a Luddite vs. Being Held Liable
What are the thorniest issues when it comes to the use of AI by lawyers?
April 21, 2020 at 06:07 AM
That lawyers are generally hostile to technology has become something of a tired line. While the legal industry may not always deserve such a reputation, it is true that the law tends to move slowly, making rules for technological advances after they have happened.
When it comes to AI, there are many important questions that are yet to receive considered answers from the courts or Parliament. For lawyers deciding whether or not to embrace AI, this poses a dilemma: do you risk becoming a Luddite, or becoming liable for the consequences of the new technology?
While there are unfortunately more questions than answers in this area, being at least aware of those questions can help practitioners tease out the risks and factors they should be considering in deciding whether and when to deploy new technology.
Take the example of predictive coding software which, in the context of technology assisted review, has become increasingly effective at identifying relevant documents in a disclosure set. This technology has proved so effective that the CPR Disclosure Pilot more or less obliges litigants to discuss using it when the disclosure process will include searches.
This technology moves beyond the traditional 'if/then' logic of computer programming and instead revises its own logic as it 'learns' during the disclosure process from 'training' by a lawyer who is also reviewing some of the documents.
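To make that contrast concrete, the minimal sketch below sets a fixed keyword rule (the 'if/then' approach) against a classifier trained on a lawyer's relevance calls. It uses scikit-learn and invented documents, keywords and labels purely as a stand-in for commercial predictive coding tools, which are far more sophisticated; but the division of responsibility it illustrates, developer-written rules on one side and lawyer-supplied training on the other, is the one that drives the questions discussed below.

```python
# Illustrative sketch only: scikit-learn stands in for commercial
# predictive-coding tools; documents, keywords and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

documents = [
    "Board minutes approving the disputed share transfer",
    "Email chain discussing the indemnity cap in the SPA",
    "Canteen menu for the week commencing 3 June",
    "Draft witness statement on the warranty claim",
    "IT ticket about a broken printer on the fourth floor",
    "Letter before action regarding breach of the shareholders' agreement",
]

# (ii) Traditional 'if/then' logic: relevance is fixed by rules the
# developer wrote in advance and never changes during the review.
KEYWORDS = {"share", "indemnity", "warranty", "shareholders"}

def rule_based_relevant(doc: str) -> bool:
    return any(word.lower().strip(",.'") in KEYWORDS for word in doc.split())

# (iii) Predictive coding: the model's notion of relevance is shaped by
# the relevance labels a reviewing lawyer supplies during training.
lawyer_labels = [1, 1, 0, 1, 0, 1]  # 1 = coded relevant by the reviewer

vectoriser = TfidfVectorizer()
features = vectoriser.fit_transform(documents)
model = LogisticRegression().fit(features, lawyer_labels)

new_doc = ["Memo on the share transfer and the printer replacement budget"]
print("Rule-based:", rule_based_relevant(new_doc[0]))
print("Trained model:", int(model.predict(vectoriser.transform(new_doc))[0]))
```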
Now consider a scenario where a loss has been caused by a negligent failure to pick up on a key document during the disclosure process, and that failure is, as a matter of fact, solely attributable to (i) a lawyer, (ii) software operating on traditional 'if/then' logic, or (iii) more advanced software trained by the lawyer and then applied to the disclosure.
In circumstance (i), it is clearly the lawyer who is at fault. In circumstance (ii), it is the software developer, who created a defective program and sold it as a bespoke solution to these sorts of disclosure problems.
However, liability in circumstance (iii) is more complex and raises a number of questions:
- As a matter of principle, can it be said that the loss was caused by the developer who created the software which incorporated the capacity to develop reactively during the review process? Or was it caused by the lawyer who developed that same capacity by training the software? Should it differ depending on the facts of a particular case?
- As a matter of fact, how can you prove whose fault the failure is? It may be impossible to say how or why the software missed the particular document. In these circumstances, can you rely on the general range of accuracy reasonably expected from the software to show a culpable failure in the particular circumstances? Where should the burden of proof lie in these circumstances?
- It is likely that an alleged failure of this type of software cannot easily be explained in the conventional terms used for assessing mechanical fault (if X happens, Y should properly then happen) or human error (in these circumstances, a reasonable person would do Z). How then do you assess liability? Do you compare the software to a hypothetical reasonable human lawyer? What would this mean for justifications for the use of such software due to its improved efficiency over human review? Alternatively, do you compare it to another program? Would this mean that only the market-leading option can escape liability?
- How can the duty to exercise reasonable care to the standard of a reasonably competent solicitor be discharged in this context? Will it be sufficient as a defence to rely on statistics showing a particular software's general rates of accuracy versus a traditional human review (of the kind sketched after this list)? Or will each case involve a forensic examination of the human training inputs to try to achieve some 'best guess' as to whether it was the lawyer's defective training of the software that led to the failure? Should the Disclosure Pilot's tacit encouragement of such technology play any part in assessing liability? At the very least, a comprehensive written advice documenting all of these considerations should offer some protection for law firms.
- From a more practical perspective, how might insurance policies cover (or refuse to cover) liability in this context? This is a conversation that might be had with a provider before any issues have arisen.
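As a rough illustration of the kind of accuracy statistic such a defence might rely on, the sketch below estimates recall and precision from a hypothetical hand-coded control sample. The counts, the sample size and the validation design are all invented, and real validation protocols are considerably more rigorous, but the headline figures take broadly this form.

```python
# Illustrative sketch: recall and precision estimated from a control
# sample that a lawyer has coded by hand. All counts are invented.

def recall_and_precision(true_positives: int, false_negatives: int,
                         false_positives: int) -> tuple[float, float]:
    """Recall: share of relevant documents the software found.
    Precision: share of documents flagged by the software that were relevant."""
    recall = true_positives / (true_positives + false_negatives)
    precision = true_positives / (true_positives + false_positives)
    return recall, precision

# Hypothetical control sample of 2,000 hand-reviewed documents:
# 380 relevant documents found by the software, 20 missed, 150 wrongly flagged.
r, p = recall_and_precision(true_positives=380, false_negatives=20,
                            false_positives=150)
print(f"Estimated recall: {r:.1%}, precision: {p:.1%}")  # 95.0%, 71.7%
```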
These are just a few examples of the complex issues one particular application of AI might raise. Unfortunately, given that there are as yet no clear answers, any disputes arising from such technology are likely to be protracted, with lawyers and developers seeking to blame one another.
While we are still waiting for the law to be clarified on these issues, keeping them at the forefront of one's mind when considering or deploying such technology will at least help firms stay alive to the potential risks involved and identify common-sense ways to minimise them in their particular circumstances.
Sinead O'Callaghan, Michael Cumming-Bruce and Andrew Flynn are partner, senior associate and associate respectively within the partnership disputes team of CYK.