AI Is Rising, and Governments Are Starting to React
Recent regulations and bills are the early stages of a reaction to larger trends—each brought about by the increasing adoption of a grab bag of technologies commonly labelled 'AI.'
February 28, 2018 at 08:00 AM
Ever since tech mogul Elon Musk told the National Governors Association last summer that artificial intelligence, or AI, must be regulated, lawmakers and lawyers have been focused on what it would actually mean to regulate AI.
Indeed, when searching for statutes that deal with machine learning or artificial intelligence, it's striking how few provisions address the issue. “Artificial intelligence” and “machine learning” appear just five times in the United States Code and just four times in the Code of Federal Regulations. Various states have statutory approaches to artificial intelligence, but there are few, if any, substantive statutes dealing directly with these issues.
This is not, of course, to suggest that there is no regulation of algorithms. In the U.S., certain regulatory bodies have long been thinking about regulating the use of algorithms. Financial regulators, for example, have been overseeing the use of algorithms and automated decisions at financial firms for at least a decade. Spurred in part by the faulty modeling assumptions that contributed to the financial crisis of 2008 and 2009, the Federal Reserve Board and the Office of the Comptroller of the Currency issued SR 11-7, which requires financial institutions to keep track of the internal models they use.
Meanwhile, the European Union, in implementing international capital requirements mandated by Basel III, included provisions similar to SR 11-7 in its regulations. In other areas, the Food and Drug Administration is developing regulations of machine learning algorithms radiologists use to assist in diagnosing diseases.
But of all efforts to address the rising impact of AI, the most wide-reaching statute regulating the use of algorithms is the EU's General Data Protection Regulation, or GDPR. The GDPR directly regulates the use of algorithms applied to personal data in the EU and will begin to be enforced in May of this year. With fines of up to 4 percent of the violator's parent company's global revenue, the penalties for noncompliance can be quite significant.
While regulators are still working out the details of its implementation, the GDPR appears to create a presumption that applying algorithms to personal data is unlawful, except in certain circumstances. The exceptions are, by design, quite narrow, including the exception that allows for user consent. The regulation also creates several substantive rights, including the right to receive some form of explanation when an algorithm makes a decision with certain effects. Exactly what this explanation must entail is the subject of much debate, as scholars Andrew Selbst and Julia Powles recently noted.
Legislatures in the U.S. appear to be watching the EU approach closely, but they do not yet seem willing to put equally strict regulations on the books. There is a host of pending legislation at the state and federal levels, for example, and nearly all of it would create a commission or committee to study the issues and provide recommendations to the legislature. The charges of these commissions give a good indication of the range of issues legislatures are concerned about. In Congress, the recent bipartisan FUTURE of Artificial Intelligence Act of 2017 would create a committee to draft recommendations on how AI will affect the workforce, education, accountability to international regulations, and societal psychology, among other subjects.
Meanwhile, bills in Virginia and Pennsylvania direct the study of the economic impact of automating jobs that once required a human. A bill in Vermont requires a study of the ethical use of artificial intelligence. Bills in Alabama and Nevada would authorize the use of autonomous vehicles in certain scenarios. And a proposal in Florida contemplates taxing automated systems.
New York City recently enacted a bill calling for what is perhaps the most in-depth study of AI, requiring recommendations on issues such as bias that may work its way into algorithms. New York City's committee is charged with ensuring that individuals affected by automated decisions made by public bodies can receive further information regarding those decisions, among other tasks.
These are all early stages of a reaction to larger trends—each brought about by the increasing adoption of a grab bag of technologies commonly labelled “AI.” Cars, for example, are starting to drive without human assistance. Cell phones now process speech and perform tasks based on voice commands. In medicine, radiologists are using AI models to diagnose diseases.
At present, there remain a large number of questions about the law of AI, which will surely be the subject of further legislative debate and court review. But the technology is racing forward. It's only a matter of time until laws catch up.
Andrew Burt is chief privacy officer and legal engineer at Immuta, a data management platform for data science. Stuart Shirrell is a legal engineer at Immuta and a J.D. candidate at Yale Law School.