How to Balance Privacy Compliance and AI Implementation
Once data does become subject to data privacy laws, compliance will fall on businesses to accurately map, monitor and understand how data is collected, used, shared and retained.
January 09, 2020 at 11:56 AM
6 minute read
Artificial intelligence and its use to analyze large, traditionally unwieldy sets of data has revolutionized the ways in which data is used. Privacy and data protection regulations and concerns have risen in tandem. These regulations typically cover personal information (e.g., name, address, Social Security number) and provide individuals with numerous rights related to its collection, use and access. For example, prominent data privacy laws such as Europe's General Data Protection Regulation and the California Consumer Privacy Act of 2018 regulate "any information relating to an identified or identifiable natural person" and "information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household," respectively.
As the power of AI increases, the amount and type of data that falls into these categories will increase. Where current AI technology may identify diseases in an MRI scan, future AI technology may fully identify a user. Notably, patient health records have already been at issue in data privacy fines for lax protection. Innovative uses for AI technology may likewise create data privacy compliance issues. For example, the Spanish soccer league, LaLiga, was recently fined for its use of location data and speech recognition technology aimed at preventing piracy.
Furthermore, once data does become subject to data privacy laws, compliance will fall on businesses to accurately map, monitor and understand how data is collected, used, shared and retained. To maintain customer trust in AI, comply with the myriad privacy and data protection regulations, and add value for customers through the collection and use of their data, businesses should consider the following:
Take an Inventory of AI Practices
Businesses should understand what, where and how AI is employed. For example, does the customer service team use a chatbot powered by AI? Does the finance team use AI for fraud detection? Does the security team use AI to predict and manage incidents? Documenting these AI practices, and considering whether to perform Privacy Impact Assessments or other compliance reviews to confirm that adequate controls are in place and existing governing policies are met, is the first step to compliance.
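Capturing this inventory in a structured, reviewable form makes it easier to spot systems that still need a Privacy Impact Assessment or other review. The Python sketch below is one illustrative way to record AI uses; the field names, example systems and retention periods are assumptions for illustration, not a prescribed schema.

```python
# Minimal sketch of an AI-use inventory record. Field names and example
# entries are illustrative assumptions, not a mandated compliance schema.
from dataclasses import dataclass, field

@dataclass
class AIUseRecord:
    system: str                     # e.g., "customer service chatbot"
    owner_team: str                 # team accountable for the system
    purpose: str                    # why the AI is deployed
    personal_data: list[str] = field(default_factory=list)  # categories processed
    pia_completed: bool = False     # Privacy Impact Assessment done?
    retention: str = "unspecified"  # how long inputs/outputs are kept

inventory = [
    AIUseRecord("support chatbot", "Customer Service",
                "automate first-line responses",
                personal_data=["name", "email", "chat transcripts"],
                pia_completed=True, retention="90 days"),
    AIUseRecord("fraud detection model", "Finance",
                "flag anomalous transactions",
                personal_data=["transaction history"], pia_completed=False),
]

# Surface systems that still need a compliance review.
for record in inventory:
    if not record.pia_completed:
        print(f"PIA outstanding: {record.system} ({record.owner_team})")
```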
Review the Business Strategy for AI
Businesses should also understand the motivation for deploying AI. For example, who are the stakeholders developing the business strategies for AI? How does AI add value to service and product offerings, and how is that value communicated to consumers? How does the business's AI strategy align with company culture, with statements made by its executive team and with public-facing documents such as financial disclosures and privacy statements? Notably, even tech giant Google was fined for lack of transparency with regard to its personalized advertising.
Develop Governance and Policies for AI
With an understanding of where AI is used and the business strategy behind it, businesses should consider leveraging existing governing policies for privacy, security and confidentiality guidelines. Several data protection principles should also be observed when developing AI: fairness of processing, purpose limitation, data minimization and transparency/right to information. For example, data privacy regulations around the world commonly give consumers the right to access collected data, understand how it is used, restrict how it is processed and have it removed on request. Privacy laws also raise challenges with respect to consent, including the scope of use of personal data for ongoing and future training of algorithms and for product development and validation.
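To make those consumer-rights principles concrete, the sketch below routes hypothetical access, restriction and deletion requests against a small in-memory data store. The store layout, request names and helper function are illustrative assumptions, not the terminology or requirements of any particular statute.

```python
# Minimal sketch of honoring access, restriction and deletion requests against
# a hypothetical in-memory store; layout and request names are assumptions.
import json

user_data = {
    "user-123": {
        "email": "[email protected]",
        "purchase_history": ["order-1", "order-2"],
        "processing_restricted": False,
    }
}

def handle_request(user_id: str, request_type: str) -> str:
    record = user_data.get(user_id)
    if record is None:
        return "no data held for this user"
    if request_type == "access":
        return json.dumps(record, indent=2)       # right to access collected data
    if request_type == "restrict":
        record["processing_restricted"] = True    # right to restrict processing
        return "processing restricted"
    if request_type == "delete":
        del user_data[user_id]                    # right to removal on request
        return "data deleted"
    raise ValueError(f"unknown request type: {request_type}")

print(handle_request("user-123", "access"))
print(handle_request("user-123", "delete"))
```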
Consider Ethics and Bias
Businesses may wish to publish AI or Data Usage Ethics Principles, as many companies (e.g., Microsoft) have done. These principles often go beyond what is legally compliant and impose obligations related to ethics: what is the right thing to do with data, and what are the customer expectations? In addition, policymakers have expressed increasing concern about the implications of AI for fairness, bias and discrimination. Businesses are well advised to confirm that their policies and processes include controls for rooting out bias and discrimination in their algorithms.
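One simple, illustrative control is to compare outcome rates across groups in a model's decisions. The sketch below computes a demographic parity gap on hypothetical decision data; real bias audits rely on richer metrics, real data and domain review, so this is only a starting point.

```python
# Illustrative bias check: demographic parity difference on hypothetical
# lending decisions. The (group, approved) pairs below are made-up data.
from collections import defaultdict

decisions = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]

totals = defaultdict(int)
approvals = defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    approvals[group] += approved

rates = {g: approvals[g] / totals[g] for g in totals}
gap = max(rates.values()) - min(rates.values())
print(rates)                                  # approval rate per group
print(f"demographic parity gap: {gap:.2f}")   # large gaps warrant investigation
```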
While consumers value the privacy and security of their personal data, the potential of AI innovation to improve products and services is undeniable. Businesses cannot ignore novel implementations of AI technology because of data privacy concerns, but neither can they ignore data privacy issues in the pursuit of AI innovation. As in intellectual property law, where AI has unsettled traditional concepts such as inventorship and protectability, and competition law, where AI has raised fears about anti-competitive practices such as collusion and parallel pricing, businesses must understand that complying with data privacy laws when introducing a new AI feature is as much a part of the innovation as the feature itself. In many ways the next technological revolution will be one based on data, and the businesses that can best use that data in a legal, private and secure manner will be the winners.
Alexandra Ross is director, Global Privacy and Data Security Counsel at Autodesk Inc., a leader in 3D design, engineering and entertainment software. Previously she was senior counsel at Paragon Legal and associate general counsel for Wal-Mart Stores. She is a certified information privacy professional (CIPP/US, CIPP/E, CIPM, CIPT, FIP and PLS) and practices in San Francisco. She holds a law degree from Hastings College of Law and a B.S. in theater from Northwestern University. She is a recipient of the 2019 Bay Area Corporate Counsel Award-Privacy. She also launched The Privacy Guru blog in January of 2014 and has published an ebook Privacy for Humans (available on Amazon and iTunes).
Drew J. Schulte is counsel at Pillsbury Winthrop Shaw Pittman. As an attorney admitted to practice before both the U.S. and European patent offices, he brings a multijurisdictional approach to serving his clients' needs. With a tech background, specifically in intellectual property and data privacy, Drew prides himself on being equipped to handle the full spectrum of his clients' needs—whenever and wherever they may arise.