'The Front Line of Regulating AI': Manatt's Brandon Reilly on CPPA's Move to Adopt New Data Broker and AI Rules
November 15, 2024 at 11:03 AM
What You Need to Know
- On Nov. 8, the California Privacy Protection Agency Board voted to adopt new rules for data brokers and advance a raft of proposed rules for privacy risk assessments, cybersecurity audits and the regulation of artificial intelligence technologies such as automated decision-making technology.
- The new data broker regulations seek to clarify provisions in the Delete Act, which requires data brokers to register with the CPPA, and the proposed rulemaking package aims to mandate cybersecurity audits and privacy risk assessments for certain businesses and grant consumers the right to access and opt out of businesses' use of ADMT.
- Brandon Reilly of Manatt, Phelps & Phillips shared his insights on the potential impact of the new regulations on future litigation.
On Nov. 8, the California Privacy Protection Agency Board voted to adopt new rules for data brokers, or businesses that collect and sell personal consumer data to third parties, and advance a raft of proposed rules for privacy risk assessments, cybersecurity audits and the regulation of artificial intelligence technologies such as automated decision-making technology.
ADMT, according to the CPPA, refers to “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decision-making.” ADMT can be used by businesses to engage in the practice of "profiling," or evaluating and predicting consumer traits such as their work performance, health or economic situation based on their personal data.
The CPPA, which first drafted these regulations a year ago, was established in November 2020 to enforce the California Consumer Privacy Act and the Delete Act, which awards California residents the right to request that data brokers delete their personal information. The new data broker regulations seek to clarify provisions in the Delete Act, which requires data brokers to register with the CPPA, while the proposed rulemaking package aims to mandate cybersecurity audits and privacy risk assessments for certain businesses and grant consumers the right to access and opt out of businesses' use of ADMT.
The data broker regulations, pending approval by the Office of Administrative Law, will go into effect on Jan. 1, 2025, and the proposed rulemaking package on AI, cybersecurity and privacy will move to a 45-day formal public comment period following the CPPA's vote.
Brandon Reilly, the leader of the privacy and data security practice at Manatt, Phelps & Phillips, spoke with The Recorder this week to discuss his takeaways from the board's decision and the potential impact of the proposed regulations on future litigation in the AI, privacy and cybersecurity spaces.
The following has been edited for length and clarity.
How do these proposed rules depart from or build on previous iterations of AI and cybersecurity regulations, for example, the sweeping AI regulation bill in California, Senate Bill 1047, that Gov. Gavin Newsom vetoed in September?
That bill was focused really directly on developers. So you can think of a lot of actual and proposed AI regulations as focused on one of two audiences. … The first would be a developer, so a company that develops an AI model or a platform. And the second is generally called deployers, so any business or organization that looks to deploy an AI system or to implement it in their existing products or services. So the bill that the governor vetoed was intended to directly regulate developers. And, for that reason, I think there was a lot more focus on the impact on innovation, because so many of those developers are U.S. firms and many of them are California-based firms.
The CCPA's general applicability is on any business, any for-profit business that collects personal information from California residents. So that really is the universe of entities, which is quite large, that can be regulated by these [rules]. … So it is, in some ways, more targeted because it's within the universal CCPA regulation, but in many ways it's actually more expansive than the bill that was just vetoed, because that was really on firms that were developing AI models.
How do you foresee these regulations potentially affecting litigation in this space?
The litigation is going to follow the trajectory of litigation under the CCPA in general, which is actually somewhat limited, and that's because there is no general private right of action under the CCPA; I should say that there's a limited private right of action that's focused only on data breaches. Everything else in the CCPA, which is just an incredibly extensive and comprehensive list of data privacy-related requirements—everything from privacy notices to data subject rights, like the right to delete, to regulation in terms of advertising and data sales—all of those obligations, as they exist now, do not have a private right of action. So a violation of those requirements is solely enforceable by the CPPA and also the California Office of the Attorney General.
The data breach provision, which does authorize litigation directly … has been tested recently. Originally, it was only used in the context of the traditional data breach. And we are seeing plaintiffs start to more creatively attempt to use that private right of action so that it applies to what we would consider more of a privacy violation—things like allegedly unauthorized website tracking. … So the result is that these regulations, as well, would not have a private right of action. If you violate them, [they're] not going to be subject to a direct right of action. The way that it can become relevant in litigation is you do see litigants assert a claim under California Business Code Section 17200 [the Unfair Competition Law], because that authorizes claims for underlying violations of the law, even if they don't have a private right of action. … That may well happen with these regulations.
I would say another piece of this, and I think this is one of the most impactful aspects of these proposed regulations, is that they're asking regulated businesses to file abridged reports of the impact assessments that they're due … and then, on request, businesses must provide the full, unabridged impact assessment. And so what that means is that, of course, it's going to increase visibility, significantly, of the regulator into each regulated business's data practices. It's also possible that these reports could be discovered in civil discovery or subject to a subpoena. And that's certainly what the most creative plaintiffs … would do.
How are you advising clients to proceed in the wake of these regulations being approved?
We do have the benefit of time, because there is still this formal rulemaking process. … In response to the public comment period, there may be further changes to those regulations. And the dialogue among the board members on Friday suggests that there will be some revisions. … So that needs to take place. There are financial impact reviews that need to happen.
California has a robust regulatory procedure law in place that this has to go through. So there is some time. But it's never too early for companies to start thinking about this. In fact, what a lot of companies are facing already are similar [international requirements, for example, the privacy impact assessments required by the European Union's General Data Protection Regulation].
So for those international businesses … some of these requirements are going to be familiar to them. They're not the same thing, but we are starting to work with regulated businesses on thinking about how this might change coexisting governance processes that they have in place to meet those international regulations. I'd also note that there are … state regulations, as well, in other states beyond California that require these privacy impact assessments. The difference is that in those statutes, the impact assessments are not a question. So it's really a bare-bones obligation. …
So what we're working with companies on, and will be unveiling in the next several months, are products that will help them with their governance solutions. … Many companies that have to roll this out at scale benefit from having a tried-and-true, set [procedure]. And that's what they're starting to build.
What other takeaways do you have from the board's decision?
I think … the public reporting aspect of this is quite new, and it's the kind of requirement that is going to really catch the attention of risk officers, compliance officers, legal officers simply because it requires that certification of compliance. It requires the disclosure of a certain level of information. …
I think, too, the other headline here is that privacy laws continue to be the front line of regulating AI. And that's not to say that privacy law is the only way to regulate AI technologies, that's certainly not true. But for, I think, a variety of reasons, existing state privacy laws have really facilitated and enabled states to regulate AI systems somewhat quickly. And I think that's the case for a few reasons. I think one is that these laws are all relatively new anyway, and so [they have] the attention of lawmakers and regulators.
But two, a lot of the risk controls that exist for privacy in general—so things like those … impact assessments … are easily translatable to attempts to regulate AI technology. And so I think what we're seeing is regulatory bodies and statutory frameworks that are set up to regulate data generally have proven to be an attractive model for AI regulation.
© 2024 ALM Global, LLC, All Rights Reserved.