Risky Business: Should Governments Be Reviewing Tech Companies' Algorithms?
There's precedent for governments obtaining trade secrets. But in the case of tech platforms and their algorithms, there could be easier ways to keep companies from hiding potentially troublesome activity.
August 06, 2019 at 11:30 AM
A proposal issued in a recent report by the Australian Competition and Consumer Commission (ACCC) called for a new government regulator—the Digital Platforms Branch—that would review Google and Facebook algorithms in order to gain better insight into their business practices and identify any potential anti-competitive behavior.
It’s a move whose implications may stretch well beyond Australian borders, especially as concerns continue to mount over biases inherent in the algorithms powering AI applications. However, a regulatory review of algorithms is not without its complications, and for some, the reward of scrutinizing algorithms might not be large enough to offset the legwork involved and the potentially unnecessary risk to both regulators and the tech companies they oversee.
For starters, tech platforms are unlikely to want to hand over information that might constitute trade secrets without a fight, or at least without some substantial protections in place. The notion isn’t entirely without precedent in the U.S., where the government has the ability to request proprietary information from companies in the environmental or medical device realms, for example.
“But in doing so, a company always evaluates how to disclose this information to the government in a manner that is most protective and would not result in the government in turn releasing it publicly,” said Myriah Jaworski, a member at Beckage.
Some of the terms governing that information, such as its handling, storage or retention period, are negotiated on a case-by-case basis. Paige Boshell, a managing member at Privacy Counsel, suggested that when it comes to tech companies and their algorithms, the information may not be permitted to leave corporate headquarters, requiring regulators interested in taking a peek under the hood to come to the source.
Those conditions could actually be more favorable to a government seeking to review an algorithm, since it would not have to take responsibility for securing that proprietary information within its own systems.
“You hear in the media every day about our local city and state governments and the federal government, the challenges in cybersecurity that they are facing with older legacy systems,” Boshell said.
Still, the burden for regulators wouldn’t necessarily begin and end with cybersecurity. Jaworski pointed out that a platform like Facebook typically changes its algorithms at least once a year, if not more often. Consistently reviewing algorithms from a handful of the larger tech companies is one thing, but if a government were to expand that scrutiny to the many companies dabbling in AI, for example, the practice could quickly turn cumbersome.
So what’s the alternative? Citing the cybersecurity standards adopted by various states as a model, Boshell could envision a paradigm for algorithms driven by predetermined guidelines rather than direct review.
“One of the reasons that we see that in cybersecurity is that the tech is so completely evolving faster than the laws. And arguably the same is true of data science,” she said.
Even then, Boshell noted that if scientists were to consult with regulators on a set of standards for algorithms, that same pace of evolution could render the standards obsolete within six months.
But Jaworski thinks that companies would voluntarily engage with guidelines rather than disclose proprietary algorithms.
“I’m not even sure on a higher level that we need to be going towards algorithm disclosure. … While [algorithms] can have some unintended harmful consequences, the way you can mitigate those consequences can be through the use of guidelines and principles rather than government approval of the algorithm themselves,” Jaworski said.