A proposal issued in a recent report by the Australian Competition and Consumer Commission (ACCC) called for a new government regulator—the Digital Platforms Branch—that would review Google and Facebook algorithms in order to gain better insight into their business practices and identify any potential anti-competitive behavior.

It’s a move whose implications may stretch well beyond Australia’s borders, especially as concerns continue to be raised over biases inherent in the algorithms powering AI applications. However, a regulatory review of algorithms is not without its complications, and for some, the reward of uncovering algorithms might not be large enough to offset the legwork involved and the potentially unnecessary risk to both regulators and the tech companies they oversee.

For starters, tech platforms are unlikely to hand over information that might constitute trade secrets without a fight, or at least without substantial protections put in place. The notion isn’t entirely without precedent in the U.S., where the government has the ability to request proprietary information from companies in the environmental or medical device realms, for example.

“But in doing so, a company always evaluates how to disclose this information to the government in a manner that is most protective and would not result in the government in turn releasing it publicly,” said Myriah Jaworski, a member at Beckage.

Some of the terms governing that information, such as its handling, storage or retention period, are negotiated on a case-by-case basis. Paige Boshell, a managing member at Privacy Counsel, suggested that when it comes to tech companies and their algorithms, the information may not be permitted to leave corporate headquarters, requiring regulators interested in taking a peek under the hood to come to the source.

Those conditions could be more favorable to a government seeking to review an algorithm, sparing it the responsibility of securing that proprietary information within its own systems.

“You hear in the media every day about our local city and state governments and the federal government, the challenges in cybersecurity that they are facing with older legacy systems,” Boshell said.

Still, the burden for regulators wouldn’t necessarily begin and end with cybersecurity. Jaworski pointed out that a platform like Facebook typically changes its algorithms at least once a year, if not more often. Consistently reviewing the algorithms flowing from a handful of the larger tech companies is one thing, but if a government were to expand that review to companies dabbling in AI, for example, the practice could turn cumbersome.

So what’s the alternative? Citing the cybersecurity standards set by various states as a model, Boshell could envision a paradigm for algorithms driven by predetermined guidelines rather than review.

“One of the reasons that we see that in cybersecurity is that the tech is so completely evolving faster than the laws. And arguably the same is true of data science,” she said.

Even then, Boshell noted that if scientists were to consult with regulators on a set of standards for algorithms, the same evolutionary speed could render those standards obsolete within six months.

But Jaworski thinks that companies would sooner engage voluntarily with guidelines than disclose proprietary algorithms.

“I’m not even sure on a higher level that we need to be going towards algorithm disclosure. … While [algorithms] can have some unintended harmful consequences, the way you can mitigate those consequences can be through the use of guidelines and principles rather than government approval of the algorithm[s] themselves,” Jaworski said.