As public discourse turns to the role private companies play in police surveillance, tech companies are limiting or abandoning sales of their facial recognition software.

Last month, Amazon announced a one-year moratorium on selling its facial recognition tech to police agencies. IBM stopped selling its general-purpose facial recognition and analysis software, and Microsoft declared it will not sell facial recognition technology to U.S. police departments until there is a national law "grounded in human rights."

Amazon wrote that the one-year pause "might give Congress enough time to implement appropriate rules," while IBM echoed a similar call for a "national dialogue" on whether and how that technology should be employed by domestic law enforcement.

To be sure, the tech industry has lobbied diligently to shape national and state tech laws. But as a broader discussion of police reform and tech surveillance develops, software companies and their lawyers say they are open to providing more transparency, though they argue regulatory burdens should match the technology's risk.

"I think a one-size-fits-all approach is not going to be practical," said Andrew Burt, chief legal officer of automated data governance platform Immuta and managing partner of boutique law firm bnh.ai. "I think you need to build in some flexibility so that the riskiest applications of this technology end up having the highest compliance burden. The less risky don't."

Former Microsoft chief privacy officer and Hintze Law partner Mike Hintze said the growing number of local governments banning facial recognition tech signals that law enforcement use of the software must meet a higher threshold than private-sector use, a distinction most tech companies would accept in a federal law.

"Focusing on a distinction of state use and private-sector usage I think companies are looking toward because the issues of government use, the police, ICE or other law enforcement agencies using this is very significant for obvious reasons," Hintze said. "With private-sector use when a company may use facial recognition to easily grant employees access, it's a different degree of use. Legislators should not lump all potential uses of facial recognition together, and think about the harms and the appropriate solution."

Still, whether facial recognition software is used in minor or consequential decisions, Burt said a national law should require auditing and oversight mechanisms to foster accountability.

"Just setting some basic standards for ensuring transparency. What are they doing to ensure monitoring and are they allowing third parties to come in and inject some kind of outside expertise," he said.

Hintze noted that Washington state recently passed a law regulating how law enforcement uses facial recognition software, including requiring a warrant before use and accuracy testing of the software. Most tech companies would agree to those transparency terms if implemented nationally, Hintze said.

However, one regulation tech companies don't want to see repeated on a national scale is the Illinois Biometric Information Privacy Act, Hintze added. BIPA requires companies to obtain consent before collecting an Illinois resident's biometrics and to disclose their data use and retention policies. Residents whose BIPA rights are violated have a private right of action.

"Illinois is the strictest. I think a lot of folks in the private sector feel they go too far with the strict consent," Hintze said. "And the combination of the private right of action has led to a lot of legal actions."

He added, "Companies are fine with putting out information about how their technologies are used and employed, but when you go as far as having explicit consent [for every use], it will create a lot of unworkable barriers."