While more developers may be considering the wider social and legal implications of their technology, the question of "should it be built" still often takes a back seat to "build it, and we'll figure it out later."

Case in point: a new facial recognition app called Clearview. The app, which according to The New York Times was co-founded by an Australian developer and a former aide to Rudy Giuliani and funded in part by venture capitalist Peter Thiel, may be pushing the boundaries of "big brother" technology.

Clearview essentially works by matching an uploaded picture to a database of over 3 billion public photos scraped from a host of social media sites, including Facebook, YouTube and Venmo—making it one of the biggest known facial recognition databases around. Clearview, however, is only available to law enforcement agencies across the U.S., with the company telling the Times that around 600 such agencies have used the app in the past year to identify potential criminals.

To be sure, Clearview's ability to match faces—even, as the Times notes, those that are partially obstructed—to a database of social media content isn't entirely novel.

Facebook employees, for instance, built a similar, since discontinued, facial recognition app for internal use around 2015 and 2016, according to Business Insider. And in 2011, then-Google executive chairman Eric Schmidt told the audience at a company conference that building a facial recognition database was possible, though due to privacy concerns, it was unlikely to be pursued.

Tech companies' hesitancy to build something similar to Clearview's repository may be tied not only to privacy concerns, but also potential legal liabilities a product like this could face. "Most likely there has been a risk assessment, and that risk assessment has concluded that engaging in this type of activity would be too much of a legal risk and a business risk," said Jarno Vanto, a partner at Crowell & Moring.


Evolving Case Law

But these legal risks largely depend on still-evolving case law and privacy regulations. Because the law hasn't yet caught up with a service as expansive and potentially intrusive as Clearview's, there's uncertainty around whether these risks would turn into real liabilities, or whether anything would preclude a similar service from taking shape.

To address the legal concerns about the app, Clearview hired Paul D. Clement, a Kirkland & Ellis partner and former U.S. solicitor general under President George W. Bush, the Times noted. A memo Clement wrote for customers argues that use of the app does not violate the Constitution or state privacy and biometrics laws. The memo was provided to potential law enforcement agency customers, including the Atlanta police department.

Requests to discuss the legal issues surrounding the app were not returned by Clearview or Clement by press time. Still, Kevin Coy, a partner in the privacy practice at Arnall Golden Gregory, noted the memo focuses only on the risks police departments face in using Clearview. "It's written for the law enforcement audience as for their ability to use the product. It doesn't really address Clearview's ability to offer the product or the potential legal liability Clearview may have."

Much of the liability facing Clearview will likely stem from how it created its database. Scraping photos from social media companies like Facebook violates their terms of service. While such scraping is widespread, Vanto noted "there are several legal bases" for countering it.

He pointed specifically to Southwest Airlines v. Roundpipe, where the U.S. District Court for the Northern District of Texas ruled in favor of Southwest Airlines' claims that website scraping by the defendant constituted a breach of contract and violated the Computer Fraud and Abuse Act (CFAA).

Still, Laura Jehl, the global head of McDermott Will & Emery's privacy and cybersecurity practice, noted that "courts have been kind of inconsistent" about applying the CFAA to scraping, "including the Ninth Circuit where most of the action has been." In HiQ Labs v. LinkedIn, for example, Jehl noted that the U.S. District Court for the Northern District of California ruled that "scraping publicly available content is not a violation of the CFAA." LinkedIn, however, has signaled it will ask the Supreme Court to review the ruling, and it is unclear whether courts would classify the data Clearview scrapes as "publicly available."

What's more, arguing for breach of contract may also be limited in its effectiveness. While a website could kick a user off for such a breach, it is difficult to prevent that user from coming back under different logins or identities. "They all know they're being scraped and don't have great recourse against it," Jehl said.

Of course, Clearview could also face copyright claims from social media users, who own the pictures they post and merely license them to the platform. But Jehl noted that "you don't often see a class action copyright case, but that's almost what you would need here." It would also be difficult, she added, for plaintiffs to argue and quantify sufficient damages.

Clearview's liability aside, there may also be questions over whether police departments' use of the app is legally sound. Jehl, for instance, said she didn't agree with some parts of the memo by Kirkland's Clement. She specifically singled out Clement's assertion that the Supreme Court's Carpenter ruling—which found that the collection of historical cell-site location information (CSLI) during a criminal investigation constitutes a search under the Fourth Amendment—is too narrowly construed to restrict police departments' use of Clearview.

"I don't think you can say the same court wouldn't have the same reaction to facial recognition technology that would allow anyone to identify who you are, and find out where you live, and where you work, and what you do just from having snapped your picture on the street," she said.


Getting Around Privacy Regulations

Whether a service like Clearview could face liability under privacy laws is another open question. Coy noted there could be "potential exposure under [state] biometric privacy laws" such as the Illinois Biometric Information Privacy Act (BIPA), though he added that BIPA doesn't expressly address screen scraping. Such an app would also likely fall under local biometrics laws, such as one in San Francisco that bans police departments from using facial recognition technology outright.

Of course, Clearview could limit its use to jurisdictions without such laws. But it may be hard to avoid more expansive state privacy laws like the California Consumer Privacy Act (CCPA) or ones with international reach like the EU's General Data Protection Regulation (GDPR). "Both the GDPR and the CCPA define personal information very broadly," Vanto said, explaining that "if a photograph can be used to identify a person, then by definition that photo is personal information."

However, if Clearview made sure not to collect or process data belonging to any EU citizen—a potentially difficult task—it may then only face a few requirements under California's law, which is far less expansive than the GDPR.

Jehl noted that Clearview "probably is a data broker under that CCPA companion law that got passed, because a data broker is a business that knowingly collects and sells to third parties personal information of the consumer with whom the business does not have a direct relationship." But so long as Clearview registers as a data broker, "I'm not sure what they would be in violation of," she added.

Still, Clearview wouldn't be entirely off the hook. Under the CCPA, it would need to honor data access and deletion requests from California residents, Vanto said.

To be sure, Clearview isn't the only law enforcement tool on the market. Many police departments have used AI and facial recognition technology for years to help identify those who have committed crimes. Veritone Identify, for example, uses AI to match video and photographic content to known offender databases. "We provide them with a list of possible matches that accelerate their investigatory process," said Jon Gacek, Veritone's head of government, legal and compliance.

He noted that when compared to Clearview, "We have taken a different approach to solve the same problem," adding that the approach Veritone takes "is very defensible." Ricciuti explained, "The major difference between us and Clearview is we build our database upon known offenders, not a public database of information grabbed from public websites."

Ricciuti also expressed concerns that the size of Clearview's database could limit the app's accuracy. "The bigger the database, the more false positives you get, and at some point you get to a tipping point of lack of usefulness." In Clearview's customer FAQ, obtained by the Times, the company states its app has a 98.6% accuracy rate.

Whether any accuracy issues or legal liabilities materialize for Clearview remains to be seen. But one thing is certain: The courts will eventually need to deal with the issues posed by the app and its expansive database.

Discussing why this technology hasn't come out sooner, Jehl noted that "you're creating a genie you can't get back into." That genie is out now, and what comes next is anyone's guess.