Facial recognition presents privacy concerns
A patchwork of laws regulates technology that mines faces for data
February 29, 2012 at 07:00 PM
In a scene in the 2002 movie “Minority Report,” set in 2054, Tom Cruise's character walks through a subway station lined with camera lenses and talking digital billboards. “John Anderton!” one of them calls out to him. “You could use a Guinness right about now.”
Such a setup will be technologically feasible much sooner than 2054. Already, digital billboards and signs scan faces to gather demographic information for marketing purposes. A wide array of companies, from Adidas to Whole Foods, has rolled out systems that predict gender and age and then deliver targeted ads. Intel says its AIM (Audience Impression Metrics) suite of software, which powers the Adidas and Whole Foods systems, can predict gender with 94 percent accuracy and age range with 90 percent accuracy.
Facebook, whose users upload around 200 million photos every day, boasts perhaps the widest reach of facial recognition software. Unlike facial detection software, which merely finds faces, facial recognition matches images to identities and tends to store data, presenting clearer privacy concerns. The social network uses the technology to recognize faces in photos and suggest who should be tagged in them.
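To make that distinction concrete, here is a minimal sketch using the open-source face_recognition library. It is an illustration only (Facebook's and Intel's systems are proprietary), and the file names are hypothetical.

```python
import face_recognition

# Load a frame from a camera feed (file name is hypothetical).
image = face_recognition.load_image_file("camera_frame.jpg")

# Facial DETECTION: finds that faces are present, and where,
# but attaches no identity to them.
locations = face_recognition.face_locations(image)
print(f"Detected {len(locations)} face(s)")

# Facial RECOGNITION: reduces each face to a stored numeric template
# and matches it against templates of known people, naming the face.
known_image = face_recognition.load_image_file("known_person.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

for encoding in face_recognition.face_encodings(image, known_face_locations=locations):
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    print("Match to known person" if match else "Unknown face")
```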
As databases, algorithms and camera lenses improve, facial recognition accuracy is growing. In facial recognition tests performed in controlled settings by the National Institute of Standards and Technology, false-negative rates fell from 79 percent in 1993 to 0.3 percent in 2010, roughly halving every two years.
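That trend can be sanity-checked with a line of arithmetic; the snippet below is a back-of-the-envelope sketch using only the figures quoted above, not NIST's methodology.

```python
# If a 79 percent false-negative rate halves every two years,
# project it forward 17 years, from 1993 to 2010.
rate_1993 = 79.0                     # false-negative rate in percent, 1993
years = 2010 - 1993
projected = rate_1993 * 0.5 ** (years / 2)
print(f"Projected 2010 rate: {projected:.2f}%")  # ~0.22%, in line with the ~0.3% measured
```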
And researchers are currently developing algorithms to identify facial expressions and moods, so it isn't a leap to imagine a future in which a personalized billboard could identify you.
Power of Information
In 2009, researchers at Carnegie Mellon University (CMU) started out with webcam images of people on campus and off-the-shelf facial recognition software. Using publicly available Facebook photos, they were able to match a third of the people in those images with their Facebook profiles, revealing personal information including names, dates of birth, workplaces, schools, friends, sexual orientation and interests. Then they took it a step further: Using that information, the researchers' algorithm predicted the first five digits of a person's Social Security number with 27 percent accuracy.
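In outline, the matching step works like a nearest-neighbor search over face templates. The sketch below, again using the open-source face_recognition library, shows the general pipeline; it is not the researchers' actual code (they used commercial off-the-shelf software), and all names and files are hypothetical.

```python
import face_recognition

# Gallery: publicly available profile photos, keyed by the profile they came from.
gallery_files = {"profile_a": "profile_a.jpg", "profile_b": "profile_b.jpg"}
known = {
    name: face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for name, path in gallery_files.items()
}

# Probe: an anonymous webcam capture.
probe_image = face_recognition.load_image_file("webcam_capture.jpg")
probe = face_recognition.face_encodings(probe_image)[0]

# The nearest gallery face below a distance threshold links the anonymous
# face to a profile, and with it to whatever that profile discloses.
names = list(known)
distances = face_recognition.face_distance([known[n] for n in names], probe)
best_distance, best_name = min(zip(distances, names))
if best_distance < 0.6:  # the library's default match tolerance
    print(f"Probable match: {best_name} (distance {best_distance:.2f})")
else:
    print("No match in gallery")
```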
Their point? When facial recognition, social networks, data mining and cloud computing converge, it's possible—even within the limitations of today's technology—to start with an anonymous face and end up with highly sensitive personal information.
“Privacy is much less about control of a person's information and much more about the control that others can have over you if they have sufficient information about you,” said Alessandro Acquisti, one of the CMU researchers, at a December 2011 Federal Trade Commission (FTC) forum on the development of facial recognition technology and its privacy concerns.
The possibilities that the CMU studies presented have many privacy-watchers concerned that, left unchecked, facial recognition technology will destroy anonymity in public spaces and raise privacy and security concerns over the collection, sharing and storage of data. When it comes to best addressing these risks, the FTC is poised to frame the conversation going forward.
“The nature of what we can capture about individuals is expanding faster than our ability to think about whether it's prudent to do so,” says Martin Abrams, president of the Centre for Information Policy Leadership at Hunton & Williams, a Washington, D.C. think tank.
Patchwork Privacy
Citing the difficulty of creating black-letter law for emerging and rapidly developing technologies, industries that use facial detection and recognition are pushing for self-regulation.
Facial recognition would likely be addressed in a federal omnibus data privacy bill, several of which have been introduced in Congress in recent years. But with more immediate problems looming, it will likely be years before one passes.
In its absence is a patchwork of state laws, such as those in Illinois and Texas that address the privacy of biometrics, and industry-specific laws that govern the use of personal information, such as the banking industry's Gramm-Leach-Bliley Act and the Fair Credit Reporting Act.
“There's nothing in any federal law right now that would prevent a party from taking a picture of someone in public and running it through facial recognition software,” says Craig Hoffman, a partner in Baker Hostetler's Privacy, Security & Social Media group.
The FTC's enforcement powers under Section 5 of the FTC Act allow it to bring complaints alleging unfair or deceptive trade practices against organizations that sign on to industry codes of conduct and don't abide by them. But companies that don't agree to such codes would effectively face no regulatory or legal risk.
“Most likely, what you would expect the FTC to say is what they're saying about the online advertising industry: For now, self-regulation is working, but we encourage you to create some industry codes of conduct that you follow and enforce,” Hoffman says.
The Digital Signage Federation has adopted privacy standards written by the Center for Democracy and Technology (CDT) that incorporate the eight Fair Information Practice Principles. They call for an opt-out (or in most cases, a walk-away) approach to facial detection and an informed opt-in approach to the thornier issue of facial recognition.
But as he described the standards at the FTC workshop, CDT Policy Counsel Harley Geiger conceded, “As everybody in the room, I'm sure, knows, self-regulation—it's very difficult to actually have strong enforcement and strong accountability.”
Reputational Risk
Organizations do face the risk of alienating their customer base through secrecy and intrusion, triggering the "creep factor" when their practices cross the line into what the marketplace deems invasive.
“The use of visual images increases the risk that we will use information in an inappropriate way, that we will use information and predictive models that are unreliable, and that we will suffer not compliance risk but more reputational risk,” Abrams says.
Pam Dixon, founder and executive director of the World Privacy Forum, a public interest group, says that inappropriate uses are creeping up on consumers because they don't yet recognize how powerful a simple faceprint can be.
“It really sounds like Facebook is building an enormous photo/facial recognition database, and Google's in the same boat,” Dixon says. “A face is like a fingerprint. It's a unique biometric that's yours for life, and we have to be a lot more careful about how we're handing these things out and how we're using them—if Facebook or Google asked us for a fingerprint to be kept on file, a lot of Americans would say, 'Forget it.'”