In a scene in the 2002 movie “Minority Report,” set in 2054, Tom Cruise's character walks through a subway station lined with camera lenses and talking digital billboards. “John Anderton!” one of them calls out to him. “You could use a Guinness right about now.”

Such a setup will be technologically feasible much sooner than 2054. Already, digital billboards and signs scan faces to gather demographic information for marketing purposes. A wide array of companies, from Adidas to Whole Foods, has rolled out systems that predict gender and age and then deliver targeted ads. Intel says its AIM (Audience Impression Metrics) suite of software, which powers the Adidas and Whole Foods systems, can predict gender with 94 percent accuracy and age range with 90 percent accuracy.

Facebook, whose users upload around 200 million photos every day, boasts perhaps the widest reach of any facial recognition deployment. Unlike facial detection software, which merely notes that a face is present, facial recognition software matches images to identities and tends to store data, presenting clearer privacy concerns. The social network uses the technology to recognize faces in photos and suggest who should be tagged in them.
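
The distinction is concrete enough to sketch in code. The following minimal illustration uses the open-source OpenCV library; the input image is a hypothetical placeholder, and this is not Facebook's system, only a contrast between the two operations:

    # Facial detection vs. facial recognition with OpenCV.
    # Requires opencv-contrib-python; "crowd.jpg" is a hypothetical input image.
    import cv2

    # Detection: finds that faces are present, but learns nothing about who.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(cv2.imread("crowd.jpg"), cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"{len(faces)} faces detected")  # a count; no identities, nothing stored

    # Recognition: matches each face against a stored database of labeled faces.
    # That labeled training set is the privacy-sensitive part; known_faces and
    # known_labels below are hypothetical placeholders for such a database.
    recognizer = cv2.face.LBPHFaceRecognizer_create()
    # recognizer.train(known_faces, known_labels)
    # for (x, y, w, h) in faces:
    #     label, distance = recognizer.predict(gray[y:y+h, x:x+w])

The asymmetry is the point: detection needs no stored personal data, while recognition cannot work without it.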

As databases, algorithms and camera lenses improve, facial recognition accuracy is climbing. In tests performed in controlled settings by the National Institute of Standards and Technology, false-negative rates fell from 79 percent in 1993 to 0.3 percent in 2010, roughly halving every two years.

Researchers are also developing algorithms to identify facial expressions and moods, so it isn't a leap to imagine a future in which a personalized billboard could identify you.

Power of Information

In 2009, researchers at Carnegie Mellon University (CMU) started with webcam images of people on campus and off-the-shelf facial recognition software. Using publicly available Facebook photos, they matched a third of the people in those images to their Facebook profiles, revealing personal information including names, dates of birth, workplaces, schools, friends, sexual orientation and interests. Then they took it a step further: Using that information, the researchers were able to use an algorithm to predict the first five digits of a person's Social Security number with 27 percent accuracy.
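
The Social Security step worked because, until the Social Security Administration randomized assignment in June 2011, the first three digits of an SSN (the "area number") were tied to the state where it was issued and the next two (the "group number") were assigned in a published sequence over time, so a birth state and date harvested from a profile narrow the first five digits considerably. A rough sketch of that narrowing follows; the area-number ranges are real, but the rest is illustrative and not the CMU algorithm:

    # Sketch of the pre-2011 SSN structure (illustrative; not the CMU algorithm).
    # The area-number ranges are from the SSA's published state table; the group
    # numbers "in use" would come from the SSA's monthly high-group lists for the
    # person's approximate birth date (assumed known from a Facebook profile).
    AREA_NUMBERS = {
        "Pennsylvania": range(159, 212),  # areas 159-211
        "Oregon": range(540, 545),        # areas 540-544
    }

    def candidate_prefixes(state, groups_in_use):
        """Enumerate plausible first-five-digit prefixes for a birth state."""
        return [f"{area:03d}{group:02d}"
                for area in AREA_NUMBERS[state]
                for group in groups_in_use]

    # Someone born in Oregon while only three group numbers were active has
    # just 5 areas x 3 groups = 15 candidate five-digit prefixes.
    print(candidate_prefixes("Oregon", [1, 3, 5]))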

Their point? When facial recognition, social networks, data mining and cloud computing converge, it's possible—even within the limitations of today's technology—to start with an anonymous face and end up with highly sensitive personal information.

“Privacy is much less about control of a person's information and much more about the control that others can have over you if they have sufficient information about you,” said Alessandro Acquisti, one of the CMU researchers, at a December 2011 Federal Trade Commission (FTC) forum on the development of facial recognition technology and its privacy concerns.

The possibilities the CMU studies presented have many privacy watchers concerned that, left unchecked, facial recognition technology will destroy anonymity in public spaces and raise privacy and security concerns over the collection, sharing and storage of data. The FTC is poised to frame the conversation about how best to address these risks.

“The nature of what we can capture about individuals is expanding faster than our ability to think about whether it's prudent to do so,” says Martin Abrams, president of the Centre for Information Policy Leadership, a think tank housed at the law firm Hunton & Williams in Washington, D.C.

Patchwork Privacy

Citing the difficulty of creating black-letter law for emerging and rapidly developing technologies, industries that use facial detection and recognition are pushing for self-regulation.

Facial recognition would likely be addressed in a federal omnibus data privacy bill, and several have been introduced in Congress in recent years. But with more immediate problems looming, it will likely be years before one passes.

In its absence is a patchwork of state laws, such as the Illinois and Texas statutes that address biometric privacy, and industry-specific laws that govern the use of personal information, such as the banking industry's Gramm-Leach-Bliley Act and the Fair Credit Reporting Act.

“There's nothing in any federal law right now that would prevent a party from taking a picture of someone in public and running it through facial recognition software,” says Craig Hoffman, a partner in Baker Hostetler's Privacy, Security & Social Media group.

The FTC's enforcement powers under Section 5 of the FTC Act allow it to bring complaints alleging unfair or deceptive trade practices against organizations that sign on to industry codes of conduct and then fail to abide by them. But companies that never agree to such codes effectively face no regulatory or legal risk.

“Most likely, what you would expect the FTC to say is what they're saying about the online advertising industry: For now, self-regulation is working, but we encourage you to create some industry codes of conduct that you follow and enforce,” Hoffman says.

The Digital Signage Federation has adopted privacy standards written by the Center for Democracy & Technology (CDT) that incorporate the eight Fair Information Practice Principles. The standards call for an opt-out approach to facial detection (in most cases, simply walking away) and an informed opt-in approach to the thornier issue of facial recognition.

But as he described the standards at the FTC workshop, CDT Policy Counsel Harley Geiger conceded, “As everybody in the room, I'm sure, knows, self-regulation—it's very difficult to actually have strong enforcement and strong accountability.”

Reputational Risk

Organizations do face the risk of alienating their customer base through secrecy and intrusion; crossing the line into what the marketplace deems creepy gives rise to the “creep factor.”

“The use of visual images increases the risk that we will use information in an inappropriate way, that we will use information and predictive models that are unreliable, and that we will suffer not compliance risk but more reputational risk,” Abrams says.

Pam Dixon, founder and executive director of the World Privacy Forum, a public interest group, says that inappropriate uses are creeping up on consumers because they don't yet recognize how powerful a simple faceprint can be.

“It really sounds like Facebook is building an enormous photo/facial recognition database, and Google's in the same boat,” Dixon says. “A face is like a fingerprint. It's a unique biometric that's yours for life, and we have to be a lot more careful about how we're handing these things out and how we're using them—if Facebook or Google asked us for a fingerprint to be kept on file, a lot of Americans would say, 'Forget it.'”