The recent disclosure that the FBI and ICE have been searching state DMV databases should alarm everyone. According to the Washington Post, researchers have determined that FBI and immigration officials routinely use facial recognition software to search millions of DMV photographs, and most of these searches are initiated by nothing more than an email from a federal agent to a local contact, with no other oversight. With over 641 million photographs in local, state and federal databases nationwide, police can compare a photo against almost everyone in the United States with few guidelines or protections. This is frightening.

Although police have historically had access to fingerprints, DNA and photos for searches, including facial recognition, that data consisted primarily of information obtained in the course of criminal arrests or investigations. In theory at least, there should have been probable cause or another appropriate basis for collecting those fingerprints, DNA samples or photos, and the inclusion of that data would have been transparent and subject to review.

The addition of DMV databases, however, expands the pool of potential “suspects” to the vast majority of adult Americans, for searches that are largely unregulated and conducted without the consent of those photographed, most of whom have never been charged with a crime. In fact, a congressional hearing held in May on this issue highlighted the lack of legislative approval at the state or federal level before the FBI or ICE was allowed to access state DMV databases. In particular, for states that encourage undocumented immigrants to obtain driver’s licenses or other identification, it is an egregious breach of trust to then turn around and provide those images to ICE.

Beyond the obvious privacy and civil rights implications, the technology itself is only about 86% accurate. Its success rate depends on a number of factors, including the lighting of the subject’s face, the percentage of the face that is visible and the image quality. Notably, the software is strikingly less accurate for persons of color, yet it is being applied across the board.

Even more concerning is the potential for abuse. A recently released report described an incident in which the New York Police Department submitted a low-quality surveillance photo from a robbery to its facial recognition system. When the photo failed to return any matches, the investigating officers submitted a high-quality photo of the actor Woody Harrelson, because the surveillance photo looked a little like him. The system returned a match, a man who was not Harrelson, and that man was later arrested for larceny. The report noted that the use of alternate photos, including unrelated celebrity images, computer-generated faces and composite sketches, is becoming more common. Not surprisingly, such substitute images frequently return inaccurate results.

Once a match is found, there is limited guidance as to what law enforcement action may be taken. Is the match alone enough to arrest someone for the commission of a crime? Maybe not, but we seem to be getting perilously close to that line. Although most law enforcement agencies do not (yet) treat a facial recognition match as a positive identification for arrest purposes, the report noted instances in which suspects were apprehended and placed in a lineup solely on the basis of a facial recognition result, or in which a suspect’s photo was shown to witnesses by itself, rather than in a sequential array, with the question “is this the guy?” Given what is already known about the bias that suggestive photo procedures can introduce, this technology appears likely to lead to misidentifications.

Facial recognition technology is evolving rapidly and is already being marketed commercially by Amazon to police departments across the country. Before it spreads further, we urge lawmakers and law enforcement to pause and put together a framework that will protect civil liberties and account for the documented inaccuracies of this technology, particularly its poorer performance for persons of color. Regulations should standardize photo quality, restrict the use of composite sketches and substitute celebrity images, and provide specific guidance as to what law enforcement action may be taken on the basis of a facial recognition match.

Technology moves at warp speed. Lawmakers need to act now. We are already behind.