Lockport, New York's planned use of facial recognition technology in its public schools continues to generate controversy.

The district acquired the technology, including the AEGIS facial matching system, last year, according to the New York Civil Liberties Union.

Last week, a NYCLU representative told Legaltech News that after the group objected “to deploying this surveillance technology without any privacy policy in place, the Lockport School District drafted a privacy policy, and the state education department has called for a privacy assessment.”

School officials could not be reached for comment by Legaltech News about their plans for the technology. School Superintendent Michelle Bradley told The Lockport Union-Sun & Journal in late December that the district had not decided when to implement its facial recognition software.

But the NYCLU continues to scrutinize the district's plans for the technology.

“We are exploring legislative options,” the NYCLU representative said when asked about the group's next move. “New York state currently has no regulations to govern the use of this technology. The New York State Education Department and the state Legislature should prohibit its use in public schools so that students are not subjected to this invasive surveillance technology.”

Beyond the price tag, the NYCLU was concerned about such issues as:

  • Once an image is captured and uploaded, the system can track the person's movements around the school over the previous 60 days;
  • Students meeting with a counselor or school clinic staff would be entered into the system;
  • The technology is “notoriously inaccurate, especially when it comes to identifying women, young people and people of color”; and
  • It is unclear who will have access to the database.

“The Family Educational Rights and Privacy Act obligates schools to protect student educational records, and the use of this surveillance technology in schools raises serious questions about the readiness of schools to protect sensitive biometric information,” the NYCLU representative said.

The school board's draft policy explains that Lockport is “responsible for protecting the overall safety and welfare of the district's students, staff, properties, and visitors as well as deter theft, violence and other criminal activity on District property.” The policy says it establishes “parameters for the operation of security systems and protection of privacy.”

The “input and maintenance of personally identifiable information in the district's security systems will be limited to individuals who present an immediate or potential threat to the safety of the school community,” the policy further explains. These may include students who have been suspended from school, staff who have been suspended or are on administrative leave, Level 2 or 3 sex offenders, and others.

Moreover, information from security systems “may be shared with law enforcement or other governmental authorities in response to an immediate threat or as required by law,” the policy adds.

The NYCLU is concerned that other school districts could also install such technology. Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia, has raised similar concerns.

“Facial recognition technology in schools is an example of 'security theater' gone wrong,” Ferguson told Legaltech News. “Not only will the technology be largely ineffective, but the … cost of spending that money on cameras and software rather than teachers and students borders on unconscionable. We all want schools to be safe, but to take advantage of the fear from recent school shootings and other tragedies to sell surveillance against students is not in anyone's interest.”

Such technology “is not ready for prime time,” Ferguson added. “The number of false matches, incomplete identifications, and other problems is large enough that school districts should demand large-scale testing before any purchase. And, worse, the companies know about the limitations but are using students as their training data to improve the computer models.”

In addition, because the system is aimed mostly at students, “I would be worried about how the images will be used in addition to school safety,” Ferguson added. “Who owns the data? What can be done with the student images? Companies know that facial recognition matches are the identifications of the future. Once you have an image and a name, you can track people in many more places. … So, I would be very cautious of having young people—many not legally able to consent—give up their images to companies and schools to monetize their data.”

NYCLU education counsel Stefanie Coyle said in a statement to Legaltech News, “The Lockport City School District should not have spent millions of dollars on invasive technology that will not make schools safer.”

“The draft policy—which should have been created with public input before the technology was installed—has no meaningful limits on sharing facial recognition information with law enforcement or 'governmental authorities,' which may cause some parents to be fearful about sending their children to school,” she said. “It establishes no protocols to guard against harm if the system produces a false match, and in too many situations it would allow the data to be stored longer than the proposed 60 days. The Lockport School District should allow for public comment on this policy, and take seriously the concerns of parents, students and community members before this kind of notoriously inaccurate and biased surveillance technology is let loose on children.”