A lawsuit against the New York State Education Department is looking to dismantle a $3 million facial recognition system in schools, citing student privacy concerns and the technology's issues with racial and gender bias.
On Monday, the New York Civil Liberties Union sued the education department over its approval of facial recognition for the Lockport School District, one of the first US public school systems to use the technology on students and staff. The department approved the installation last November, and the system was activated in January.
The lawsuit (pdf), filed on behalf of parents of Lockport students, argues that the facial recognition system used at Lockport's schools retains biometric data on students and violates the state's privacy protections under New York's Education Law. The state's education department originally approved facial recognition for Lockport's schools because it believed the technology would properly protect students' privacy.
The NYCLU and parents of Lockport students are seeking to have the system deactivated and removed from schools. The education department said it does not comment on pending litigation.
"The Lockport facial recognition surveillance system was the product of a Board of Education falling for the sweet talk of a salesman who misrepresented himself as an independent security expert," said Jim Shultz, a Lockport parent and plaintiff in the suit. "Neither the school district or state education officials gave a thought to the radical impact this would have on student privacy."
Schools have increasingly turned to technology as a solution for student safety -- using tools like facial recognition and social media monitoring to seek out potential threats. Several facial recognition companies have marketed their technology as a way to detect intruders and stop shooting threats.
In 2019, Lockport schools superintendent Michelle Bradley told parents the district was installing facial recognition to deal with school shootings, and the district has since spent more than $3 million on security upgrades, nearly half of it on facial recognition. That money came from New York's Smart Schools Bond Act, which was supposed to fund educational technology.
In court documents, parents expressed frustration that the school district spent millions on facial recognition rather than on funds to help students connect online during the coronavirus pandemic.
The pandemic has forced students to attend classes online, but that's been a challenge for students without access to stable internet or compatible devices.
"Lockport should have spent the funds it received to purchase and install a face recognition system on actual educational programs and instructional technology," said Renee Cheatham, a plaintiff in the lawsuit and a Lockport parent. "Neighboring districts invested their Smart Schools Bond Act money in iPads and faster internet, while Lockport bought spy cameras."
A CNET investigation found that facial recognition providers don't believe their products can effectively prevent school shootings, even as the industry flocks to sell this technology to educators.
The NYCLU is hoping to take this technology out of the hallways for a school district that has approximately 4,600 students from kindergarten to 12th grade.
Lockport's schools won approval for facial recognition by arguing that the system doesn't keep a database of every student -- and therefore holds no data that could violate the state's student privacy laws. Instead, the system works from a database of known threats, which could include sex offenders or suspended staff.
The state education department approved the system once the school district agreed to remove any students from that database, according to the lawsuit. But that decision failed to account for false matches against the known-threats list.
While it doesn't keep the photos of every student that it scans, the facial recognition system at Lockport's schools is constantly scanning everyone, from children as young as 5 to high school seniors, according to the complaint. Whenever it makes an identification -- whether or not it's a proper match -- it retains the data from that facial recognition scan.
The lawsuit notes that misidentification is a serious concern with facial recognition, as the technology is known to have gender and racial bias. More than a quarter of the school district's students are people of color, and half of the students are female, the lawsuit said.
"Despite district claims that the system will not catalog students, the technology and nature of the data collected do not allow for students to remain anonymous," said Beth Haroules, an NYCLU senior staff attorney. "The software is also inaccurate and is especially likely to misidentify women, young people, and people of color, disproportionately exposing them to the risks of misidentification and law enforcement."