Video cameras monitoring halls of a high school in Ohio.
MEGAN JELINGER/AFP via Getty Images

Facial recognition in schools: Even supporters say it won't stop shootings

The facial recognition industry is starting to see that its promises offer a false sense of security.

After a school shooting in Parkland, Florida, left 17 people dead, RealNetworks decided to make its facial recognition technology available for free to schools across the US and Canada. If school officials could detect strangers on their campuses, they might be able to stop shooters before they got to a classroom.

Anxious to keep children safe from gun violence, thousands of schools reached out with interest. Dozens started using SAFR, RealNetworks' facial recognition system.

RealNetworks, the streaming media company, says working with those schools has taught it an important lesson: Facial recognition isn't likely to be an effective tool for preventing shootings.

"The vast majority of school shootings are carried out by people that you wouldn't necessarily put on a watchlist, that you wouldn't be looking out for," said Mike Vance, SAFR's senior director of product management. "You have to know who you're looking out for."

If schools don't know who a likely shooter is, the company says, its software doesn't know who to find.
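
That limitation is built into how watchlist matching generally works. Here's a minimal sketch; the function names, threshold and embedding step are illustrative assumptions, not SAFR's actual implementation. A camera frame is reduced to a numerical "embedding" of the face, compared against the embeddings of everyone enrolled on a watchlist, and flagged only if it matches someone already on the list.

```python
import numpy as np

# Illustrative sketch of watchlist-based face matching (not any vendor's
# real API). A production system would generate embeddings with a trained
# face-recognition model; here they're just vectors passed in.

SIMILARITY_THRESHOLD = 0.6  # assumed tuning value, not a vendor default


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the name of the best watchlist match, or None if nobody matches.

    watchlist maps each enrolled person's name to their stored embedding.
    """
    best_name, best_score = None, 0.0
    for name, enrolled in watchlist.items():
        score = cosine_similarity(face_embedding, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= SIMILARITY_THRESHOLD:
        return best_name  # flagged: this person is on the watchlist
    return None  # unknown face: no alert is raised
```

A first-time attacker who was never enrolled falls through to that final return None. That's the gap Vance describes: the software can only find people it has already been told to look for.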

As the second anniversary of the Florida shooting approaches, the surveillance industry is facing the reality that facial recognition isn't well-equipped for the challenges of preventing school shootings. AI experts and privacy advocates have long argued this point. Now, facial recognition companies are starting to understand it, too.

Facial recognition companies have flocked to schools to pitch their products, often invoking tragedies such as Columbine, Sandy Hook and Parkland. Those pitches have been effective as school boards look to adopt facial recognition in their hallways, hoping the technology will help save students' lives. 


The education sector spent $2.7 billion on security systems in 2017, the year before the Parkland shooting, according to IHS Markit. That figure is expected to grow by another $100 million by 2021.

In New York, the Lockport school district spent roughly $1.4 million to install facial recognition technology in its schools. The money, which paid for 300 cameras, servers and software, had initially been allocated for educational technology for students through the Smart Schools Bond Act, a New York bill passed in 2014.

"Much to our dismay, school shootings continue to occur in our country. In many cases, these shootings involve students connected to the schools where these horrific incidents occur," Michelle Bradley, the superintendent of Lockport schools, said in a now-removed post on the district's website explaining the decision. "The Lockport city school district continues to make school security a priority."

The school district's facial recognition system went online in January, despite protests against the tool and pending state legislation that would impose a moratorium on the technology until 2022. The district's board of education didn't respond to a request for comment. Several privacy advocates have likewise called for a moratorium on facial recognition until its effects and capabilities have been fully researched.

Public opinion has also swung in favor of facial recognition in schools, despite little evidence of its effectiveness. In November, the privacy watchdog group ProPrivacy found that 54% of survey respondents said they were comfortable with using facial recognition in schools. 


SAFR started offering its facial recognition tools to schools for free after the Parkland school shooting in 2018.

RealNetworks

After working with schools, however, RealNetworks' Vance is concerned that facial recognition provides educators with a false sense of security. 

"You shouldn't say facial recognition can prevent school shootings because that's really overpromising," Vance said. "You still need to have good policy. You still need to engage with the community to make sure you know what's going on with the kids." 

Vance isn't the only security executive to come to that conclusion.

When Lisa Falzone, CEO of weapons detection company Athena Security, was researching how to use AI to help stop school shootings, she said she quickly saw issues with facial recognition. 

Her company's product is used by a school in Pennsylvania and a mosque in New Zealand. Athena Security is focused on detecting weapons rather than faces. That means it doesn't rely on a watchlist, like facial recognition does. 

"We didn't see facial recognition as a good way to solve the problem," Falzone said. "When we looked more into it, we saw a lot of privacy issues."

Not all facial recognition companies have reached the conclusion that RealNetworks and Athena have, though that may change as more data becomes available. 

Trueface is a facial recognition company whose technology is used in schools and on Air Force bases. But CEO Shaun Moore says the company has begun focusing on weapons detection capabilities, in part because of privacy concerns around detecting people's faces.

Moore says he hasn't seen enough data to decide whether facial recognition is or isn't an effective tool for stopping school shootings. But he's begun to think it may not be. 

"We're leaning toward that direction," Moore said, referring to facial recognition's limitations in stopping shootings. "But I still think that having facial recognition in play is an effective deterrent or arguably could prevent people from wanting to do harm."

False positives

Plenty of studies have raised concerns about flaws in facial recognition's accuracy, as well as its effects on privacy. 

Last December, the US National Institute of Standards and Technology found that a majority of facial recognition systems misidentified people of color more often than white people, with error rates 10 to 100 times higher depending on the algorithm.

If facial recognition is deployed at schools, the error rate suggests students of color would be falsely flagged more often than white students. 
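
To get a feel for what those error rates could mean at school scale, here's a rough back-of-the-envelope calculation. Every number in it (school size, watchlist size, base false-match rate) is an assumption chosen for illustration, not a figure from the NIST study.

```python
# Back-of-the-envelope math with assumed numbers, not NIST data.
# Each student walking past a camera is compared against every
# watchlist entry, so false matches scale with daily volume.

students_screened_per_day = 1_000   # assumed school size
watchlist_size = 20                 # assumed number of enrolled threats
base_false_match_rate = 1e-4        # assumed rate for the best-served group
disparity_factor = 10               # low end of NIST's 10x-100x range

comparisons_per_day = students_screened_per_day * watchlist_size
false_alerts_baseline = comparisons_per_day * base_false_match_rate
false_alerts_disparate = false_alerts_baseline * disparity_factor

print(f"Comparisons per day: {comparisons_per_day:,}")                    # 20,000
print(f"False alerts/day, baseline group: {false_alerts_baseline:.0f}")   # 2
print(f"False alerts/day, 10x disparity:  {false_alerts_disparate:.0f}")  # 20
```

Under those assumed numbers, a system that wrongly flags two students from the baseline group each day would wrongly flag 20 students of color, or 200 at the high end of NIST's range.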

Organizations such as Human Rights Watch, the American Civil Liberties Union and the Electronic Frontier Foundation have warned that facial recognition in schools threatens student privacy. US lawmakers have raised privacy concerns surrounding the technology, warning that facial recognition could have a chilling effect on free speech through its constant surveillance and identifying abilities. 

Another issue is that schools often justify installing facial recognition by citing the prevention of shootings, then use it for other purposes.

The Texas City Independent School District adopted facial recognition after a 2018 shooting at Santa Fe High School left 10 people dead and 10 wounded.

"I will point out that not all safety initiatives are tied to school shooters, and facial recognition is much more than that," Melissa Tortorici, the school district's director of communications, said in a statement. "Saying that's the primary reason we adopted facial recognition is not accurate."

The school district turned to AnyVision, an Israeli facial recognition company whose marketing to schools uses shootings as a main selling point. Its marketing material has presented images from the Columbine, Sandy Hook and Parkland school shootings.


In marketing material tied to AnyVision, the company used images from multiple school shootings to promote its product. 

AnyVision / HPE

Microsoft is auditing the company over its use of facial recognition on Palestinians, after the tech giant invested in AnyVision as part of a $74 million funding round. The software giant is investigating whether AnyVision breached its promise not to use facial recognition in ways that violate human rights.

AnyVision didn't respond to a request for comment, and Microsoft said the audit is still underway.

Tortorici said the school district has used facial recognition to remove a banned student from campus, though she didn't disclose details of the incident. The district also used the technology to keep track of a fired employee who had made threats against supervisors, she said.

The system was later used by the district's executive director of security to ban a woman from campus after a dispute, according to Wired. When she called him an asshole at a school board meeting, he placed her image in the facial recognition system's watchlist. Tortorici said anyone who uses "threatening language toward staff or students" would be added to the system.

"This brings up all the issues of surveillance technologies and the normalization of being watched and tracked," said Brenda Leong, senior counsel and director of AI and ethics at the Future of Privacy Forum. "The more we do that, the harder it is for students to express themselves."

Face flat 

When facial recognition went live in the Lockport school district, Superintendent Bradley explained that it would match visitors against a database the district maintained. 

To protect the privacy of students and staff, the database would contain photos of only a limited number of people considered potential threats: staff who had been suspended, sex offenders and people barred from school grounds by court order, among others.

The watchlist system is exactly why facial recognition would fall short in preventing school shootings, RealNetworks' Vance said. 

For the technology to work, the school would have to already know that someone might intend to attack the campus. But school shootings have historically been carried out in ways that neither technology nor administrators would see coming.

"I don't want to say it can never be used, but I think the majority of cases that you see, you just wouldn't predict that the person coming on campus would do anything wrong," Vance said. "We don't view this as a 'solution' to school shootings, we look at it as a school safety tool." 

Vance says facial recognition could make it easier for approved parents to pick up their kids after class, or could help with day-to-day school challenges, like figuring out which students were involved in a fight in the parking lot.

He's found that schools have been primarily using facial recognition for logistics, not for preventing mass shootings, as companies advertised. 

Fear sells

Even though facial recognition's main selling point isn't as sturdy as advertised, the technology is still being installed in schools across the country. Part of the appeal comes from the fear-driven marketing that facial recognition companies use, experts said.

"It's a terrifying problem to know that somebody can break into your child's school and start shooting people," the Future of Privacy Forum's Leong said. "Most people don't know how facial recognition works, and they want to feel like they did something."

As companies scramble to sell their facial recognition technology, concerns mount that funds are being diverted from educational uses toward an unreliable technology. 

"For every camera system you're buying, you're not paying for counseling that might help that troubled youth figure out what's going on in their lives," said Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia and author of The Rise of Big Data Policing. "The $1 million is better spent on building an educational environment with a socio-emotional learning environment than a surveillance state."

Privacy advocates worry facial recognition is creating a surveillance state in schools, and they're not the only ones. The facial recognition companies are starting to see it, too. 

"Schools are supposed to be open environments, and they're supposed to feel welcoming," RealNetworks' Vance said. "Short of creating these mini prisons, that's just not the right environment that people are trying to create in education."