Those results are troubling for the facial recognition industry, which has been scrambling to develop algorithms that can identify people through their eyes and nose alone as people turn to face masks amid the coronavirus pandemic.
Facial recognition algorithms rely on getting as many data points on a person's image as possible, and face masks tend to take away a lot of valuable identifying information. The algorithms are already finicky enough that improper lighting or a bad angle can fool the technology, and masks make matters worse, the study found.
The study tested 89 facial recognition algorithms against face masks. One algorithm's error rate, normally 0.3%, surged to 5% when presented with images of people wearing masks, the study found.
The test looked at the algorithms' "one-to-one" matching capabilities -- essentially comparing one photo of a person to a different photo of the same person, this time wearing a mask. NIST used 6 million images for its research and applied the masks digitally, in different variations of the coverings.
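A toy sketch can illustrate how one-to-one verification typically works. Real systems use learned neural-network embeddings; the vectors, distance threshold, and function names below are invented for illustration only. The idea is that each photo is reduced to a feature vector, and two photos "match" when their vectors are close enough -- a mask removes nose and mouth features, which can push the vector past the threshold and cause a false non-match.

```python
import math

def euclidean(a, b):
    """Distance between two feature vectors (embeddings)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def same_person(emb_a, emb_b, threshold=0.6):
    """One-to-one match: declare the photos the same person
    when their embeddings are closer than the threshold.
    The threshold value here is arbitrary, for illustration."""
    return euclidean(emb_a, emb_b) < threshold

# Made-up embeddings: an unmasked photo, a second unmasked photo of
# the same person, and a masked photo whose vector has "drifted"
# because the mask hides identifying features.
unmasked       = [0.10, 0.80, 0.30]
unmasked_again = [0.12, 0.78, 0.31]
masked         = [0.55, 0.40, 0.30]

print(same_person(unmasked, unmasked_again))  # close vectors: a match
print(same_person(unmasked, masked))          # drifted vector: rejected
```

This is only a schematic of the comparison step; it says nothing about how real embeddings are computed, which is where the mask-induced accuracy loss the study measured actually originates.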
The study also found that the more of the nose that was covered, the more likely the mask was to stymie the algorithms. Black masks were also more likely to fool the algorithms than blue ones, the research showed.
NIST said this was the first of a series of tests for facial recognition and face masks. The agency plans to test algorithms that were created specifically for coverings later this summer.
"With the arrival of the pandemic, we need to understand how face recognition technology deals with masked faces," said Mei Ngan, a NIST researcher behind the report. "We have begun by focusing on how an algorithm developed before the pandemic might be affected by subjects wearing face masks."
Ngan said NIST expects algorithms to improve at detecting people wearing face masks. The study also acknowledged limitations because the masks used in the tests were added digitally rather than real, physical coverings.
That meant key differences were missing, such as the imperfect fit of a physical mask compared with a digitally edited photo, or the varied textures and patterns that real masks have.
"We can draw a few broad conclusions from the results, but there are caveats," Ngan said. "None of these algorithms were designed to handle face masks, and the masks we used are digital creations, not the real thing."
Facial recognition researchers have been compiling photos of people wearing face masks as data for their algorithms to learn from -- in some cases, without people's knowledge.
The NIST study used photos of people applying for immigration benefits and digitally altered mask photos of travelers entering the US, according to the report.