Clearview AI has spent months amassing a facial recognition database of more than 3 billion photos of people gathered from the internet. Now it faces a lawsuit in Illinois for taking those photos without people's consent.
Clearview AI was forced out of the shadows by a profile in The New York Times in January, which detailed how the company planned to use facial recognition to identify people in real time. It's able to do that through its massive database of people's photos, gathered from platforms like Instagram, YouTube and LinkedIn without those people's consent.
"Clearview's practices are exactly the threat to privacy that the legislature intended to address, and demonstrate why states across the country should adopt legal protections like the ones in Illinois," the ACLU said in a statement.
Clearview AI didn't respond to a request for comment.
The ACLU said it is suing Clearview AI on behalf of organizations that represent survivors of sexual assault and domestic violence, as well as undocumented immigrants. It stated that the surveillance technology offered by Clearview AI could enable abusive partners and government agencies to track and target vulnerable communities.
The ACLU is working with the law firm Edelson PC, which also had a hand in the Facebook facial recognition lawsuit settled in January. The lawsuit is seeking a court order in Illinois to force Clearview AI to delete photos of Illinois residents gathered without consent, and to stop gathering new photos until it complies with the state's law.
If the lawsuit succeeds, this protection would apply only to residents of Illinois, as other US states don't have comparable biometrics laws. Clearview AI has a "privacy request form" with a special section to opt out for residents of Illinois and California, which has its own state privacy law.
But Illinois' law specifically requires companies to get consent first, rather than requiring people to request exclusion. The opt-out process also requires people to upload a photo of themselves before their images can be deleted from the database.
"If allowed, Clearview will destroy our rights to anonymity and privacy — and the safety and security that both bring," the ACLU said. "People can change their names and addresses to shield their whereabouts and identities from individuals who seek to harm them, but they can't change their faces."