San Francisco tapping AI to reduce racial bias in criminal charging decisions

The technology will help Lady Justice keep her blindfold on, says San Francisco District Attorney George Gascón.

Corinne Reichert Senior Writer
San Francisco wants implicit bias removed from criminal charging decisions.

James Martin/CNET

San Francisco District Attorney George Gascón says the city is using artificial intelligence to remove racial bias from the process of deciding whom to charge with crimes. A new AI tool scans police reports and automatically redacts any race information.

It's part of an effort to remove implicit bias caused by social conditioning and learned associations, Gascón's office said in a press release Wednesday.

"Lady Justice is depicted wearing a blindfold to signify impartiality of the law, but it is blindingly clear that the criminal justice system remains biased when it comes to race," Gascón said. "This technology will reduce the threat that implicit bias poses to the purity of decisions."

Stage one of the tool, bias mitigation review, removes details that could be connected to race, such as officer, witness and suspect names; officer star numbers; specific neighborhoods and districts; and hair and eye color.
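
The article doesn't describe the tool's internals, but the stage-one redaction step can be illustrated with a minimal sketch. The term lists, placeholder labels, and `redact` function below are hypothetical examples for illustration only; a real system would identify entities with natural-language processing rather than a fixed word list.

```python
import re

# Hypothetical term lists standing in for the details the article says
# are removed: names, neighborhoods, and physical descriptors.
REDACT_TERMS = {
    "name": ["Officer Smith", "Jane Doe"],
    "neighborhood": ["Bayview", "Tenderloin"],
    "descriptor": ["brown hair", "blue eyes"],
}

def redact(report: str) -> str:
    """Replace race-correlated details with neutral placeholders."""
    for category, terms in REDACT_TERMS.items():
        placeholder = f"[{category.upper()}]"
        for term in terms:
            # Case-insensitive literal match; re.escape guards any
            # regex metacharacters in the term itself.
            report = re.sub(re.escape(term), placeholder, report,
                            flags=re.IGNORECASE)
    return report

print(redact("Officer Smith saw a suspect with brown hair in Bayview."))
# [NAME] saw a suspect with [DESCRIPTOR] in [NEIGHBORHOOD].
```

In this sketch, prosecutors making a stage-one charging decision would see only the placeholder version of the report.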

Once prosecutors record a preliminary charge, they'll gain access to the unredacted incident report and body camera footage. In this second stage, called full review, prosecutors will be required to record whether the unredacted information led to any changes in their charges, and why.

This information will be used to refine the AI tool, which is set to be fully implemented by the SFDA's general felonies teams beginning July 1, 2019.

The tool, earlier reported by the San Francisco Chronicle, was created by the Stanford Computational Policy Lab at no cost to Gascón's office.

The tool's lead developer, Stanford assistant professor Sharad Goel, said it'll "reduce unnecessary incarceration."

San Francisco barred its police officers from using facial recognition in May, citing a breach of citizens' civil liberties.