
San Francisco tapping AI to reduce racial bias in making criminal charges

The technology will help Lady Justice keep her blindfold on, says San Francisco District Attorney George Gascón.

Corinne Reichert, Senior Editor

San Francisco wants implicit bias removed from criminal charging decisions.

James Martin/CNET

San Francisco District Attorney George Gascón says the city is using artificial intelligence to remove racial bias from the process of deciding whom to charge with crimes. A new AI tool scans police reports and automatically redacts any race information.

It's part of an effort to remove implicit bias caused by social conditioning and learned associations, Gascón's office said in a press release Wednesday.

"Lady Justice is depicted wearing a blindfold to signify impartiality of the law, but it is blindingly clear that the criminal justice system remains biased when it comes to race," Gascón said. "This technology will reduce the threat that implicit bias poses to the purity of decisions."

Stage one of the tool, bias mitigation review, removes details that could be connected to race, such as officer, witness and suspect names; officer star numbers; specific neighborhoods and districts; and hair and eye color.

Once investigators record a preliminary charge, they'll gain access to the unredacted incident report and body camera footage. This second stage is called full review, and prosecutors will be required to record why the unredacted information led to any changes in their charges.

This information will be used to refine the AI tool, which is set to be fully implemented by the SFDA's general felonies teams starting July 1, 2019.
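The first-stage "bias mitigation review" described above amounts to scrubbing race-correlated details from free-text reports. As a rough illustration only (the Stanford tool's actual methods and data fields haven't been published alongside this announcement, so the patterns, placeholder labels and sample lists below are hypothetical), the idea could be sketched like this:

```python
import re

# Hypothetical sketch of a stage-one redaction pass. The categories
# (names, officer star numbers, neighborhoods, hair and eye color) come
# from the DA office's description; everything else here is illustrative.
REDACTION_PATTERNS = {
    "star_number": re.compile(r"\bstar\s*#?\s*\d+\b", re.IGNORECASE),
    "hair_color": re.compile(
        r"\b(black|brown|blond|blonde|red|gray)\s+hair\b", re.IGNORECASE),
    "eye_color": re.compile(
        r"\b(brown|blue|green|hazel)\s+eyes\b", re.IGNORECASE),
}

# In practice these would come from the report's structured metadata,
# not hard-coded lists.
KNOWN_NAMES = {"John Doe", "Jane Roe"}
KNOWN_NEIGHBORHOODS = {"Bayview", "Tenderloin"}


def redact(report: str) -> str:
    """Replace race-correlated details with neutral placeholders."""
    for label, pattern in REDACTION_PATTERNS.items():
        report = pattern.sub(f"[{label.upper()} REDACTED]", report)
    for name in KNOWN_NAMES:
        report = report.replace(name, "[NAME REDACTED]")
    for place in KNOWN_NEIGHBORHOODS:
        report = report.replace(place, "[LOCATION REDACTED]")
    return report
```

A charging attorney would then see only the redacted text in stage one, with the unredacted report unlocked in the "full review" stage after a preliminary charge is recorded.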

The tool, reported on earlier by the San Francisco Chronicle, was created by the Stanford Computational Policy Lab at no cost to Gascón's office.

The tool's lead developer, Stanford assistant professor Sharad Goel, said it'll "reduce unnecessary incarceration."

San Francisco barred its police officers from using facial recognition in May, citing risks to citizens' civil liberties.