A ProPublica investigation last October into Facebook's automated advertising platform demonstrated that people could target housing advertisements to -- and away from -- specific races and ethnicities. In response, the world's largest social network said it was revamping policies and tools to prevent that from happening.
But the social network's fix isn't working all the time, according to a report published on Thursday by ProPublica, which found Facebook is still letting discriminatory housing ads get through on its platform.
ProPublica said it bought dozens of housing ads on Facebook last week and asked the system to avoid showing them to certain audiences, including African Americans, people interested in wheelchair ramps, Jews and Spanish speakers. The report said every ad was approved by Facebook within minutes.
The groups excluded from seeing those ads are all protected under the Fair Housing Act, which prohibits discrimination against potential renters or buyers on the basis of "race, color, religion, sex, handicap, familial status, or national origin."
Facebook's ad policies have become a major point of scrutiny in the past year. Along with the concerns over fair housing, Facebook was also criticized because its ad platform allowed people to target automated interest categories including "Jew haters" and "why jews ruin the world."
The social network has also been in the hot seat over Russian trolls abusing it to meddle in last year's US election. Foreign agents used a combination of paid ads as well as unpaid "organic" posts to reach 126 million people on the platform. Facebook, along with rivals Google and Twitter, faced congressional hearings over the matter on Capitol Hill earlier this month.
In a statement, Ami Vora, vice president of product management at Facebook, blamed the incident on a "technical failure."
"This was a failure in our enforcement and we're disappointed that we fell short of our commitments," she wrote. "The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure."
In February, Facebook added additional safeguards to its ad platform, including more human reviewers and machine learning software, to try to stamp out abuse of its targeting tools. Vora also pointed out that the changes to the system flagged "millions" of ads that would have violated the policies.
She also said "tens of thousands" of advertisers have confirmed they are in compliance with Facebook's tighter restrictions. Currently, Facebook requires those compliance notifications from everyone who wants to post an ad for housing, employment or credit opportunities. The company says it will extend the requirement to all "advertisers who choose to exclude some users from seeing their ads on Facebook."
Rachel Goodman, staff attorney for the ACLU's Racial Justice Program, said in a series of tweets on Wednesday that the incident shows Facebook needs to be more transparent.
"We're glad FB continues to express desire to get this right, but this story highlights the need for greater transparency and accountability for FB," she wrote. "Had outside researchers been able to examine the system FB created to catch these ads, they could have spotted this problem and ended the mechanism for discrimination sooner."
First published November 21, 3:58 p.m. PT
Update, November 22, 9:35 a.m. PT: Adds comment from Rachel Goodman, staff attorney for the ACLU's Racial Justice Program.