Facebook whistleblower advises UK lawmakers on enforcing Online Safety Bill

If left to conduct their own risk assessments, social media companies may "bury their head in the sand," said Sophie Zhang in a parliamentary hearing on Monday.

Katie Collins Senior European Correspondent
The UK's Houses of Parliament as seen across Westminster Bridge. (Jorg Greuel via Getty Images)

Facebook whistleblower Sophie Zhang gave UK lawmakers her perspective on how best to implement the pending Online Safety Bill in a hearing in Parliament on Monday. The former Facebook data scientist used her inside knowledge of the company's moderation practices to answer questions from the Draft Online Safety Bill joint committee about how to ensure that tech companies comply with upcoming legislation that would see them more tightly regulated in the UK.

The Online Safety Bill, previously known as the Online Harms Bill, is a key piece of legislation that would place UK media watchdog Ofcom in charge of regulating social media platforms in the name of keeping users safe. Ofcom would have the power to fine tech companies £18 million ($25.3 million) or 10% of their annual revenue, whichever is higher, if they fail to remove harmful or illegal content, as well as to block sites and services. Senior managers at tech companies could even face criminal charges if those companies consistently fall short of their obligations.

Zhang told members of Parliament that if social media companies are left to conduct their own risk assessments, that may incentivize them not to acknowledge their own problems internally. "If you bury your head in the sand and pretend that the problem doesn't exist, then you don't have to report as much to Ofcom," she said via video link.

Zhang's appearance before the parliamentary committee comes exactly one week before another Facebook whistleblower, Frances Haugen, speaks with members of Parliament. Earlier this month, Haugen testified before the US Congress, alleging that Facebook's products "harm children, stoke division and weaken our democracy."

According to her whistleblower testimony, Zhang has direct experience of bringing problems to executives' attention only for them to turn a blind eye until it was too late.

Zhang previously revealed that in her role as a data scientist on Facebook's team investigating "fake engagement," she told the company that fake accounts were being used to distort the outcome of presidential elections in Honduras. On Monday she said she had personally briefed the company's vice president of integrity, Guy Rosen, on the matter, but it took the company nine months to launch an investigation and almost a year to take any action.

Zhang's suggestion for British lawmakers was that instead of leaving policing to platforms, Ofcom and other external experts should conduct their own experiments to test how effective social media companies are at keeping users safe. She also advised that platforms should provide better data access to trusted researchers for more independent verification. She noted that "this does create some privacy risks. Aleksandr Kogan" -- whose work with Cambridge Analytica led to a Facebook privacy scandal in 2018 -- "after all, was also a university researcher."

In a tweet after the evidence session, Zhang noted that she didn't have time to address the ongoing debate in the UK about banning end-to-end encryption (brought about by politicians' fears about the circulation of terrorist and child abuse content). She said she's "strongly opposed" to the idea, joining the chorus of privacy campaigners who argue that encryption keeps users safe and that introducing back doors would make people more vulnerable to hacks.