
Facebook, Google, Twitter and TikTok grilled by lawmakers over child safety failures

Social media companies faced five hours of questioning by a UK parliamentary committee examining the incoming Online Safety Bill.

Katie Collins Senior European Correspondent

All social media companies will be held accountable by the UK's incoming Online Safety Bill.


Representatives from Facebook, Google, Twitter and TikTok all descended on the UK's Parliament on Thursday to be questioned about incoming regulation that would hold them to account.

For five hours, members of Parliament questioned executives from the social media companies on their policies and tools, outlasting a similar US Senate subcommittee hearing on Tuesday, which ran only four hours. But both sets of lawmakers were interested in the same issues: safety, and in particular child safety. As was the case with the US senators, members of Parliament were highly skeptical about the various platforms' ability to keep young people out of harm's way.

"User safety has got lost in the corporate world somewhere," said MP Suzanne Webb. "It's been a bit of a theme of the day that everybody's got fabulous policies, but they're not landing in the lives of the users," added Baroness Kidron.

The UK's agenda differs from the one in the US: its committee is examining draft legislation, the Online Safety Bill, that will force companies to keep users safe and punish them if they don't.

The long-awaited legislation is among the first of its kind in the world and will see UK media regulator Ofcom appointed to hold social media companies accountable for safety failures. Ofcom will have the power to fine tech companies £18 million ($25.3 million) or 10% of their annual revenue, whichever is higher, if they fail to remove harmful or illegal content, as well as to block sites and services. Senior managers at tech companies could even face criminal charges if those companies consistently fall short of their obligations.

Executives from all of the companies were asked who would be responsible for submitting to Ofcom the risk assessments they will be required to complete when the bill eventually becomes law, but all struggled to answer.

When asked whether they had any concerns about implementing the bill, both Theo Bertram, TikTok's European director of government relations and public policy, and Nick Pickles, Twitter's senior director for public policy, said they were worried that bad actors may take advantage of the provisions built into the bill to protect politicians and journalists.

Google's vice president of public policy, Markham Erickson, encouraged the committee to tighten the definitions around online harms, while Pickles added that rules including the phrase "other illegal content" could be too open to interpretation.

Facebook's head of safety, Antigone Davis, said she was concerned about the need to do a risk assessment around every system or product change. "There's some danger that that will slow innovation, even, for example, in safety and security," she said. But she added that she did agree risk assessments would, broadly speaking, be a "valuable tool."

Platform-specific problems

Members of Parliament also took the opportunity to pose questions to the executives about the unique problems each of their platforms deals with.

For Twitter, the line of questioning centered on the racist abuse aimed at Black football players after the England team lost the Euro 2020 final this summer. TikTok was asked how it goes about limiting the spread of harmful viral content, such as the Tide Pod challenge, as well as how it prevents filter bubbles from forming. Lawmakers took Google to task over evidence suggesting that websites profit from hate speech through the company's programmatic advertising, and they raised the question of whether hate speech videos were being promoted to YouTube viewers by the company's algorithms.

But the most heated discussions and harshest criticisms were reserved for Facebook -- especially when it emerged Davis hadn't read the draft bill, which has been available since May. After Davis defended the actions Facebook was taking to reduce the number of girls exposed to content involving self-harm and suicidal ideation, member of Parliament John Nicholson responded: "Well, listen, it's not working, because the figures are too high."

Nicholson went on to reference research released earlier this week by children's charity the NSPCC, which revealed that UK police record 24 online grooming crimes every week in which offenders used Facebook-owned sites.

"All this rather suggest that Facebook is an abuse facilitator that only reacts when you're under threat, either from terrible publicity, or from companies, for example, like Apple , who threaten you financially," he said, referencing reports that Apple threatened to remove the Facebook app from its App Store due to human trafficking concerns.

MP Dean Russell said he was concerned that Facebook wasn't being supportive enough of the regulation, and was too preoccupied with pointing out how much money it had spent on improving safety and arguing that it was doing a great job. (In September the company said it had spent $13 billion on improving safety since 2016.)

"What we're hearing is where the massive gaps are and the harm that it's doing to young people in particular," he said. "I just wonder where we go from here, because do we need to strengthen this bill even further to force you to close those loopholes?"

Davis responded that the internal safety research the company conducted, which was recently leaked to the press, was all designed to make the platform better, and that the appointment of a regulator to hold the company accountable was very welcome.

Facebook didn't respond to a request for further comment.