Instagram, YouTube and Facebook could be fined millions over harmful content

The UK's crackdown on toxic online content will soon mean fines, restrictions or suspensions for tech companies that fail to take action.

Katie Collins Senior European Correspondent
Katie is a UK-based news reporter and features writer. Officially, she is CNET's European correspondent, covering tech policy and Big Tech in the EU and UK. Unofficially, she serves as CNET's Taylor Swift correspondent. You can also find her writing about tech for good, ethics and human rights, the climate crisis, robots, travel and digital culture. She was once described as a "living synth" by London's Evening Standard for having a microchip injected into her hand.

The UK is taking action against toxic online content.

Florian Gaertner/Getty Images

Instagram, Facebook and YouTube could face huge fines for failing to remove toxic online videos as part of a crackdown by the UK government on harmful social media content. The government said on Monday that under new rules due to be introduced next year, tech companies that fail to remove such content would face fines of up to 5% of their revenue, or even restriction or suspension of their services.

The government will appoint telecoms and broadcasting regulator Ofcom to ensure that social media platforms are preventing the spread of content that includes or promotes violence, child abuse or pornography. The watchdog will take charge of policing social media from Sept. 19, 2020, as an interim measure, ahead of a "super-regulator" being appointed to govern harmful content on the internet.

"These new rules are an important first step in regulating video-sharing online, and we'll work closely with the Government to implement them," said an Ofcom spokesman in a statement. "We also support plans to go further and legislate for a wider set of protections, including a duty of care for online companies towards their users."


Harmful online content has been around as long as the internet itself, but the growth of social media has increasingly raised questions about whether online platforms are doing enough to tackle the problem. In the UK, the conversation picked up pace following the 2017 suicide of 14-year-old Molly Russell, who had been using Instagram to view self-harm imagery.

In April of this year the government announced it would unleash the world's first independent regulator to keep social media companies in check, following the publication of the Digital, Culture, Media and Sport Committee's white paper on online dangers. The requirements will apply not only to tech titans such as YouTube, but also to file-hosting sites, online forums, messaging services and search engines. But with Silicon Valley giants boasting annual revenue in the multiple billions, it's those companies that stand to take the biggest hits if they are issued with fines.

According to the Telegraph, the new rules were quietly given the green light earlier this summer, and will be consulted on before being signed off by Parliament. Their implementation has been enabled by an EU directive that extends regulation of TV and video-on-demand services to also cover video-sharing platforms.

"We urge Government to take a balanced and proportionate approach and ensure that the implementation is consistent with its wider approach to Online Harms," said Antony Walker, deputy CEO of industry body TechUK, in a statement. "Key to achieving this will be clear and precise definitions across the board, and a proportionate sanctions and compliance regime."
