Tech companies that fail to keep children safe online could face huge fines in the UK

Senior managers could also face criminal charges and services could be blocked altogether if tech companies fail to comply with the country's Online Safety Bill.

Katie Collins Senior European Correspondent
Katie is a UK-based news reporter and features writer. Officially, she is CNET's European correspondent, covering tech policy and Big Tech in the EU and UK. Unofficially, she serves as CNET's Taylor Swift correspondent. You can also find her writing about tech for good, ethics and human rights, the climate crisis, robots, travel and digital culture. She was once described as a "living synth" by London's Evening Standard for having a microchip injected into her hand.

Tech companies will have an obligation to keep children safe.

Reggie Casagrande/Getty Images

If new legislation successfully comes into force in the UK, tech companies there could face major financial penalties for failing to keep children safe online or for not removing racist and other harmful content. A draft copy of the country's Online Safety Bill, which has been years in the making, was published by the government on Wednesday after being announced in the Queen's Speech. It's expected to come before Parliament in the next few months.

The Online Safety Bill, previously known as the Online Harms Bill, is a key piece of legislation that would place UK media watchdog Ofcom in charge of regulating tech companies in Britain. Ofcom would have the power to fine tech companies £18 million ($25.3 million) or 10% of their annual revenue, whichever is higher, if they fail to remove harmful or illegal content, as well as the power to block sites and services. Senior managers at tech companies could even face criminal charges if those companies consistently fall short of their obligations.

"It's time for tech companies to be held to account and to protect the British people from harm," UK Home Secretary Priti Patel said in a statement. "If they fail to do so, they will face penalties."

The bill would charge tech companies with a new duty of care to their users, which would require them to remove not only content that's illegal but also content that could be considered harmful, including information about self-harm and suicide, and misinformation. Companies would also be held responsible for fraudulent content posted by users -- in particular financial scams that aim to manipulate other users into parting with their money.

Special measures have been added to the bill to protect content posted by politicians and journalists, with the aim of safeguarding democracy and guarding against unnecessary censorship.

The decision to regulate tech companies and the internet more closely isn't unique to the UK; it reflects a broader international consensus that tech companies need to abide by rules and should be held accountable when they don't. Not everyone agrees on what that regulation should look like, however.

Civil liberties campaigners, along with other groups, say the government has missed the mark with the draft legislation presented this week. Matthew Lesh, head of research at think tank the Adam Smith Institute, tweeted that the Online Safety Bill was "shaping up to be a totally incoherent train wreck" by putting "extreme pressure" on tech companies to remove content, including lawful speech.

There's also concern that the bill doesn't go far enough to protect young people. The legislation "risks falling short" if it fails to tackle the complexities of online abuse, Peter Wanless, chief executive of the National Society for the Prevention of Cruelty to Children, said in a statement.

"Unless government stands firm on their promise to put child safety front and centre of the Bill, children will continue to be exposed to harm and sexual abuse in their everyday lives which could have been avoided," Wanless said.

Before the bill comes before Parliament, it'll be scrutinized by a joint committee of members, who will agree on a final version to be debated and voted on. Ofcom Chief Executive Melanie Dawes said in a statement that the regulator will shortly lay out how it expects the new rules to work in practice, "including the approach we'll take to secure greater accountability from tech platforms."