Facebook, Google should audit algorithms that boost fake news, says UK House of Lords
Democracy is under threat from a "pandemic of misinformation," warns a report published by a parliamentary committee on Monday.
Katie Collins, Senior European Correspondent
The global coronavirus pandemic has left governments and citizens reeling, not just from the devastating effects of the virus, but from the slew of misinformation that has accompanied it. How best to tackle the spread of false information is a subject of global debate, especially with regard to just how much responsibility the tech platforms hosting it bear.
In the UK, the House of Lords Democracy and Digital Technologies Committee published a report on Monday featuring 45 recommendations for the UK government to take action against the "pandemic of misinformation" and disinformation. Failing to take the threat seriously would undermine democracy, causing it to "decline into irrelevance," it says.
The report examines the ways false information spread during the virus outbreak, and warns that misinformation is a crisis "with roots that extend far deeper, and are likely to last far longer than COVID-19."
"We are living through a time in which trust is collapsing," said David Puttnam, the committee chair, in a statement. "People no longer have faith that they can rely on the information they receive or believe what they are told. That is absolutely corrosive for democracy."
Key among the recommendations are requests to hold big platforms, specifically Facebook and Google, accountable for their "black box" algorithms that control what content is shown to users. The companies' denial that their decisions in shaping and training these algorithms result in harm is "plain wrong," the report says.
Companies should be mandated to conduct audits of their algorithms, to show what steps they take to prevent them from discriminating, the report says. It also suggests increased transparency from digital platforms about content decisions so that people have a clear idea about the rules of online debate.
Facebook and Google didn't immediately respond to requests for comment.
Regulation: The Online Harms Bill
One of the report's primary recommendations is for the UK government to immediately publish its draft Online Harms Bill. The bill would regulate digital platforms like Google and Facebook, holding them accountable for harmful content and penalizing them when they fail to meet their obligations.
The progress of the bill has been slow: a white paper was published in May 2019, the government's initial response followed in February this year, and the full response, which was supposed to be published over the summer, has been delayed until the end of the year.
The government wasn't able to confirm to the committee whether or not it would bring a draft bill to Parliament by the end of 2021. As a result, the bill might not come into effect until late 2023, or even 2024, the report says. During a briefing ahead of the report's publication, Lord Puttnam described the delay as "inexcusable."
"The challenges are moving faster than the government and the gap is getting larger and larger," he said. "Far from catching up, we're actually slipping behind."
The report details the ways in which Ofcom, which would be the designated online harms regulator, should be able to hold the companies accountable under legislation. It should have the power to fine digital companies up to 4 percent of their global turnover or force ISP blocking of serial offenders, it says.
Online platforms are "not inherently ungovernable," the report says, urging the government not to "flinch in the face of the inevitable and powerful lobbying of big tech."
Lord Puttnam also pointed to Twitter's decision to take action against US President Donald Trump when he violated the platform's rules. That story is not over yet, he added, but he was optimistic the move might have a knock-on effect.
"There's a sense that these large companies look at each other and when one makes a sensible shift in a sensible direction, the others feel very constrained, very under pressure to make a similar shift," he said.
There have been many efforts across Europe and further afield to put pressure on big tech, not just to crack down on fake news, but also to pay more taxes and change their practices through antitrust decisions and privacy regulation. The success of these efforts so far is debatable, but Lord Puttnam and other committee members ultimately expressed their optimism that positive change would come to the tech industry.
If the government, which now has two months to respond to the report, embraces the committee's recommendations, the committee believes there is a chance that tech could support democracy and help restore public trust, instead of further undermining it.