With UK election, Facebook faces fake-news dilemma
Facebook has introduced new tools and suggestions on combating false information, but is it enough?

This is the second story in a two-part series that looks at the impact of social media and fake news on the UK elections.
Facebook had an undeniable impact on the US presidential election last year. Could it happen again in the UK?
With the UK general election underway, questions about how fake news may sway the results have once again emerged. Rightly or wrongly -- and at various points it's probably been both -- the debate has centred on Facebook.
It's yet another example of the uncomfortable role that Facebook plays as a media titan, a place where nearly 2 billion users check in at least once a month to share and consume information. It's a topic that's also a priority at the company itself, with CEO Mark Zuckerberg vowing earlier this month to better combat hoaxes and fake news.
British citizens will cast a vote on Thursday to elect their local member of parliament, and the leader of whichever party gains a majority of elected MPs will become prime minister and form a government.
What is Facebook doing?
Facebook has made clear strides to help users question and identify fake news. The social network has tested new tools and introduced features and advice. The company has struck partnerships with fact-checking organisations as well.
Facebook's ad campaign featured in major British newspapers.
After the French presidential elections earlier this year, the social network revealed it had deleted over 30,000 fake accounts that were contributing to the problem. Ahead of the UK election, it was so concerned about the problem that it took out newspaper adverts across the country to provide people with tips to help them recognise fake news when they see it.
It's a good show, but is it enough?
"My overall impression is that Facebook has begun to take these issues seriously and give them their attention," said Drew Margolin, professor of communications at Cornell University. "They are no longer ignoring the problem or hoping it goes away."
But Margolin is concerned that the social network still hasn't committed to a set of principles or ideals that articulate clear priorities, which means they are fighting the problem "Whac-A-Mole style."
"We hope that Facebook will also recognise how much more they can do to make it easier for users to spot false news online," Phoebe Arnold, head of communications and impact for Full Fact, a UK-based nonprofit and nonpartisan fact-checking organisation, said in a blog post. "The launch of this educational campaign [around the general election] is useful and timely but it should just be the start."
There are reports, including one from the Guardian, that some of Facebook's efforts might not be working, with warning tags not applied quickly or consistently enough to intervene before a story goes viral. There is even some suggestion that articles flagged as potentially fake end up attracting more traffic, although Facebook denied this to the Guardian, saying: "We have seen that a disputed flag does lead to a decrease in traffic and shares."
Facebook did not respond to multiple requests for comment.
But it's hard for researchers, even ones working closely with Facebook, to see if these interventions are having any meaningful effect.
A team at the Oxford Internet Institute has been trying to investigate the true extent of the fake news problem in order to find solutions, but its efforts have been hampered by the fact that Facebook's doors are closed to them. According to John Gallacher, a doctoral researcher in cybersecurity at the institute, Facebook offers "no access to data."
That's why the Oxford team has looked to Twitter for data instead.
Gallacher believes it might help social networks tackle the problem if they start seeing themselves as news outlets, given that they are "the biggest source of political news and information for young people."
Algorithms, ads and filter bubbles
Facebook frames itself as a nonpartisan arena that plays host to political debate, but thanks to filter bubbles, fake news and algorithms -- about which we know very little -- this is not quite the case.
Ads like this appear to be targeted at specific British voters.
Political campaigners also dabble in this game by investing heavily in targeted Facebook advertisements that place their messages in front of voters' eyes.
British citizens have little real insight into how they are being targeted, although a project called "Who Targets Me?" aims to change this. Via a Chrome extension, the project analyses the political ads you're shown on social media and works out why they appear in your feed.
"As predicted, we've seen a big increase in the amount of Facebook campaigning from all the parties, and it's also clear that all sides are devoting considerable resources to targeting adverts," said Jimmy Tidey, a PhD candidate at the Royal of College writing on the blog for "Who Targets Me?".
With political parties paying Facebook millions to target users with ads based on their data, it's hard for Facebook to argue that it plays a totally neutral role in the democratic process. The Information Commissioner's Office, the British data watchdog, last month launched an investigation into the use of data analytics of this kind for political purposes.
"This is a complex and rapidly evolving area of activity and the level of awareness among the public about how data analytics works, and how their personal data is collected, shared and used through such tools, is low," said Information Commissioner Elizabeth Denham. "It is important that there is greater and genuine transparency about the use of such techniques to ensure that people have control over their own data and the law is upheld."
The timing of the investigation is coincidental, but its closeness to the general election is particularly pertinent. Last weekend the Conservative party was accused of circulating fake news of its own via paid-for ads on Facebook. A video edited to show Labour leader Jeremy Corbyn refusing to condemn IRA bombings went viral after the Conservatives paid to promote it, even though he did condemn the attack.
Jeremy Corbyn, leader of the Labour party.
"The Conservatives are running a hateful campaign based on smears, innuendo and fake news," a spokesman for Corbyn told the Guardian.
The Conservatives refused to apologise, saying: "We encourage all readers to watch it and share with their family and friends."
It goes to show that Facebook's fake news problem is not just about news stories. It is about adverts too -- adverts that it directly profits from.
However transparent it may be to advertisers, the company could not be more opaque when it comes to sharing data about fake news. As such, no one -- not even those working in partnership with the social network to identify problematic news sources -- can know the real extent of the problem or how effectively it is being tackled.
And herein lies the problem for those wanting to help fix it.
"Until we know the scale of the problem on Facebook, there aren't really any solutions we can propose," said Monica Kaminska, also a doctoral researcher in cybersecurity on the Oxford team. "If you can't analyse it and you can't investigate it, it's difficult to say what the remedies are.
Read part 1: Fake news, shoo! The UK general election doesn't want you