
Wikipedia's disinformation task force braces for a high-stakes election

Facebook, Google and Twitter aren't the only ones preparing for the worst.

Richard Nieva Former senior reporter
Richard Nieva was a senior reporter for CNET News, focusing on Google and Yahoo. He previously worked for PandoDaily and Fortune Magazine, and his writing has appeared in The New York Times, on CNNMoney.com and on CJR.org.

Wikipedia has formed a disinformation task force for the US presidential election.

Getty Images

In the lead-up to Election Day, Facebook, Twitter and Google have faced the brunt of the pressure to fight disinformation. Another crucial website, however, is also girding itself for a possible battle with false information: Wikipedia. 

The user-generated encyclopedia is a common jumping-off point for people looking stuff up online, and researchers tout the site's surprising reliability. Billions of people use Wikipedia each month for everything from primers on complex topics to settling arguments with friends. 

The authority the site has built up over the years is an unexpected turn for a website once considered a joke. Google uses snippets from Wikipedia directly in its search results. YouTube, which Google owns, links to Wikipedia pages in its information panels, which are short blurbs that appear under false or misleading videos as part of YouTube's effort to debunk misinformation. (For election information panels, Google and YouTube will be using results from the Associated Press.) 

Wikipedia's vast reach is one reason it's so important for the online encyclopedia to get it right on Election Day, and in the days after. 

"If it's wrong on Wikipedia, it can be wrong everywhere," Ryan Merkley, chief of staff at the Wikimedia Foundation, the nonprofit behind Wikipedia, said in an interview. "That's an enormous public trust."

For Election Day, Wikipedia formed a disinformation task force. Dozens of people across the foundation's security, product, legal and communications teams have set up protections for the website, guiding the hundreds of thousands of unpaid volunteers who edit its pages. For example, the main page for the 2020 US election will be locked down so it can be edited only by people with accounts older than 30 days and who have more than 500 edits under their belt. 
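The eligibility rule described above can be sketched in a few lines of code. This is a hypothetical illustration of the thresholds the article mentions, not Wikipedia's actual software; the function and constant names are invented for clarity.

```python
from datetime import datetime, timedelta

# Illustrative thresholds from the article: an account must be at
# least 30 days old and have 500 or more edits to change the
# protected election page. (Names here are assumptions, not
# Wikipedia's real API.)
MIN_ACCOUNT_AGE = timedelta(days=30)
MIN_EDIT_COUNT = 500

def can_edit_protected_page(account_created: datetime,
                            edit_count: int,
                            now: datetime) -> bool:
    """Return True only if the account meets both thresholds."""
    old_enough = (now - account_created) >= MIN_ACCOUNT_AGE
    experienced = edit_count >= MIN_EDIT_COUNT
    return old_enough and experienced
```

A brand-new account, however active, would fail the age check, and a long-standing account with few edits would fail the edit-count check; both conditions must hold.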

The task force has held hours-long video sessions where team members role-play different disinformation scenarios and how they'd handle them. The goal is to ward off coordinated attacks. Merkley declined to discuss any of those scenarios for fear of giving bad actors a roadmap.

Merkley, formerly the CEO of Creative Commons and COO of Mozilla, said election disinformation was one of the first things he was asked to address when he took the chief of staff role last year. 

Wikipedia's plan comes as the biggest tech companies in the world fortify their platforms for the election, their biggest test since the 2016 vote. Silicon Valley companies have been eager to prove they can avoid the mistakes they made the last time around, when Russia exploited Google, Facebook and Twitter in an effort to influence the outcome of the contest. 

Because of the coronavirus pandemic and the increase of mail-in voting, experts anticipate delayed results as ballots are tabulated in the days following the contest. People sowing misinformation could take advantage of that delay to try to create confusion and spread conspiracy theories, experts say. Merkley said the Wikipedia task force is prepared to keep its protections in place until Inauguration Day, if necessary.

Major tech companies have made several changes to their platforms, hoping to stamp out election misinformation. A sampling of announcements made in recent days: YouTube will label election videos and search results with the warning, "Results may not be final." Twitter is making a similar change, reminding users in its app that results could be delayed. Instagram, owned by Facebook, said it would temporarily remove the Recent tab from hashtag pages, in an attempt to reduce false information that could pop up around the election.


Google and Facebook will also temporarily ban political advertisements after the polls close to try to prevent people from running ads falsely claiming victory. Twitter announced last year that it would ban political ads about candidates.

Merkley said one of Wikipedia's strengths is that it isn't ad-supported, so it doesn't have targeted algorithms. Asked if he thinks Google, Facebook and Twitter are prepared, he said he doesn't have a "window" into their operations. 

"They're certainly much, much larger and much better funded than we are, and have very different challenges than we do," he said. "They are very hopeful that on the other side of election night, they're not looking down the end of a congressional hearing asking, 'Why did you cause the outcome of the election to go this way or that way?'"

Asked about their election preparations, Facebook and Google pointed to blog posts on their efforts. Twitter said it has "made a number of significant policy, product and enforcement updates" to protect the integrity of election conversation on the platform.

Wikipedia meets regularly with those companies, other tech platforms and government agencies including the FBI to share knowledge about election security. "Those conversations have been helpful to get a bigger picture of what everyone is trying to get ready for," Merkley said.