Google removes QAnon apps from Play Store for violating terms

The search giant says it's combating "harmful information."

Richard Nieva

Google removed QAnon apps from its Play Store. (Getty)

Google said on Thursday it had removed three apps related to the QAnon conspiracy theory from its Play Store digital marketplace. 

The apps -- called QMAP, Q Alerts! and Q Alerts LITE -- were taken down for violating Google's policies against "harmful information," the company said. The removals were earlier reported by Media Matters for America, a progressive nonprofit.

The QAnon conspiracy theory has become popular among some supporters of President Donald Trump. One claim is that celebrities are involved in child sex trafficking and pedophilia. Another tenet is that Trump is working to take down the so-called "Deep State," a secret network that supposedly manipulates and controls government policy. The theory revolves around "Q," an anonymous user who began writing about the conspiracies on the imageboard site 4chan.

"Now more than ever, combating misinformation on the Play Store is a top priority for the team," a Google spokesman said in a statement. "When we find apps that violate Play policy by distributing misleading or harmful information, we remove them from the store."

Google didn't answer questions about how many times the apps had been downloaded. The search giant also didn't say how much money it might have made from the apps, some of which reportedly required a payment to download.

Google isn't the only tech giant that's removed QAnon content in recent weeks. Earlier this month, Facebook took down five pages dedicated to the conspiracy theory for violating rules against inauthentic behavior.

The search giant has more broadly been dealing with misinformation on its services. YouTube has been struggling to remove uploads of Plandemic, a 26-minute viral video that includes conspiracy theories about the novel coronavirus.