Robert Rodriguez/CNET

QAnon channels delete their own YouTube videos to evade punishment

YouTube has banned almost 40 channels that use the tactic. Disinformation experts say they haven't seen the ploy used before.

In the YouTube video, a cartoon man in a black suit and blue tie stands in the corner of the screen spewing QAnon theories. He's pointing to the rest of the screen like a professor at a blackboard. The animation is superimposed over a picture of former President Donald Trump speaking to camouflaged military personnel. 

The cartoon man repeats themes you'd hear from devotees of the baseless pro-Trump conspiracy theory, which contends the world is run by a powerful cabal of Democrats and Hollywood elites who traffic children and worship Satan. The video tries to untangle "Q drops," online breadcrumbs from Q, the anonymous person or group behind the conspiracy. It mentions corrupt politicians on both sides of the aisle as being "primary targets in DC" while Trump is kept safe. It falsely claims President Joe Biden was executed some time ago for "high treason" and that what we're seeing now isn't real.

"You're watching a scripted movie with actors, doubles and CGI," says the cartoon man, waving his hand around as he talks.

The video -- titled TRUMP HAS HAD MILITARY INTELLIGENCE INFILTRATED 4NTIFA, with Antifa deliberately misrendered -- was published on April 27 and has since been removed from Google-owned YouTube. 

The content likely runs afoul of YouTube's policy of banning QAnon videos that could incite violence and is now purged from the video-sharing site. It wasn't deleted, however, by YouTube. Instead, the channel, called It's Time to Tell The Truth, took it down on May 5, eight days after it was published. "Video unavailable," a message on the YouTube video player now reads. "This video has been removed by the uploader." 


This video was deleted voluntarily by its uploader in order to evade punishment from YouTube.

Screenshot by CNET

Disappearing videos are usually the realm of Snapchat or Instagram Stories, which self-destruct by design after 24 hours. The vanishing QAnon video is something different: a tactic used by peddlers of disinformation that's designed to help extremist channels evade YouTube's policies and escape violations that would get them shut down. The clip is just one of hundreds of deleted videos in a spam network of almost 40 QAnon and far-right YouTube channels examined by CNET. The channels post conspiracy content as part of a coordinated effort and appear to be operating from regions around the world while falsely posing as American.

"This is being done for the purpose of not being kicked off the platform," says Gideon Blocq, CEO of VineSight, a company that uses artificial intelligence to analyze viral disinformation spreading on social platforms. "It's to avoid detection."

YouTube hauled in $6 billion in the first quarter of the year, almost 11% of Alphabet's more than $55 billion in revenue. Alphabet is the parent company of Google, the search giant that's holding its annual I/O developer conference next week.

The channels were discovered by Noah Schechter, a Stanford University student who conducts open source research. The pages were likely designed to exploit the video platform's advertising program, which places commercials before and inside videos. The deletion tactic games YouTube's three-strikes rule by eliminating violative content before it can be found. Under YouTube's guidelines, a first strike typically comes with a one-week suspension that prohibits the posting of new content. A second strike within a 90-day window comes with a two-week suspension. A third strike results in a permanent ban.
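
To make the incentive concrete, here's a minimal sketch, in Python, of how a rolling 90-day strike window like the one described above could be modeled. The function, the channel dates and the exact penalty labels are illustrative assumptions drawn from this description, not YouTube's actual enforcement code.

```python
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # strikes count against a channel for roughly 90 days

def penalty(strike_dates, now):
    """Simplified model: penalty based on strikes issued within the rolling window."""
    active = [d for d in strike_dates if now - d <= STRIKE_WINDOW]
    if len(active) >= 3:
        return "permanent ban"
    if len(active) == 2:
        return "two-week upload suspension"
    if len(active) == 1:
        return "one-week upload suspension"
    return "no penalty"

# A video deleted by its uploader before reviewers flag it never earns a strike,
# so a channel running the deletion tactic can stay at "no penalty" indefinitely.
print(penalty([datetime(2021, 3, 1), datetime(2021, 4, 20)], now=datetime(2021, 5, 5)))
# -> two-week upload suspension
```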

"After careful review, we have terminated the channels flagged to us by CNET for violating our spam policies," a YouTube spokesman said in a statement.

The evasion tactics come at an uncomfortable time for YouTube, which has made a new push at enforcement amid accusations that the massive platform has contributed to radicalizing white supremacists and neo-Nazis. Last month, YouTube introduced a new metric called the violative view rate that measures how many views offending videos received before they were pulled down. The practice of bad actors deliberately deleting videos could mean there are violative views left unaccounted for.

Policing YouTube and other tech platforms is a game of whack-a-mole. Bad actors are continually honing tricks so their toxic posts slip past detection. Silicon Valley companies have faced a reckoning for disinformation and conspiracy content since the Capitol Hill insurrection, which was largely fomented and organized on social media. Google CEO Sundar Pichai, appearing virtually alongside Facebook's Mark Zuckerberg and Twitter's Jack Dorsey, was hauled before Congress in March to testify about the danger of misinformation on tech platforms. At the hearing, Pichai touted YouTube's enforcement against conspiracy content.

Evolving tactics

Schechter contacted CNET after reading our March 2020 investigation about a pro-Trump disinformation network that used novel tactics to dodge YouTube's security filters, such as hiring a voiceover actor or zooming in on images at different speeds to trip up YouTube's artificial intelligence safeguards. The new channels are vivid evidence that disinformation uploaders continue to evolve their evasion strategies and elude punishment from YouTube's automated systems or human content moderators. 

While disinformation experts say the tactic of systematically deleting videos is new, the playbook of peddling conspiracies for ad dollars isn't. During the 2016 US presidential election, a Macedonian village turned false news into a cottage industry, using Facebook and Google to post false stories with the goal of making money on ads.

In terms of reach, the QAnon channels that were removing their own videos weren't huge. Some of the most viewed videos got 150,000 views before they were deleted, while others got as few as 8,000 views. Since 2009, the It's Time to Tell the Truth channel has gotten 1.46 million views, though it's unclear how many of these were generated by QAnon or right-wing content. YouTube didn't answer questions about how much revenue the channels generated. Companies that were running ads against the videos included the sports brand Adidas, the guitar maker Fender and Google itself, which advertised its Webpass product for its Google Fiber internet service. 

Adidas and Fender didn't respond to requests for comment. As an advertiser, Google didn't comment.


A Google ad on a video from the spam network.

Screenshot by CNET

It's clear the channels were part of a coordinated effort. Most of them had the same aesthetic identity, with names like The Patriots Movement, America Wonderful Moments and Liberty to Freedom. Their cover photos and avatars featured American flags, bald eagles and Minutemen. On their About pages, each of the channels had the same text listed in the "description" section: the preamble of the United States Constitution. Many of the videos trumpeted QAnon content, but some also featured mainstream conservative refrains, like complaints against voting machines and calls for state ballot audits.

All of the videos followed the same basic format, and the production was shoddy. Each had a cartoon man or woman in the corner of the screen, usually dressed in business attire, mouthing the words of a podcast or some other ripped audio. While some started with introductions from a podcast host, others began with no context. Many of the videos ended mid-sentence. For example, the audio in the April 27 post about "primary targets in DC" was taken from a video series called Christian Patriot News, which normally shares its content on platforms including Brighteon and Gab, popular among right-wing crowds.

Christian Patriot News didn't respond to a request for comment.

Jared Holt, a resident fellow at the Atlantic Council's Digital Forensic Research Lab, a nonpartisan organization that combats and explains disinformation, says he's never seen channels pull down their own videos after letting them sit for a period. But there's some precedent for the behavior, he said. Some extremists on YouTube used to delete their live streams immediately after broadcast in order to avoid punishment from the platform. Then they would archive their streams elsewhere.

"They survived on YouTube for a long time, probably longer than they would have otherwise, by utilizing that tactic," he said. Holt hasn't previously written about the deleted live streams.


One of the channels in the spam network.

Screenshot by CNET

Removing its own videos may seem counterintuitive for a spam network, but the key to the operation is how little content is available on each channel. The pages uploaded about five new videos per day while deleting older videos after about a week. The channels typically had no more than about 30 videos in their libraries at any given time. The idea was to have a steady stream of new videos replace the ones that were deleted. Each channel in the network promoted the same 30 videos.
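
As a rough illustration of the rotation math, the short sketch below simulates an upload-and-delete schedule using the figures CNET observed: about five uploads a day, with deletions after about a week. It's a hypothetical model of how such a rotation plays out over a month, not a reconstruction of the network's actual tooling.

```python
from collections import deque

UPLOADS_PER_DAY = 5   # roughly five new videos posted each day
MAX_AGE_DAYS = 7      # older videos pulled after about a week

def simulate(days):
    """Run the upload-and-delete rotation and return (videos still live, total deleted)."""
    library = deque()   # upload day of each video still live
    deleted = 0
    for day in range(days):
        # Pull anything that has been up longer than the cutoff.
        while library and day - library[0] >= MAX_AGE_DAYS:
            library.popleft()
            deleted += 1
        # Post the day's batch of new videos.
        library.extend([day] * UPLOADS_PER_DAY)
    return len(library), deleted

print(simulate(30))  # settles at roughly 35 live videos, close to the ~30 CNET observed
```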

Deleting a video that's racking up views could mean leaving ad dollars on the table, but it comes down to a cost-benefit analysis for channel operators. The lost revenue may be worth it if it means avoiding the hassle of getting banned and then having to set up new channels and grow audiences, or having to repurpose old channels by hacking or buying logins, VineSight's Blocq says. An analysis performed by VineSight found that the spam network has deleted hundreds of videos to avoid detection.

It's unclear where the channels originated. Some contact information on the pages suggests ties to people in Vietnam, similar to the channels CNET investigated last year. At the time, YouTube confirmed those channels were operated from around the world, including prominently from Vietnam. The company didn't answer questions about the origins of the new channels or whether they were related to the ones from Vietnam.

Requests for comment were sent to an email address associated with the channels, but they yielded no response. 

'Critical questions'

For YouTube, the phenomenon of voluntarily deleted videos undermines a major transparency push the video site made last month: disclosing how many times people viewed content that breaks the platform's rules.

The metric itself is tricky to gauge. Instead of using absolute numbers, the company presents the figure only as a range of percentages. For example, in the fourth quarter of 2020, the violative view rate was 0.16% to 0.18%. Put another way, about 16 to 18 views out of every 10,000 views on the platform were of videos that should have been removed. YouTube said the figure was down from three years earlier, when the rate was 0.64% to 0.72%. 
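
For readers who want the arithmetic spelled out, the tiny helper below converts those published rate ranges into views per 10,000. The figures are the ones YouTube disclosed; the function name and rounding are illustrative.

```python
def views_per_10k(rate_pct):
    """Convert a violative view rate percentage into views per 10,000 total views."""
    return round(rate_pct / 100 * 10_000, 1)

# Q4 2020 range of 0.16%-0.18% works out to about 16 to 18 violative views per 10,000.
print(views_per_10k(0.16), views_per_10k(0.18))   # 16.0 18.0
# Three years earlier, 0.64%-0.72% was roughly 64 to 72 per 10,000.
print(views_per_10k(0.64), views_per_10k(0.72))   # 64.0 72.0
```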


The insurrection at the US Capitol was largely fomented on social platforms including YouTube.

Getty

But YouTube is gargantuan. The site sees more than 2 billion visitors a month, and 500 hours of video are uploaded every minute. YouTube doesn't disclose the number of videos it hosts on its platform, so without knowing that total, it's difficult to get a sense of how much banned content is being viewed. Because YouTube is so big, the deleted videos from the spam network do little to affect the violative view rate, but the tactic to game the system illustrates the opaqueness of the situation. 

When YouTube debuted the metric last month, Jennifer O'Connor, a product management director at the company's trust and safety department, explained the thinking around content that slips past its enforcement. 

Imagine a hypothetical example where a video that violates YouTube's policies has been on the platform for about 24 hours but has gotten only one view, she told reporters. Now compare that with a video that's been up for 10 hours but has thousands of views. "Clearly the latter video is having more of a negative impact on our users," she said. "We actually think the critical questions to answer -- and what we've looked at over the last several years -- are: How safe are the users on YouTube? How often are they getting exposed to this type of content?"

By systematically deleting their own videos, channel operators have found a way to make those questions harder to answer.