Deepfakes freak YouTubers out. VidCon offers a way to prepare

The convention for online creators kicked off with a panel on how to prepare for the deepfake dilemma.

Joan E. Solsman Former Senior Reporter

Fans at VidCon in 2016 celebrated online video star Joey Graceffa with masks of his face. 


Not far from a frenzy of screaming tweens chasing online video stars, in a glass building overlooking Disneyland, VidCon introduced a new celebrity to its pantheon: video forgeries known as deepfakes. But given the technology's ability to manipulate video so that anyone appears to do anything, people may be running from this star in fear rather than toward it. 

VidCon, which started in earnest Thursday, is the world's biggest conference of online video and digital creators, but it's best known for its swarms of fans. About 75,000 people were expected to attend this year, most of them preteens, teens and their parents overrunning the Anaheim, California, convention center in the hope of a close encounter with an online idol. Pockets of the immense crowds tend to spontaneously combust into a screaming mob at the hint of an influencer nearby.

But this year, at the moment the expo halls open, the first thing on VidCon's agenda is a presentation about the risks of deepfakes, part of an industry-focused track of panels and keynotes. 

"I fundamentally think this sort of synthetic video is going to be more and more important for online video," said Jim Louderback, the general manager of VidCon. "I remember when we used to believe in photographs. We're going to say in a couple of years, 'I remember when we used to believe in videos.'"

Computer manipulation of video has existed for decades, but deepfakes are video forgeries that can make people appear to be doing or saying things they never did, generated in an automated way with artificial intelligence. And rapid advances in deepfake technology mean these doctored clips are becoming both easier to make and harder to detect. At a time when even clunky video manipulations, like a slowed-down video of Nancy Pelosi, are effective tools of misinformation, the prospect of sophisticated deepfakes opens a mess of threats. 

VidCon's event, titled "Deepfakes and Synthetic Video: Don't Panic, Prepare," is aimed at the different audiences that attend VidCon, according to Sam Gregory, a program director at the human rights organization Witness, who is leading the presentation. 

One of the most damaging aspects of deepfakes currently isn't disinformation, he said, but how they allow people to become more entrenched in the notion that anything they don't want to believe is fake. And that's where VidCon's most vaunted attendees come in. 

"The creators, they have such a powerful role," Gregory said. "They can push back on the rhetoric, demystify and explain what's happening. And offer people practical things they could be doing in terms of media literacy." 

(One request from Gregory: Please nobody recommend blinking patterns as a deepfake detection trick anymore. In a testament to how quickly deepfake technology stays ahead of detection methods, algorithms adapted to that shortcoming within months of an academic paper pointing out blinking as a deepfake tell.)


For people who work at companies where deepfakes could go viral, Gregory wants to send a message that they need to figure out how they'll identify so-called synthetic media and how to make "moderation choices that don't backfire."

Though concerns about deepfakes lean toward apocalyptic, most deepfakes that are publicly seen today on massive platforms like YouTube, Reddit or Facebook are harmless goofs. One meme plasters actor Nicolas Cage's face into a potpourri of movies and shows he's never starred in, with him as Indiana Jones or every actor on Friends. Tweets sticking Steve Buscemi's mug on Jennifer Lawrence or clips that graft Elon Musk's face onto a baby go viral for their weirdness.

This period when viral deepfakes are generally harmless is when social platforms should be getting ahead of deepfakes as a threat, he said.

"There's a real opportunity to prepare here, and to prepare in a way that I don't think platforms did well in previous waves of misinformation," Gregory said. 

Neal Mohan, YouTube's product chief, pointed to the company's track record on manipulated video.

"You saw the policy that we have in place and how quickly it allowed us to deal with the Nancy Pelosi video," Mohan said in an interview. "It's a combination of making sure we have the right policy, community guidelines, and then obviously continuing to work to develop technology to detect those things."

VidCon's industry track, which caters to creators and to people who work across the online video industry, has more than 150 speakers and nearly 80 workshops this year. The deepfakes presentation, all told, represents about 2% of the overall industry programming, Louderback said. 

"I'm glad that we're leading with covering it this year," Louderback said. And in a burgeoning world of synthetic media, fully formed virtual influencers may be the next uncharted territory. "Who knows, maybe next year we'll bring in a whole bunch of deepfaked influencers and creators."

-Richard Nieva contributed to this story

This story originally published on July 10 at 5 a.m. PT. 
Update, 9 a.m. PT: Includes additional background. 
Update, July 12 at 5 a.m. PT: Includes a quote from YouTube.