Deepfakes may ruin the world. And they can come for you, too
But -- bright side! -- videos with your face on somebody else’s body usually aren't as tantalizing to bad guys as, say, creating political chaos.
Joan E. Solsman, Former Senior Reporter
Thinking about deepfakes tends to lead to philosophical head-scratchers. Here's one: Should you worry about your face being grafted into hard-core pornography if deepfakes are bent on sabotaging global power?
Deepfakes are video forgeries that make people appear to be doing or saying things they never did. Similar to the way Photoshop made doctoring images a breeze, deepfake software has made this kind of manipulated video not only accessible but also harder and harder to detect as fake.
And chances are, unless you've scrupulously kept your image off the internet, a deepfake starring you is possible today.
"All of those images that you put of yourself online have exposed you," said Hany Farid, a Dartmouth researcher who specializes in media forensics to root out things like deepfakes. "And you gave it up freely. Nobody forced you to do it, it wasn't even stolen from you -- you gave it up."
Deepfakes represent a different, more malicious kind of facial recognition. Traditional facial recognition already plays a growing role in your life: It's the technology that helps you find all the snapshots of a specific friend in Google Photos. But it also could scan your face at an airport or concert without your knowledge.
Unlike most facial recognition, which essentially turns the features of your face into a unique code for computers, deepfake software aims to mash up identity so well you don't even question its truth. It poses a nightmare scenario not just of ruining your life, but also of manipulating the public's perception of heads of state, powerful CEOs or political candidates.
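That "unique code" can be pictured as a vector of numbers, with recognition reduced to a nearest-neighbor lookup. Here's a toy sketch of the idea; real systems derive these vectors from a neural network, and every name and number below is invented purely for illustration:

```python
import math

# Hypothetical face "codes": real systems compute these vectors from
# face images with a neural network. These values are made up.
known_faces = {
    "alice": [0.1, 0.9, 0.3],
    "bob":   [0.8, 0.2, 0.5],
}

def identify(code, threshold=0.5):
    """Return the closest known identity, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, vec in known_faces.items():
        dist = math.dist(code, vec)  # Euclidean distance between codes
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

print(identify([0.12, 0.88, 0.31]))  # prints "alice"
```

Conventional recognition stops at that lookup: a face maps to a code, a code maps to a name. Deepfake software goes further, synthesizing new imagery instead of just matching it.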
That's why media forensics experts like Farid, and even researchers for the Pentagon, are racing to find methods to detect deepfakes. But Matt Turek, the manager of the Pentagon's deepfakes program at DARPA, has said it's much easier to make a convincing deepfake today than it is to detect one.
Deepfake technology figures out how various points of a human face interact on camera to convincingly fabricate a moving, speaking human -- think a photorealistic digital puppet. Artificial intelligence has fueled the rapid development of deepfakes, but it's a technology that must also be fed a diet of facial images to produce a video.
Unfortunately, the rise of deepfakes has arrived after more than a decade of online social sharing put almost everyone's face on the internet. But staying out of the public eye doesn't inoculate anyone from deepfakes, because in today's world, almost everyone is exposed.
Here's another fun deepfake head-scratcher: How bad does something have to be for Reddit and Pornhub both to ban it?
Deepfakes come in different shapes, sizes and degrees of stomach-sinking monstrosity. There are three main types, but the simplest and most widely known is a face swap.
But face swaps can be insidious too, like the face of an unwitting victim grafted onto graphic pornography. This weaponized form of face swap has violated famous women, like Scarlett Johansson and Gal Gadot. But it's also made victims of others who aren't celebrities. This involuntary pornography is what's prohibited by both Reddit and Pornhub.
The main asset that somebody needs to create a deepfake of you is a collection of a few hundred images of your face. Because deepfake software uses machine learning, it needs data sets of your face and another face in a destination video in order to swap them convincingly. That's one reason celebrities and public figures are such easy targets: The internet is packed with source photos and videos to build these image stockpiles.
Your best protection against becoming the star of a deepfake depends on the lengths to which you're willing to go to keep your image out of anyone else's hands -- including keeping it off the internet. (So, yeah, good luck with that.)
A few hundred images of you may sound like a lot to gather, but these don't need to be individual still shots or selfies. Multiple frames pulled from one or more videos can fill in the gaps. Every time an iPhone shot a video of you, it was capturing at least 30 frames per second.
And quality trumps quantity in a deepfake dataset. The ideal is a wide selection of facial images without blurring or obstructions, from a variety of angles and with a range of facial expressions. The quantity needed can decrease if the angles and facial expressions are well coordinated with the desired destination video.
These quirks of the data sets can yield bizarre advice about how to reduce your exposure. Wearing heavy makeup is good protection, especially if you change it up a lot.
Obstructions in front of a face, even brief ones, are particularly tricky for deepfake technology to work around. But the defenses that exploit that weakness aren't necessarily helpful. Farid once joked about a potential defensive strategy with a politician. "When you're talking with everyone around, every once in a while just wave your hand in front of your face to protect yourself," he recounted telling him. The politician indicated that wasn't a helpful idea.
Deepfake programs for face-swapping are readily available free online, making the technology relatively accessible for anyone with motivation, some simple technological know-how and a powerful computer.
Other types of deepfake are more sophisticated. Thankfully, that means you're less exposed to being a victim. Unfortunately, these are the ones that harbor more dangerous possibilities.
Comedian and filmmaker Jordan Peele publicized one of these kinds of deepfakes, called an impersonation or "puppet master" fake, by posing as President Barack Obama in a deepfake video a year ago. Peele impersonates Obama's voice, but the deepfake video synthesized a new Obama mouth and jaw to be consistent with the audio track.
However, the creation of that video actually required a reassuring degree of practiced skill. Peele's script was designed so his speech would match the ebb and flow of Obama's original head movements and gestures. And the success of the vocals was rooted in Peele's well-honed Obama impersonation.
But a more sophisticated technique, termed deep video portraits, is like deepfakes on steroids. While most manipulation in deepfake videos is limited to facial expressions, an international team of researchers transferred three-dimensional head position and rotation, eye gaze and eye blinking from one source actor to another target actor.
The result is a bit like a motion-capture sequence, without actually needing to capture motions when the videos were shot. With two ordinary videos, the researchers' program synchronized the movements, blinks and eye direction onto somebody else's face.
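The transfer can be pictured as copying pose and eye parameters from the source actor onto the target while leaving the target's identity untouched. This is only a conceptual sketch; the fields below are invented simplifications, not the researchers' actual model:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class FaceParams:
    # Simplified stand-ins for the parameters being transferred:
    identity: str          # whose face gets rendered (stays with the target)
    expression: str        # facial expression
    head_rotation: float   # degrees of head turn
    gaze: float            # eye direction
    blinking: bool         # eye state

def transfer(source: FaceParams, target: FaceParams) -> FaceParams:
    """Keep the target's identity; drive everything else from the source."""
    return replace(
        target,
        expression=source.expression,
        head_rotation=source.head_rotation,
        gaze=source.gaze,
        blinking=source.blinking,
    )
```

In this framing, an ordinary face swap only replaces `identity`, while a deep video portrait drives every other parameter from the source performance, which is why the result looks like motion capture without the capture rig.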
But the ultimate threat of deepfakes isn't how sophisticated they can get. It's how willingly the public will accept what's fake for the truth -- or believe somebody's false denial because who even knows what's true anymore?
"The public has to be aware that this stuff exists … but understand where we are with technology, what can and cannot be faked -- and just slow the hell down," Farid said. "People get outraged in a nanosecond and start going crazy. Everybody's got to just slow the fuck down."
Originally published April 3, 5 a.m. PT. Update, April 4: Adds background about DARPA.