Facebook Parent Meta Removes Deepfake Video of Ukrainian President Zelenskyy

The social network said the video violated its rules against manipulated media.

Queenie Wong

Facebook renamed itself Meta in October. 

James Martin/CNET

Facebook's parent company, Meta, said Wednesday that it removed a deepfake video of Ukrainian President Volodymyr Zelenskyy for violating the social network's rules against manipulated media. 

"It appeared on a reportedly compromised website and then started showing across the internet," said Nathaniel Gleicher, who heads security policy at Meta, in a tweet about the video. "We've quickly reviewed and removed this video for violating our policy against misleading manipulated media, and notified our peers at other platforms."

Deepfakes use artificial intelligence to create videos of people doing or saying things they didn't. Gleicher said the video of Zelenskyy made it appear the politician uttered a statement he actually didn't. Meta didn't identify the video or say what the statement was. CNET hasn't seen the video.

The removal of the video highlights the ongoing challenges social networks are facing as they try to curb the spread of misinformation after Russia's invasion of Ukraine. Some social media users have been posting old video footage on Twitter and Facebook to make it seem like they were recording events happening in real time. On TikTok, some people used old or out-of-context audio to create fake videos.

Meta doesn't always remove false content, instead directing users to authoritative sources or labeling misinformation. The social network partners with third-party fact-checkers to flag misinformation on its services, which also include the photo-and-video app Instagram. Meta says it'll take down misinformation if there's a risk of physical harm or if the media is highly deceptive. 

"We remove this content because it can go viral quickly and experts advise that false beliefs regarding manipulated media often cannot be corrected through further discourse," Meta's rules against manipulated media says.