The Apollo 11 moon landing on July 20, 1969, was a landmark moment in space history. But what if the astronauts died during their mission to the moon, and President Richard Nixon had to deliver tragic news on TV to the American viewing public?
In this disturbingly real deepfake video, President Nixon breaks the news that the Apollo 11 mission failed and the astronauts died on the moon. Deepfakes are video forgeries that make people appear to be doing or saying things they aren't. Deepfake software has made manipulated videos easy to produce and increasingly hard to detect as fake.
"Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace," Nixon says in the deepfake video referring to astronauts Neil Armstrong, Buzz Aldrin and Michael Collins.
It took half a year for Massachusetts Institute of Technology AI experts to create the convincing seven-minute deepfake video, which mixes actual NASA footage with Nixon delivering a tragic speech as though Apollo 11 had not succeeded in its mission to the moon.
Artificial intelligence "deep-learning" technology was used to make Nixon's voice and facial movements convincing. The contingency speech (which can be found in the National Archives) was read aloud by an actor.
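The article doesn't detail MIT's pipeline, but many face-swap deepfakes rest on a simple idea: a shared encoder learns a common face representation, while each identity gets its own decoder; swapping means encoding a frame of one person and decoding it with the other person's decoder. The toy NumPy sketch below (hypothetical layer sizes, untrained random weights standing in for learned parameters) illustrates only the data flow, not a working deepfake:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(in_dim, out_dim):
    # Random weights stand in for parameters a real system would train.
    return rng.standard_normal((in_dim, out_dim)) * 0.1

encoder = layer(64, 16)     # shared encoder: 64-value "frame" -> 16-d latent
decoder_a = layer(16, 64)   # decoder trained only on person A's faces
decoder_b = layer(16, 64)   # decoder trained only on person B's faces

frame_of_a = rng.standard_normal(64)   # a frame showing person A

latent = frame_of_a @ encoder     # capture A's expression and pose
swapped = latent @ decoder_b      # render it as person B's face

print(swapped.shape)
```

In a real system the encoder and decoders are deep convolutional networks trained on thousands of frames, and an analogous encoder/decoder scheme can map an actor's recorded speech onto a target voice.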
MIT's Center for Advanced Virtuality created its new project called In Event of Moon Disaster -- which launched on Monday -- to show people the dangerous influence deepfake videos can have on an unsuspecting public.
This marks the first time the Nixon Apollo 11 deepfake video is being presented to the public in its entirety following a physical art installation at MIT in fall 2019.
"In Event of Moon Disaster is an immersive art project inviting you into an alternative history, asking us all to consider how new technologies can bend, redirect and obfuscate the truth around us," the project's website said. "By creating this alternative history the project explores the influence and pervasiveness of misinformation and deepfake technologies in our contemporary society."
In Event of Moon Disaster aims not only to help people better understand deepfakes, but also to explain how deepfakes are made and how they work; how to spot a deepfake; their potential use and misuse; and what is being done to combat deepfakes and misinformation.
This project is supported by a grant from Mozilla's Creative Media Awards, which build on Mozilla's mission to realize more trustworthy AI in consumer technology.