
Politics

California laws seek to crack down on deepfakes in politics and porn

The laws go after malicious video forgeries.


Deepfakes have been known to make politicians appear to do and say unusual things. 

ALEXANDRA ROBINSON/AFP/Getty Images

While some deepfakes are silly and fun, others are misleading and even abusive. Two new California laws aim to put a stop to these more nefarious video forgeries. 

California Gov. Gavin Newsom on Thursday signed AB 730, which makes it illegal to distribute manipulated videos that aim to discredit a political candidate and deceive voters within 60 days of an election. He also signed AB 602, which gives Californians the right to sue someone who creates deepfakes that place them in pornographic material without consent. 


Deepfakes are video forgeries that make people appear to do or say things they didn't. They use facial recognition technology to swap identities so convincingly that viewers may not question what they're seeing. Deepfake software has been used to graft the faces of women -- celebrities like Scarlett Johansson as well as everyday people with photos online -- onto graphic pornography. 

A study released on Monday by cybersecurity firm Deeptrace found that 96% of the more than 14,000 deepfake videos identified online were pornographic in nature. All of the subjects of these pornographic deepfakes were women, often popular actresses and musicians, according to the study.

Deepfakes also gained attention earlier this year after videos of House Speaker Nancy Pelosi, doctored to make it appear she was drunkenly slurring her words during a speech, spread on social media. These types of video manipulations -- which remove context from a video or make other malicious changes -- are sometimes called shallowfakes. California Assemblymember Marc Berman, who authored AB 730 and AB 602, said deepfakes and altered videos like the one of Pelosi can mislead voters and disrupt elections. 

"Voters have a right to know when video, audio, and images that they are being shown, to try to influence their vote in an upcoming election, have been manipulated and do not represent reality," Berman said in a release. "In the context of elections, the ability to attribute speech or conduct to a candidate that is false – that never happened – makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters."

Originally published Oct. 4.
Update, Oct. 7: Adds details from Deeptrace study.