
The FCC Just Made Using AI-Faked Robocall Voices Illegal: Here's How to Report These Calls

Singer Taylor Swift and President Joe Biden's voices are among those that have been impersonated by scammers.

Gael Cooper

The FCC says impersonating voices in robocalls is illegal.

James Martin/CNET

In January, a robocall impersonated President Joe Biden and told Democrats not to vote in the New Hampshire primary. Now the Federal Communications Commission has ruled that calls made with artificial intelligence-generated voices are illegal, giving states a new way to go after the people who create such calls. The ruling takes effect immediately.

The fake Biden robocalls originated with a Texas company, the New Hampshire attorney general said on Feb. 6, opening a criminal investigation.

"Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities and misinform voters. We're putting the fraudsters behind these robocalls on notice," said FCC Chairwoman Jessica Rosenworcel in a statement Thursday. "State Attorneys General will now have new tools to crack down on these scams and ensure the public is protected from fraud and misinformation." 

Since the 2022 release of ChatGPT, a chatbot from OpenAI, the capabilities of artificial intelligence have been making headlines. The ability to copy the voices of celebrities, politicians and others has drawn special attention, especially because of the many ways such familiar voices can be misused. A TikTok user called ghostwriter stirred controversy in 2023 with a song called "Heart on My Sleeve," which used AI-created vocals to imitate musicians Drake and The Weeknd. And an AI-created voice and likeness of singer Taylor Swift made it seem that the Grammy Award-winning musician was endorsing Le Creuset cookware, though neither Le Creuset nor the real Swift was involved. 

State attorneys general could already target the outcome of a scam call that used AI voices, but now the act of using AI-generated voices in robocalls is itself illegal. The FCC statement says this ruling should give states more legal backing to pursue cases against fraudsters. 

AI could help stop robocalls from getting through

The FCC has been working on this for months. In November, leaning on the Telephone Consumer Protection Act, the primary law used to limit junk phone calls, the commission began researching how AI was mimicking familiar voices for scam robocalls. That act gives the FCC the authority to fine robocallers and block calls from carriers facilitating illegal calls, and it allows consumers or organizations to sue robocallers.

The release from the FCC says the commission also wants to "turn [AI] into a force for good," seeking to use artificial intelligence for help with pattern recognition that could lead the tech to recognize illegal calls and prevent them from reaching consumers. 

A coalition of 26 state attorneys general recently wrote to the FCC supporting this approach. By taking this step, the FCC is building on its work to establish partnerships with law enforcement agencies across the country to identify and eliminate illegal robocalls. These partnerships can provide critical resources for building cases and coordinating efforts to protect consumers and businesses nationwide. The FCC offers partner states not only the expertise of its enforcement staff but also important resources and remedies to support state investigations.

How to report robocalls

Unwanted robocalls are the FCC's No. 1 complaint, and its top consumer protection priority, the commission says in a statement on its site.

Consumers receiving such calls, whether the calls use AI-created voices or not, can use the FCC's online form to file a complaint. You'll need to describe the call and give your email address. A representative for the FCC confirmed that this is the recommended way to bring such calls to the agency's attention.

There are also a variety of call-blocking and labeling tools that consumers can use to ensure they don't answer such calls -- or receive them in the first place. The FCC has posted a guide to such tools, noting that FCC rules let many phone companies automatically enroll consumers in call-blocking services. Consumers who are concerned they'll miss wanted calls can opt out. Phone companies also often offer call labeling, where a call comes through with "scam likely" or "spam" listed on the phone's display.

CNET also has a guide to stopping this kind of call. According to Robokiller, a company that specializes in blocking spam calls and robocalls, Americans received 3.34 billion robocalls in December, which averages out to 17 spam calls for every American.