A Federal Communications Commission ruling issued Thursday makes it illegal for robocallers to use AI-generated voices.
In a unanimous decision, the FCC extended the Telephone Consumer Protection Act (TCPA) to cover robocall scams that use AI-generated voice clones. The rules take effect immediately and allow the agency to fine companies that make such calls or to block the providers that carry them.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters,” FCC Chairwoman Jessica Rosenworcel said in a statement Thursday. “We're putting the fraudsters behind these robocalls on notice.”
The move comes days after the FCC and New Hampshire Attorney General John Formella identified Life Corporation as the company behind mysterious robocalls that imitated President Joe Biden ahead of last month's state primary. At a press conference Tuesday, Formella said his office has opened a criminal investigation into the company and its owner, Walter Monk.
Last week, the FCC first announced its plan to update the TCPA to outlaw AI-generated robocall scams. The agency has used the law in the past to pursue nuisance callers, including conservative activists and pranksters Jacob Wohl and Jack Burkman. In 2021, the FCC fined the pair more than $5 million for running a massive robocall scheme aimed at discouraging voters from voting by mail in the 2020 election.
“This generative AI technology is new and presents many challenges, but we already have many of the tools we need to meet those challenges,” Nicholas Garcia, policy counsel at Public Knowledge, told WIRED. “We can apply existing laws like the TCPA, and regulators like the FCC have the flexibility and expertise to address these threats in real time.”