The chair of the Federal Communications Commission yesterday proposed classifying calls that use AI-generated voices as "artificial" under the Telephone Consumer Protection Act, a move that would make the voice-cloning technology used in robocall scams illegal. The proposal is likely a reaction to a robocall made last week to thousands of New Hampshire residents on the eve of that state's primary, which used an AI-generated voice that sounded like President Joe Biden.
Notably, the release announcing the proposal didn't specify how the FCC would take this step – through a declaratory ruling or some other form of rulemaking. However it chooses to proceed, the intention, according to the FCC, is to give state attorneys general more resources to go after scammers and hold them accountable.
In the release, the FCC noted that it recently launched a Notice of Inquiry seeking information and comment on how artificial intelligence is affecting robocalls and robotexts. One goal of that inquiry was to determine how AI fits within the FCC's responsibilities under the TCPA.
“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate. No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls,” said FCC Chairwoman Jessica Rosenworcel in a statement. “That’s why the FCC is taking steps to recognize this emerging technology as illegal under existing law, giving our partners at State Attorneys General offices across the country new tools they can use to crack down on these scams and protect consumers.”