A Democratic political consultant has been indicted over his role in launching artificial intelligence (AI)-generated robocalls imitating United States President Joe Biden.
According to the New Hampshire Attorney General’s Office, New Orleans-based political consultant Steven Kramer, who was working for rival Democratic candidate Dean Phillips, was indicted on May 23 for impersonating a candidate during New Hampshire’s Democratic primary election.
He allegedly used AI to generate and send thousands of robocalls imitating President Biden’s voice to New Hampshire residents, urging them not to vote.
The spoofed calls urged recipients to “save [their] vote for the November election,” adding, “Your vote makes a difference in November, not this Tuesday.”
Attorney General John Formella brought 13 felony voter suppression charges and 13 misdemeanor impersonation charges against the 54-year-old.
The Federal Communications Commission proposed fining Kramer $6 million, saying that the deepfake robocalls violated caller ID rules.
Lingo Telecom, the phone company found to have transmitted the calls, is now facing a proposed FCC fine of $2 million for “incorrectly labeling [the calls] with the highest level of caller ID attestation and making it less likely that other providers could detect the calls as potentially spoofed.”
“I hope that our respective enforcement actions send a strong deterrent signal to anyone who might consider interfering with elections, whether through the use of artificial intelligence or otherwise,” said Attorney General Formella.
Kramer defended his actions in an interview with NBC News in February, claiming he had planned the fake robocalls from the start as an act of civil disobedience to call attention to the dangers of AI in politics.
“This is a way for me to make a difference, and I have,” he said at the time, adding, “For $500, I got about $5 million worth of action, whether that be media attention or regulatory action.”
There have been growing concerns about the potential for AI-generated content to mislead voters ahead of the 2024 elections.
The Biden campaign said it has anticipated threats such as malicious AI-generated deepfakes and has “assembled an interdepartmental team to prepare for the potential effects of AI this election,” according to Reuters.
In March, Cointelegraph reported on the rise of AI-generated deepfakes this election season, noting that it would be crucial for voters to learn to spot them.
In February, twenty of the largest AI tech companies pledged to prevent their software from influencing elections.