- Russia has been using AI to influence the U.S. presidential election, favoring Donald Trump over Kamala Harris.
- Russia is generating more AI content to influence the election than any other country, according to an official from the Office of the Director of National Intelligence.
- China and Iran are also using AI in influence operations; China is not targeting U.S. election outcomes, while Iranian content has targeted American voters on polarizing issues and the presidential candidates.
- The use of AI in election interference represents a significant escalation in sophistication, underscoring the need for vigilance and proactive measures to safeguard democratic processes.
A U.S. intelligence official has said that Russia has been using artificial intelligence (AI) to influence the U.S. presidential election.
The official, speaking on condition of anonymity, said the AI content generated by Russia was part of a broader effort to boost the candidacy of Republican Donald Trump over Democrat Kamala Harris. The remarks came during a briefing on the alleged use of AI by Russia and other countries to influence the November 5 vote.
The AI content produced by Moscow is consistent with Russia's broader efforts to boost Trump's candidacy and denigrate Harris and the Democratic Party, including through conspiratorial narratives. The Russian embassy in Washington did not immediately respond to a request for comment. Russia has previously denied interfering in the U.S. election.
Artificial intelligence systems, in their various forms, learn from existing data and, using that training, can generate new content such as text, images, and video that appears to have been produced by humans.
Russia's Sophistication in AI Influence
The official from the Office of the Director of National Intelligence (ODNI) said that Russia is generating more AI content to influence the November election than any other country, though the official did not quantify the volume of that content.
The official added that Russia is a much more sophisticated actor than other countries, with a better understanding of how U.S. elections work and which audiences to target. Asked how Russia was disseminating the AI content, the official pointed to a July 9 Justice Department announcement of the disruption of an alleged Moscow-backed operation that used AI-enhanced social media accounts to spread pro-Kremlin messages in the United States and elsewhere.
In one instance, Russian influence actors staged a widely reported video in which a woman claimed she had been the victim of a hit-and-run car accident involving Harris. The video was staged rather than produced through AI, according to the official. Microsoft confirmed last week that its research showed the video was the work of a covert Russian disinformation operation.
China and Iran's AI Influence Operations
China, by contrast, is using AI in broader influence operations that seek to shape global views of China and amplify divisive U.S. political issues, rather than to sway the outcome of the U.S. election, the official said. The official added that China has not yet been seen using AI for any operations specifically targeting U.S. election outcomes.
Iranian influence actors have also used AI to help generate posts for social media and write inauthentic news articles for websites that claim to be real news sites, the official said. The content created by the Iranian actors is in English and Spanish. It has targeted American voters across the political spectrum on polarizing issues such as Israel and the conflict in Gaza, and on the presidential candidates.
The Iranian mission to the United Nations did not immediately respond to a request for comment. Iran has previously denied interfering in the U.S. vote.
This is not the first time that foreign powers have been accused of interfering in U.S. elections. In 2016, U.S. intelligence agencies concluded that Russia interfered in the presidential election with the goal of helping then-candidate Donald Trump win. This interference was primarily through a campaign of disinformation and propaganda, much of it disseminated through social media platforms.
The use of AI in such influence operations represents a significant escalation in the sophistication of these efforts. AI can generate content that appears to be human-produced, making it more difficult to detect and counter. This development underscores the importance of ongoing efforts to secure the integrity of U.S. elections and to educate the public about the potential for foreign influence operations.