Elections have always been a prime target for influence operations and disinformation campaigns, and the 2024 US presidential election was no different. Throughout the year leading up to the election, multiple nation-state campaigns were observed attempting to influence public opinion and sway voters to align with their agendas. These false narratives were further amplified using artificial intelligence (AI), with many campaigns relying on AI to push messaging on divisive issues with minimal effort on the part of the perpetrators themselves. Russia, China, and Iran in particular engaged in influence operations aimed at shaping the narrative and outcome of the election, adapting narratives to resonate with US audiences, exploiting existing divisions, and leveraging social media platforms to spread disinformation. Though the election is now over, with Donald Trump set to become the next US president, further disinformation campaigns are anticipated in the months leading up to the inauguration, and likely beyond.

Disinformation campaigns linked to election interference

Russia’s Doppelganger campaign

While the Russian influence campaign Doppelganger first emerged in 2022 against the backdrop of the Russia-Ukraine war, its narratives have continuously changed over the past two years to adapt to other global events. In 2024 in particular, the campaign shifted its focus to elections taking place globally, beginning with propaganda and disinformation spread through news articles on current socio-economic and geopolitical issues, aimed at influencing public opinion ahead of the European Parliament election. It comes as no surprise, then, that the campaign also leveraged the US presidential election to spread false narratives.

Doppelganger has employed various techniques to distribute disinformation, including typosquatted domains that impersonate legitimate websites, especially news outlets, fake social media accounts that disseminate AI-written posts, deepfake videos, and more. In some instances, Doppelganger was also observed amplifying messages from CopyCop, another Russia-linked influence operation. By September 2024, the US Department of Justice (DOJ) had seized at least 32 internet domains linked to the campaign that were used to interfere in and influence the outcome of the US presidential election. The DOJ also indicted two employees of the Russian state-owned media outlet RT for their role in creating and distributing content via an online content creation company in Tennessee, while keeping the connection to Russia hidden. Qurium researchers had previously identified that Doppelganger’s operators worked in close association with cybercriminals and affiliate advertisement networks based in Europe, a discovery that left Doppelganger struggling to maintain its operations. However, Doppelganger appears to be adapting its techniques, with continued activity observed to date. This includes pushing fabricated claims attributed to prominent disinformation researchers, such as Eliot Higgins and Christo Grozev, in an effort to cast doubt on Russia’s involvement in such influence campaigns. This shift in behaviour may be a reaction to the September US DOJ affidavit that implicated the Social Design Agency firm in operating the Doppelganger influence campaign.
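
The typosquatting element lends itself to a simple illustration. The Python sketch below flags candidate domains that sit within a small edit distance of known news outlet domains, one basic way lookalike impersonation domains can be surfaced; the outlet watchlist, threshold, and example domain are illustrative assumptions, not details taken from the reporting above.

```python
# Minimal sketch: surfacing lookalike (typosquatted) domains by edit distance.
# The watchlist, threshold, and example domain below are illustrative
# assumptions, not taken from any of the research cited in this article.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,               # deletion
                curr[j - 1] + 1,           # insertion
                prev[j - 1] + (ca != cb),  # substitution (free if chars match)
            ))
        prev = curr
    return prev[-1]

# Hypothetical watchlist of legitimate outlets to compare candidates against
LEGITIMATE = ["washingtonpost.com", "spiegel.de", "lemonde.fr"]

def flag_lookalikes(candidate: str, max_distance: int = 2) -> list[str]:
    """Return watchlist domains the candidate closely imitates.

    A distance of 0 is an exact match (the legitimate domain itself),
    so only near misses are flagged.
    """
    name = candidate.lower().strip()
    return [
        legit for legit in LEGITIMATE
        if 0 < edit_distance(name, legit) <= max_distance
    ]

print(flag_lookalikes("washingtonpost.pm"))  # -> ['washingtonpost.com']
```

A real pipeline would go further, for example checking newly registered domains in bulk and weighting character swaps that are visually confusable, but the core idea of measuring proximity to trusted brands is the same.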

Iranian hack-and-leak operation

In or around May 2024, Iranian-linked threat actors began targeting, and successfully gained access to, personal accounts belonging to persons associated with Donald Trump’s US presidential campaign, including campaign officials. This access was then used to steal non-public campaign documents and emails. The activity expanded in June 2024, with the hackers engaging in a ‘hack-and-leak’ operation in which they sought to weaponise the stolen campaign material by leaking it, under the persona ‘Robert’, to members of the media and to individuals associated with Joe Biden’s re-election campaign. In August 2024, POLITICO reported having received emails from an anonymous account containing documents from inside former President Donald Trump’s campaign. Trump’s campaign confirmed that its internal communications had been hacked, pointing to a recent Microsoft report and blaming ‘foreign sources hostile to the United States.’

The Microsoft report identified activity linked to groups connected to the Iranian government that suggests an attempt to influence the 2024 US presidential election. Two types of activity were observed: the first concerned influence campaigns designed to stoke controversy and sway voters, while the second aimed to gain intelligence on political campaigns in order to influence future elections. Multiple Iranian Revolutionary Guard Corps (IRGC)-affiliated actors with links to election interference were identified, including Peach Sandstorm, Lemon Sandstorm, Mint Sandstorm, and Cotton Sandstorm. Google researchers observed similar activity, attributing the campaign to the Iranian government-backed threat actor APT42.

On September 27th, 2024, the US Department of Justice announced indictments against three Iranian nationals who are employees of the IRGC and have been linked to this hack-and-leak campaign. The activity was reportedly part of Iran’s ongoing efforts to incite discord, erode confidence in US electoral processes, and acquire information that advances the IRGC’s malign cyber activities. Despite the indictments, the ‘Robert’ persona has continued contacting news publications with the allegedly stolen documents, some of which have since published parts of the leaked material after verifying its authenticity.

China’s influence operation Spamouflage

The Spamouflage influence operation has been active since at least 2019, possibly longer, and is widely considered to be conducted by the Chinese government. Similar to Doppelganger, the campaign adapts its themes to ongoing geopolitical events, with the aim of pushing narratives that align with the Chinese government’s goals. Spamouflage, also often tracked as Dragonbridge or Storm-1376, shifted its focus to the US presidential election in mid-2023.

Graphika researchers identified Spamouflage using social media accounts on platforms like X and TikTok that posed as US citizens, soldiers, or US-focused peace, human rights, and information integrity advocates frustrated by US politics and the West. The accounts seeded and amplified content denigrating Democratic and Republican candidates, casting doubt on the legitimacy of the US electoral process, and spreading divisive narratives about sensitive social issues such as gun control, homelessness, drug abuse, racial inequality, and the Israel-Hamas conflict. Some of the content appears to have been generated by AI and has targeted Joe Biden, Donald Trump, and Kamala Harris, a trend also observed by other researchers in the run-up to the election. For example, Microsoft observed Spamouflage using AI-generated news broadcasts and AI-manipulated imagery to fuel conspiracy theories.

While the presidential candidates were the most likely targets of the Spamouflage campaign, other political figures have also been targeted, including US Senator Marco Rubio. Clemson University researchers identified that Spamouflage had been targeting Rubio with fake news since his re-election bid in 2022. Following Rubio’s re-election in November 2022, the activity temporarily paused before picking up again in 2024. Hacked accounts have been used to share images and paragraphs denigrating Rubio, while Medium has been used to amplify the false narratives posted by these accounts, with the content of a notably higher quality than that employed in 2022.

Disinformation and election interference trends

Use of AI in influence operations

All three state-sponsored campaigns have one thing in common: continuous development and adaptation, not just of their narratives but also of the techniques they leverage. Just as enterprises globally are adapting to the increasing sophistication of AI, so are nation-state actors. According to the US Foreign Malign Influence Center, foreign actors are expected to continue conducting influence operations, particularly through the use of AI-generated or AI-enhanced social media posts. Examples of AI-created content include Russia-linked deepfake videos about Kamala Harris and Tim Walz intended to discredit the two and stir controversy around their campaign. OpenAI also identified and disabled a cluster of ChatGPT accounts associated with an influence operation attributed to the Iranian group Storm-2035, which generated content on multiple topics, including the US presidential election. As discussed previously, the China-linked Spamouflage operation has also increasingly relied on AI-created content to disseminate its narratives.

Adapting narratives

Threat actors have also regularly adapted their campaigns to fit certain narratives, with many disinformation campaigns quick to seize on recent geopolitical events to drive new messaging. An example of this is how quickly misinformation and disinformation spread following the assassination attempt on Donald Trump by Thomas Matthew Crooks. This included conspiracy theories claiming the incident was staged, posts speculating that President Joe Biden, the Deep State, Antifa activists, or Ukraine were responsible for the attack, and posts alleging that the shooter was still alive.

In another instance, researchers at the Institute for Strategic Dialogue observed Russian state-affiliated media outlets and social media accounts exploiting Hurricanes Milton and Helene to stoke discontent in the US. This involved promoting narratives that the Biden administration was diverting funds to Ukraine at the expense of US disaster relief, and that the government’s hurricane response was symbolic of the administration’s incompetence. Other common themes leveraged by multiple influence operations include the ongoing Israel-Hamas war and the resulting protests at US universities.

Voter fraud claims

Although Donald Trump and his supporters, including Elon Musk, spread allegations of voter fraud during the campaign and voting period, election officials and monitoring agencies reported no significant issues with the 2024 election. This is similar to 2020, when Trump’s own administration repeatedly advised him that there was no evidence of widespread fraud, and subsequent legal challenges alleging voter fraud were rejected by courts.

The 2024 election saw a continuation of the same narratives about election fraud that emerged in 2020, despite a lack of evidence. The 2024 fraud claims have likewise been debunked as misinformation, with no credible evidence of widespread fraud found. Examples of false voter fraud claims include social media posts showing an alleged bus convoy transporting voters from New York to Pennsylvania to vote for Kamala Harris, when in reality the footage depicted volunteers participating in a Democratic canvassing event. Another post falsely suggested that over 14,000 power outages in Northampton County, Pennsylvania, had affected voting. Elon Musk also posted, and then deleted, a claim that Google was manipulating search results in favour of Harris. Google explained that the search results for ‘where to vote for Harris’ showed polling locations because Harris is a county name in Texas, not because of bias, and the company adjusted its algorithm to prevent similar issues.

The FBI also warned about fabricated videos and statements, purporting to come from the agency, that warned of supposed voter fraud and threats to polling places. These fake messages are believed to be part of a Russian disinformation operation. On election day, hoax bomb threats, allegedly originating from Russian email domains, were directed at polling locations in at least five states, namely Georgia, Michigan, Arizona, Wisconsin, and Pennsylvania. The FBI described their effect as ‘minor’, and US officials stated that there was no evidence of malicious activity impacting the security or integrity of election infrastructure.

Conclusion: Disinformation trends in the 2024 US presidential election

All disinformation campaigns aim to erode trust and spread discord and division. In the case of the 2024 US presidential election, the campaigns may also have sought to encourage violent protests after the election. Disinformation campaigns often draw upon historical events, grievances, and prejudices to create a sense of familiarity, for example Trump’s claims of the previous election having been ‘stolen’. Other interference attempts included hoax bomb threats at polling places across multiple states, resulting in the temporary closure of at least two polling locations and possibly dissuading some voters from casting their ballot. Disinformation uses the same techniques seen time and again in typical fraud, relying on specific narratives to create a sense of urgency or crisis and to encourage people to act without critically evaluating the information they have been fed. By promoting these divisive narratives, the campaigns seek to influence voter opinions and sway the election outcome, or to cause unrest following the results that could further influence political decision-making.

To learn more about how the Silobreaker Intelligence Platform can help your organisation see the bigger picture, effectively plan, and address strategic business risks, including geopolitical risks, request a demo today.