STRATEGIC ASSESSMENT. Foreign Information Manipulation and Interference (FIMI) in the context of elections has received increased attention in Western democracies since post-mortem analysis of the 2016 U.S. presidential election revealed extensive Russian meddling in the information environment. Since then, election interference by adversaries has proliferated. Five days out from the next U.S. presidential election, all indicators point to foreign adversaries, including China, Russia, and Iran, meddling in the information environment to propagate narratives in their strategic interest. These actors have grown significantly more sophisticated in their tactics, techniques, and procedures (TTPs), allowing them to go undetected by a sizeable number of American social media users and newsreaders.
Analysis of the content that these three countries have covertly pushed to U.S. social media users, as well as scrutiny of the tactics they employ, indicates differing strategic objectives but shows that, overall, the ‘Axis of Upheaval’ is aligned in its aim of sowing division and distrust among the American public. Russia, for example, has sought to propagate and amplify content against the Democratic Party and the Harris-Walz ticket, pushing extensive disinformation campaigns including far-fetched narratives such as Kamala Harris having perpetrated a hit-and-run in 2011 or illegally poached in Zambia. Meanwhile, Kremlin-backed campaigns have boosted Republican candidate Donald Trump as well as congressional candidates who have shown themselves skeptical of military support to Ukraine and have championed an isolationist U.S. foreign policy. Russia’s overall aim is to steer the U.S. public toward a president and government that it believes will let it act with relative impunity in its region.
Iran, meanwhile, is interested in boosting the Harris-Walz ticket, wary of what another Trump presidency may mean for its increasingly weakened position in the Middle East as Israel continues to target the so-called Iranian-backed ‘Axis of Resistance’ in Lebanon, Gaza, and Yemen. In September, the U.S. Office of the Director of National Intelligence, the FBI, and the Cybersecurity and Infrastructure Security Agency announced that Iranian cyberhackers had offered stolen, non-public information on Donald Trump to the Biden campaign – another brazen instance of interference. Meanwhile, Microsoft’s Threat Analysis Center revealed that hackers connected to Iran’s government have probed U.S. election websites in swing states for vulnerabilities.
Capitalizing on outrage over the humanitarian catastrophe in Gaza, Iran has also been involved in inciting protests within the United States in the lead-up to the election, posing as activist groups online and funneling aid to some protest groups within the U.S., according to U.S. Director of National Intelligence Avril Haines. Most recently, the Microsoft Threat Analysis Center reported that in October Iran deployed an online persona that presented as an American and called on the U.S. public to boycott the elections over both candidates’ support for Israel’s military operations in Gaza and beyond. Iran’s aim is clearly a diminished U.S. role in the Middle East, including a reduction in U.S. aid to Israel.
Chinese FIMI efforts show Beijing is primarily interested in covertly criticizing congressional candidates based on their stances on China, U.S.-China relations, and the status of Taiwan. Targeted officials include Representative Barry Moore, Senator Marco Rubio, Senator Marsha Blackburn, and Representative Michael McCaul. The network behind these operations, dubbed “Taizi Flood” by Microsoft’s Threat Analysis Center, showed in early October that China is moving beyond merely attacking critics when it began promoting Senator Blackburn’s opponent in the 2024 election, Representative Gloria Johnson.
None of this is new. What is particularly striking about the FIMI in the lead-up to the 2024 U.S. elections, however, is the growing sophistication and evolving TTPs used by actors such as Russia and China to interfere in the information environment. This reflects multiple causes: a quest for untraceability and sound covert operations, trial and error in testing which content and TTPs resonate with the U.S. public, and evolving moderation policies on social media platforms.
In September, the U.S. Department of Justice accused two Russians of having funneled $10 million to Tenet, a Tennessee-based media company secretly backing a group of conservative American commentators including Tim Pool and Dave Rubin. While these U.S. influencers say they did not know their content on Tenet was being covertly funded by the Kremlin, the scheme shows that Russia increasingly understands the importance of a ‘trusted and established messenger’ to propagate narratives in its interest – a significant shift from its heavy reliance on troll farms, whose content struggles to garner views and engagement.
Another instance highlighting the increasingly sophisticated TTPs employed in the FIMI targeting the 2024 U.S. elections is the use of cybersquatting domains: mimicking legitimate websites under a slightly different domain. The well-documented Russian campaign dubbed “doppelganger” has, for example, mimicked legitimate news outlets, such as “washingtonpost.pm,” to report on issues in Russia’s strategic interest, including criticism of Ukrainian officials and the Biden Administration’s southern border policy. This illustrates another key point of sophistication: knowing that the average U.S. citizen may not be as focused on Ukraine, the campaign’s operators focus instead on the immigration crisis at the southern border and attempt to link it to the current administration’s Ukraine policy.

Unsurprisingly, AI-generated and AI-enhanced content has become a common feature of FIMI in the 2024 U.S. elections. Russia, for example, was behind deceptive videos in which U.S. Vice President Kamala Harris appears to make irresponsible comments about the assassination attempts on former President Donald Trump. While AI-generated content has seen an uptick, Russian campaigns continue to rely on simple tactics as well, as seen with the staged video of a purported park ranger in Zambia who accused Vice President Harris of poaching.
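Cybersquatting lookalikes of the “doppelganger” type can, in principle, be flagged automatically by comparing a suspect domain’s registrable name against a watch list of legitimate outlets. The following is a minimal sketch of that idea; the domain list, the edit-distance threshold, and the crude label extraction are illustrative assumptions, not details from any actual detection system.

```python
# Hypothetical sketch: flagging "doppelganger"-style cybersquatting domains
# by comparing a suspect domain's leading label against known news outlets.
# The watch list and threshold below are illustrative assumptions.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Illustrative watch list of legitimate outlets (assumption).
LEGITIMATE = {"washingtonpost.com", "foxnews.com", "spiegel.de"}

def flag_lookalike(domain: str, max_dist: int = 1):
    """Return the legitimate domain a suspect most closely mimics, or None.

    A suspect is flagged when its leading label is within `max_dist`
    edits of a watched outlet's label while the full domain differs
    (e.g. same name, different top-level domain).
    """
    label = domain.split(".")[0]  # crude registrable-name extraction
    for legit in sorted(LEGITIMATE):
        if domain != legit and levenshtein(label, legit.split(".")[0]) <= max_dist:
            return legit
    return None

print(flag_lookalike("washingtonpost.pm"))  # same label, different TLD: flagged
print(flag_lookalike("example.org"))        # no watched outlet resembles it
```

A production system would need proper registrable-domain parsing (e.g. against the Public Suffix List) and homoglyph handling, since real campaigns also swap visually similar characters rather than only changing the TLD.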
Effective FIMI is resource intensive, especially as platforms make it harder to create bot accounts. To effectively interfere in elections in which roughly 168 million registered voters may cast ballots – as in the 2020 U.S. election cycle – the ‘Axis of Upheaval’ is expending financial resources to build out infrastructure to spread narratives and create compelling content across geographies and demographics in the United States. This is further made clear by the payment scheme Russia put in place to fund Tennessee-based Tenet to promote Russia-aligned narratives. For all the expenses incurred, measuring the impact of FIMI efforts remains incredibly challenging. That impact, however, does not require reaching many individuals: by targeting the U.S. public in swing states, only a moderate number of people need to be deceived and influenced to have a potentially outsized effect.