

Russia’s 2024 Election Influence Campaign Has Started, Microsoft Analysis Finds

Topline

The latest Russian-backed election influence campaign—aided by some new AI-driven tactics—is underway ahead of the 2024 presidential election, though it appears to have gotten off to a slower start than in past election cycles, according to a new report from Microsoft’s Threat Analysis Center.

Key Facts

Russian election influence actors—the same ones that have been identified in past campaigns—have shown new signs of activity over the last 45 days, though their activity appears to be occurring at “a slower tempo” than observed ahead of the 2016 or 2020 presidential elections, which was likely due to a “less contested” primary season this year, Microsoft said.

Russian influence efforts in this election cycle have focused heavily on turning U.S. opinion against Ukraine and NATO, with Microsoft’s analysis finding at least 70 Russian actors using both traditional media and social media to spread Ukraine-related disinformation over the last two months.

Microsoft has identified several Russia-affiliated actors behind the influence operations—including the group Microsoft refers to as Storm-1099, which was responsible for the wide-reaching “Doppelganger” misinformation campaign in 2022.

The “most prolific” of these actors, however, are affiliated with the Russian Presidential Administration, which Microsoft says highlights “the increasingly centralized nature” of these influence campaigns—a shift away from the 2016 and 2020 campaigns that were more closely associated with Russia’s so-called Internet Research Agency and intelligence services.

How Is Artificial Intelligence Affecting These Campaigns?

Artificial intelligence is shaping how these influence actors operate—though not as significantly as experts have long feared, and not in the ways officials expected. While the emergence of AI raised fears that so-called deepfake videos could be used to deceive and manipulate the public, Microsoft says such efforts have been largely unsuccessful, generally failing to fool audiences or gain traction. Instead, audiences have been more likely to fall for “simple digital forgeries”—fake news stories with fake logos, for instance—which influence actors have used for years. Microsoft says AI was more convincing when used to alter or enhance existing content than when used to generate content from scratch, and even then, AI-generated audio is generally more convincing than video. The report also found that AI-generated content about lesser-known figures was more likely to fool audiences than content about well-known figures.

What To Watch For

These groups often follow similar tactics for circulating misinformation. One actor Microsoft refers to as Storm-1516, for instance, typically introduces misinformation on video channels, presenting the source as a whistleblower or independent journalist. The group then uses a network of websites it covertly operates to amplify the content, prompting it to be picked up more widely and, ultimately, deceive audiences.

Key Background

The U.S. has long accused Russia of using social media and other online tools to try to influence U.S. politics and stoke division, particularly near election time. Those efforts first came to light in 2016, when Russian-sponsored entities were identified trying to sow discord and back then-candidate Donald Trump.

Contra

Russia has long denied that it attempted to meddle in any U.S. election, and last month vowed it would not meddle in the 2024 election—despite mounting evidence from U.S. cybersecurity experts and intelligence agencies. Forbes contacted the Russian embassy in Washington for comment on Microsoft’s report but did not immediately hear back.

Further Reading

Forbes: China Eying Election Disruption Campaigns—Including With AI, Microsoft Says