The inner workings of the Russian nation-state “Doppelganger” influence campaign were exposed Wednesday when the U.S. Department of Justice (DOJ) published an affidavit detailing internal documents, web domains and online accounts used in the campaign.
The DOJ is in the process of seizing 32 internet domains used by three Russian government-sponsored organizations — Social Design Agency (SDA), Structura National Technology and ANO Dialog — to spread disinformation in support of Russian interests, which included efforts to influence U.S. voters ahead of the 2024 presidential election.
“The influence operation landscape has changed significantly since previous elections. New technologies and techniques have been employed by adversarial actors, we’ve seen the rise of disinformation-as-a-service and financially motivated actors, and we’re beginning to see the use of generative AI technologies, though their employment has so far been limited,” Lisa Kaplan, CEO of online risk mitigation technology company Alethea, told SC Media.
How Doppelganger group used cybersquatting, social media in propaganda campaigns
Doppelganger has been active since at least 2022 and is tied to multiple individuals known to be working under the direction of the Russian Presidential Administration of Vladimir Putin, including First Deputy Chief of Staff of the Presidential Executive Office Sergei Vladilenovich Kiriyenko.
Kiriyenko and other individuals listed in the affidavit were previously sanctioned pursuant to executive orders declaring a national emergency regarding the conflict between Russia and Ukraine, making their use of U.S.-based domains a violation of the International Emergency Economic Powers Act (IEEPA), DOJ authorities said.
Additionally, several domains were seized on the grounds of trademark infringement, as they hosted websites designed to impersonate legitimate news sites such as The Washington Post and Fox News.
Doppelganger used cybersquatted domains such as washingtonpost[.]pm and fox-news[.]top to host web pages nearly identical in appearance to the real publications, but containing articles designed to sway the reader’s sentiment toward positions favorable to Russian interests. For example, some articles portrayed the United States’ support of Ukraine in a negative light, while others sought to arouse negative feelings toward particular U.S. political candidates or parties.
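Defenders can surface lookalike domains of this kind by comparing newly observed hostnames against the names of known news brands. The short Python sketch below (the brand list, threshold and function names are illustrative assumptions, not drawn from the affidavit) flags domains whose leading label closely resembles a protected brand even when the top-level domain differs:

```python
# Illustrative lookalike-domain check: the brand list and threshold are
# assumptions for this sketch, not values from the DOJ affidavit.
from difflib import SequenceMatcher

KNOWN_BRANDS = ["washingtonpost.com", "foxnews.com"]

def similarity(a: str, b: str) -> float:
    """Rough string similarity in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

def looks_spoofed(domain: str, threshold: float = 0.75) -> str | None:
    """Return the brand a domain appears to imitate, if any."""
    label = domain.split(".")[0]  # compare leading labels, ignoring TLD swaps
    for brand in KNOWN_BRANDS:
        if domain != brand and similarity(label, brand.split(".")[0]) >= threshold:
            return brand
    return None

for d in ["washingtonpost.pm", "fox-news.top", "example.com"]:
    print(d, "->", looks_spoofed(d) or "no match")
```

Simple edit-distance screening like this catches TLD swaps and hyphenation tricks such as fox-news[.]top, though production systems typically add homoglyph normalization and registration-date signals.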
Links to these articles were spread through thousands of comments on social media sites, posted by accounts with fake identities that hid their Russian origins. Internal documents circulated by members of the Doppelganger group revealed at least three distinct campaigns. For example, one document outlined the creation of a “meme factory” with the goal of posting roughly 200 memes about the Russia-Ukraine conflict per month.
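Amplification at that scale leaves a measurable footprint: many ostensibly unrelated accounts repeatedly pushing links to the same obscure domains. A minimal sketch of that signal, using entirely made-up account names and URLs:

```python
# Sketch of surfacing coordinated amplification; all data below is invented
# for illustration and does not come from the affidavit.
from collections import defaultdict
from urllib.parse import urlparse

comments = [  # (account, link) pairs observed in platform telemetry
    ("user_81234", "https://washingtonpost.pm/ukraine-aid-failing"),
    ("patriot_jane", "https://washingtonpost.pm/ukraine-aid-failing"),
    ("freedom_now_1", "https://fox-news.top/candidate-scandal"),
    ("user_81234", "https://fox-news.top/candidate-scandal"),
]

accounts_by_domain = defaultdict(set)
for account, link in comments:
    accounts_by_domain[urlparse(link).netloc].add(account)

# Many distinct accounts sharing links to one obscure domain is a classic
# signal of an inauthentic amplification network.
for domain, accounts in accounts_by_domain.items():
    print(f"{domain}: {len(accounts)} distinct accounts")
```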
Doppelganger also monitored and targeted online influencers and worked with internet personalities to spread content supporting the campaign’s agenda. Further details of this social media influencer campaign were revealed in an indictment also published Wednesday, which was filed against Kostiantyn Kalashnikov and Elena Afanasyeva, both employees of the Russia-controlled RT media outlet (formerly known as Russia Today).
The indictment alleges that the defendants spent nearly $10 million to create and distribute propaganda content through the social media channels of a Tennessee-based content creation company, garnering millions of views. While not named or charged in the indictment, the U.S. media company that published the content has been identified by reporters as Tenet Media.
“The indictment and sanctioning of those involved in the Tenet Media operation and its links to state media outlet Russia Today increases education and awareness of the public, as well as among the media and influencers who may be unwittingly approached by state adversaries,” said Kaplan. “The better the problem is understood, the better equipped democracies are at inoculating their citizens to the potential ill effects of malign foreign influence.”
Will generative AI worsen election disinformation?
Doppelganger was previously revealed to have used OpenAI’s ChatGPT earlier this year to generate anti-Ukraine and anti-U.S. comments on the social media sites X and 9GAG, although many of these comments were quickly called out by other users as coming from “Russian bots,” OpenAI noted in a May 2024 report.
The DOJ’s affidavit noted that Doppelganger also used generative AI to create content for social media ads targeting U.S. politicians and identified five OpenAI accounts used to generate and edit articles and comments.
While the role of AI in the Doppelganger campaign was relatively small, it marks a continuing evolution in influence campaigns between past elections and the 2024 U.S. election season, Sean Guillory, lead scientist for Booz Allen Hamilton’s Cognitive Domain/Dimension Products and Capabilities, told SC Media. AI-enhanced versions of “Russian troll farms” could potentially serve propaganda to a wider audience at lower effort and cost.
“In the run-up to the 2016 election, troll farms were able to reach 140 million Americans a month. The adoption of generative AI and large language models has the potential to see this accelerate far beyond 2016. Now, LLMs have the potential to significantly increase the ‘bang for the buck’ in disinformation campaigns,” said Guillory.
Guillory pointed to GPTZero, an AI-powered tool that can help detect content generated by ChatGPT, as one example of the technology that can be utilized in the battle against disinformation this election season.
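GPTZero’s internal methods are proprietary, but detectors in this class commonly lean on statistical predictability: text a language model finds unusually easy to predict (low perplexity) is more likely to be machine-generated. A rough sketch of that signal using the open GPT-2 model (the threshold is an illustrative assumption, and this is not GPTZero’s actual implementation):

```python
# Perplexity-based screening sketch; illustrates the general signal only,
# not GPTZero's proprietary pipeline.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Average per-token perplexity under GPT-2; lower means more predictable."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(enc.input_ids, labels=enc.input_ids)
    return torch.exp(out.loss).item()

THRESHOLD = 40.0  # illustrative cutoff; real detectors calibrate on labeled corpora
comment = "Continued aid to Ukraine only prolongs the conflict and harms Americans."
ppl = perplexity(comment)
print(f"perplexity={ppl:.1f} ->",
      "possibly machine-generated" if ppl < THRESHOLD else "likely human-written")
```

In practice, detectors combine several signals rather than perplexity alone, which produces many false positives on short social media comments.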
“Another effort to combat disinformation is the DISARM Foundation, an organization established to build a common framework similar to the MITRE ATT&CK framework for cybersecurity. The DISARM Framework is an attempt to use an understanding of adversarial tactics, techniques, and procedures for crafting and executing disinformation campaigns to find ways to detect and mitigate or disrupt them,” Guillory said.
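In ATT&CK fashion, DISARM decomposes influence operations into discrete, named techniques that observed incidents can be mapped against. A hypothetical sketch of such a mapping (the technique IDs and names below are placeholders, not real DISARM entries):

```python
# DISARM-style incident tagging; technique IDs and names are hypothetical
# placeholders, not actual DISARM Framework identifiers.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Technique:
    technique_id: str
    name: str

@dataclass
class Incident:
    description: str
    techniques: list[Technique] = field(default_factory=list)

# Placeholder taxonomy entries for illustration.
SPOOF_OUTLET = Technique("TXX01", "Impersonate legitimate news outlets")
SOCKPUPPET_AMP = Technique("TXX02", "Amplify content via inauthentic accounts")

incident = Incident(
    "Spoofed washingtonpost[.]pm articles pushed through fake-persona comments",
    [SPOOF_OUTLET, SOCKPUPPET_AMP],
)

# Mapping incidents to techniques lets defenders attach detections and
# mitigations to each step of a campaign, as ATT&CK does for intrusions.
for t in incident.techniques:
    print(f"{t.technique_id}: {t.name}")
```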