'Operation Undercut' Adds to Russia Malign Influence Campaigns
Summary:
The Social Design Agency (SDA), recently accused by the US government of operating the "Doppelgänger" malign influence campaign, is now running a parallel effort called "Operation Undercut." This new campaign targets audiences in the United States, Ukraine, and Europe, with the primary goal of eroding support for Ukraine in its war against Russia. Like Doppelgänger, Operation Undercut extends its interference to broader areas, including the Middle East conflict, internal EU political dynamics, and the 2024 US presidential election. Researchers at Recorded Future’s Insikt Group describe the campaign as part of Russia’s broader strategy to destabilize Western alliances, cast Ukraine’s leadership as corrupt and ineffective, and diminish the flow of Western military aid to Ukraine. Additionally, the campaign seeks to portray US and EU involvement in Ukraine as misguided and largely ineffective.
Operation Undercut employs advanced tactics, including AI-enhanced videos distributed across social media. These videos, produced in multiple languages such as English, Russian, and German, are designed to appear authentic and often reference recent events or depict political figures in fabricated scenarios. Examples include a deepfake of Ukrainian President Zelensky suggesting NATO would supply Ukraine with Israel’s annual arms stockpile, a video portraying US President Biden as escalating the Ukraine war before Trump’s potential return to office, and another showing Trump forcing Zelensky to surrender. The campaign also spreads misinformation that mimics reputable media outlets, suggesting rising global support for Russia and doubts about Ukraine’s war strategies.
Analyst Comments:
Researchers also uncovered links between Operation Undercut and Storm-1516, a Russian influence network known for disinformation campaigns, including false narratives about the 2024 US elections. Notable examples of Storm-1516's contributions include a fake story about Zelensky buying a villa in Italy during an international summit and a deepfake video of a Hamas leader issuing threats against the 2024 Olympics. Recorded Future attributed Operation Undercut to SDA after identifying shared elements, such as cartoons resembling those found in SDA-leaked documents and Doppelgänger-affiliated websites.
While Operation Undercut shares some similarities with Doppelgänger, it uses distinct tactics. Doppelgänger primarily relies on fake websites promoted by bots, whereas Undercut posts content directly to social media and amplifies it through localized hashtag spam aimed at specific audiences. Despite the sophistication of these efforts, engagement among the intended audiences has been limited. The US government has nonetheless taken proactive measures, including seizing 32 domains associated with Doppelgänger. Those domains, which were cycled through paid social media ads and influencers to promote fake content, illustrate the evolving nature of SDA’s influence operations.
Mitigation:
- News organizations should track content from known influence actors who are likely abusing their brand. Such abuse increases reputational risks and erodes consumer trust, potentially deterring advertisers and inflicting financial harm on the impersonated organizations.
- Political leaders and government officials from countries providing support to Ukraine should continue monitoring for content attempting to discredit them to identify and counter emerging narratives.
https://www.darkreading.com/cyberse...on-undercut-russia-malign-influence-campaigns
https://www.recordedfuture.com/rese...multifaceted-nature-sdas-influence-operations