NATO’s Strategic Communications Centre of Excellence has published one of the most significant intelligence briefings of 2025. The Virtual Manipulation Brief doesn’t predict what might happen; it documents what is already underway.
Drawing on over 11 million posts from ten major platforms, the report outlines how pro-Russian and pro-Chinese actors are running AI-enhanced, real-time influence campaigns. These are not scattered efforts or bot-farm anomalies. They are synchronized systems optimized for speed, reach, and emotional impact.
Strategic Targets, Timed with Precision
The primary targets of these influence campaigns are:
NATO
Ukraine
The European Union
The United States
Content often appears within 15 minutes of breaking news, indicating pre-coordinated narratives rather than spontaneous response. These bursts are strategically deployed to shape early perception and dominate emerging discourse.
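The report does not disclose how such bursts are detected, but the timing signal itself is simple to reason about. As a minimal sketch (the 15-minute window matches the report; the post-count threshold is an illustrative assumption), a narrative can be flagged as a likely pre-coordinated burst when an unusually large number of posts lands within minutes of the triggering event:

```python
from datetime import datetime, timedelta

def flag_rapid_burst(event_time: datetime, post_times: list[datetime],
                     window_minutes: int = 15, threshold: int = 50) -> bool:
    """Flag a narrative as a likely pre-coordinated burst when at least
    `threshold` posts appear within `window_minutes` of the triggering
    event. Both defaults are illustrative, not from the report."""
    window_end = event_time + timedelta(minutes=window_minutes)
    in_window = [t for t in post_times if event_time <= t <= window_end]
    return len(in_window) >= threshold
```

A real pipeline would compare the in-window volume against a per-narrative baseline rather than a fixed count, but the core signal is the same: volume arriving faster than organic reaction plausibly allows.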
The Platform Relay: Coordinated Across Channels
The campaigns operate like relay races across platforms. The report maps a typical progression:
X (formerly Twitter) → Rapid seeding and amplification via verified accounts and botnets
Telegram → Emotional repackaging, voice/video clips, community clustering
YouTube, VK, Instagram → AI-generated video, memes, and narrative framing
Facebook → Familiar formats like quote cards and protest imagery reintroduced for mainstream diffusion
Each platform plays a specific role in this influence architecture—seeding, amplifying, reframing, or normalizing the message.
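The relay structure above can be reconstructed from observation alone: if the same narrative fingerprint (a quote, slogan, or image hash) is timestamped on each platform where it appears, ordering platforms by first sighting recovers the seeding-to-normalization sequence. A toy sketch, assuming such first-seen timestamps are available:

```python
from datetime import datetime

def relay_sequence(sightings: dict[str, list[datetime]]) -> list[str]:
    """Given timestamps of sightings of one narrative fingerprint per
    platform, return platforms ordered by earliest appearance, a crude
    reconstruction of the seeding -> amplifying -> normalizing relay."""
    first_seen = {p: min(ts) for p, ts in sightings.items() if ts}
    return sorted(first_seen, key=first_seen.get)
```

This is deliberately simplistic: it assumes the fingerprint survives the emotional repackaging step, which in practice requires fuzzy matching rather than exact lookup.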
AI Is No Longer Optional—It’s Central
Generative AI is embedded in nearly every phase of these campaigns. The report identifies:
AI-generated protest footage, blending real and synthetic video
Billboards fabricated using AI, featuring Western leaders and manipulated slogans
Quote cards falsely attributed to high-profile figures, used to trigger anti-surveillance or anti-NATO sentiment
These assets are designed for plausibility and shareability, bypassing conventional disinformation detection tools.
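One tell that does survive individual-asset plausibility is repetition: the same synthetic image posted verbatim across many accounts. As an illustration (not a method from the report), grouping assets by content hash surfaces such clusters; real pipelines use perceptual hashing to catch near-duplicates, but an exact hash shows the idea:

```python
import hashlib

def cluster_identical_assets(assets: dict[str, bytes]) -> dict[str, list[str]]:
    """Group asset IDs by byte-level SHA-256 digest. A large cluster of
    one image spread across many accounts hints at coordinated seeding.
    Perceptual hashes would also catch re-encoded near-duplicates."""
    clusters: dict[str, list[str]] = {}
    for asset_id, blob in assets.items():
        digest = hashlib.sha256(blob).hexdigest()
        clusters.setdefault(digest, []).append(asset_id)
    return clusters
```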
Emotional Engineering at Operational Scale
The emotional tone of these campaigns is deliberately tailored by region:
In Europe: Themes of fear, decay, and collapse
In North America: Narratives of betrayal, corruption, and elite conspiracy
Globally: A blend of despair, decline, and loss of sovereignty
Emotion, not information, is the primary vehicle—and AI is how it scales.
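The regional tailoring above is also measurable. A hedged sketch, using toy keyword lexicons invented here for illustration (the report does not publish its classification scheme), shows how content can be tagged against the regional theme clusters:

```python
# Toy lexicons mapping the report's regional theme clusters to trigger
# words. Illustrative only; a real system would use trained classifiers.
THEMES = {
    "fear_decay_collapse": {"collapse", "decay", "crisis", "fear"},
    "betrayal_corruption_elite": {"betrayal", "corrupt", "elite", "conspiracy"},
    "despair_decline_sovereignty": {"despair", "decline", "sovereignty"},
}

def tag_themes(text: str) -> list[str]:
    """Return every regional theme cluster whose lexicon overlaps the text."""
    words = set(text.lower().split())
    return [theme for theme, lexicon in THEMES.items() if words & lexicon]
```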
What This Means Strategically
This isn’t just about misinformation—it’s about the rise of operationalized, AI-coordinated influence infrastructure.
It’s fast, modular, emotionally tuned, and designed to bypass traditional detection. NATO’s report makes it clear: we are no longer dealing with isolated propaganda. We are facing system-level manipulation campaigns, designed to shape perception before institutions can respond.
The Next Layer of Resilience
So what now?
The challenge isn’t faster fact-checking after the fact. It’s building anticipatory narrative infrastructure: systems that can detect and redirect hostile coordination before it influences public perception.
This requires moving beyond monitoring, toward semantic defence and strategic narrative agility.
The battlefield is semantic. The weapon is attention. And the frontline may already be in your feed.