Yesterday, I joined a Fionn session with Dave Snowden — one of the side discussions at the Dublin Knowledge Summit at Trinity College.
A Fionn (Irish for “wisdom”) is a small, informal group session — focused on live discussion and exchange of ideas.
Many of you will know Snowden as the creator of the Cynefin Framework — first developed in the early 2000s while working with the UK Ministry of Defence. The goal: to help military commanders and staff move beyond linear, reductionist thinking when operating in high-uncertainty, rapidly changing situations — where traditional planning and doctrine often fail.
Cynefin offered language and structure to distinguish between different types of situations — simple, complicated, complex, and chaotic — and to adjust leadership and decision-making accordingly.
It resonated widely across militaries because it addressed a real operational problem: why structured plans break down in the field, and why leaders need to shift from relying on procedures to adapting in real time.
Here’s what stood out from yesterday’s session — and why I think it matters right now:
1️⃣ Humans Don’t Think in Mental Models
“Forget about mental models — computers have models. Humans don’t.”
We don’t “think through models.” We improvise. We hallucinate. We make sense through imperfect memory, emotion, multisensory cues, gut instinct, and embodied experience.
Trying to map human decision-making as neat “models” leads to brittle plans that fail under pressure.
→ For commanders and strategists:
Stop believing better doctrine or AI models will ‘fix’ decision-making.
Invest in preparing for messiness, anomalies, friction, and surprise.
2️⃣ Conflict is Critical
“If you don’t deliberately create the right kind of friction, you risk creating the wrong kind of conflict — or teams will simply go with the flow and stop thinking.”
Conflict (in the right form) is necessary to trigger conscious cognition.
Safety cultures that aim to eliminate all conflict end up dulling awareness and reducing adaptive capacity.
True psychological safety should enable the right kind of friction — differences that challenge thinking without personal threat. That friction is essential for anomaly detection and critical decision-making.
3️⃣ Simulation is Superior
“Worst practice beats best practice.”
“Winning makes you scan less. Losing makes you scan more.”
Snowden’s war games (Anthro-Simulation) force teams to experience failure and surprise, and to adapt — building deep learning that no static Lessons Learned process or case library can match.
After failure, teams scan 20x more data before making decisions — sharpening their ability to sense anomalies and adapt in the moment.
It’s not about capturing lessons — it’s about training human adaptability through exposure to surprise and failure.
4️⃣ Why Physical Interaction is Important
Human trust and sense-making are built on physical cues: tone, micro-expressions, body language, resonance.
Virtual environments strip away these layers — and with them, true situational awareness.
→ Implication:
Digital decision frameworks must account for what is lost in remote, screen-based ops.
Some knowledge — and trust — will never fully digitize.
5️⃣ How We Actually Decide
“The brain engages only when it detects an anomaly.”
“Perfect memory does not create new knowledge.”
We operate through:
• Micro-hallucinations — imperfect, reconstructed memory
• Pattern recognition — multisensory, embodied
• Anomaly detection — what jolts the brain into conscious processing
AI systems — built on perfect memory and model logic — do not mirror this human way of deciding.
They can assist — but not replace — the inherently embodied, improvisational, and friction-driven way leaders operate in complex environments.
→ For foresight practitioners:
When working on how human + AI decision cycles interact, create conditions for anomaly sensing, narrative layers, friction loops, trust-building, and learning through loss.
These are the real ingredients of resilience and adaptive capacity.
6️⃣ Stop Building from Case Studies (lessons learned)
“Cases are context-specific and historical — using them as recipes is a disaster.”
The military habit of building learning from case studies, lessons learned, and after-action reports assumes the future will look like the past.
In reality, future conflict spaces will be riddled with ambiguity, surprise, and unrecognizable patterns.
Static case libraries degrade readiness.
→ Instead:
• Design simulation-based learning
• Train people to recognize patterns and adapt across contexts — not copy past cases
• Build crews with strong role-based expertise — not ad hoc teams
Closing Thought:
Dave Snowden’s message is about how humans think, adapt, and make decisions in complex, uncertain environments.
If we want to be future-ready — human, machine, and hybrid — we need to stop building brittle models and start designing for:
• Messiness
• Friction
• Embodied interaction
• Anomaly detection
• Role-based decision-making
• Learning through failure
That is the deeper shift required.
Howth, Ireland — from my hike today. No good conference photos… just datasets.