Report | 27 February 2026

Towards a better understanding of human bias in nuclear decision-making and its interaction with emerging and disruptive technologies

Bias is a critical yet under-addressed factor in nuclear decision-making. Cognitive shortcuts become particularly problematic in crises, where uncertainty, incomplete information, and the gravity of potential consequences can distort judgment. Historical cases such as the Cuban Missile Crisis illustrate how misperception and rigid assumptions have brought nuclear-armed states close to catastrophe. Today, the challenge is compounded by the growing role of AI and automated decision-support systems, which promise speed and precision but also introduce new forms of bias that can influence human reasoning and strategic choices.

This report by Behavioural Data Scientist Ganna Pogrebna and ELN Senior Policy Fellow Rishi Paul presents findings from an ELN workshop that examined the ‘human’ and ‘machine’ components of bias and their points of interaction. Participants drawn from the ELN Young Generation Leadership Network (YGLN), including military advisors, behavioural scientists, and technology developers, took part in structured discussions and a prototype digital twin simulation designed to replicate key features of nuclear crisis decision-making, such as uncertain information flows and AI-assisted inputs. Through these exercises, they reflected on how group dynamics, ambiguity, and machine-generated outputs shape judgment under pressure.

The workshop did not present AI or automated decision-support systems as passive or neutral inputs to crisis decision-making. Instead, these tools were deliberately embedded in the simulation environment as active elements designed to provoke reflection on how machine-generated outputs might shape human judgment under stress. The goal was not to evaluate the technical accuracy of a particular model, but to explore how the presence of an ostensibly intelligent system could influence group dynamics, the perceived credibility of information, and the confidence underpinning high-stakes decisions.


While not prescriptive, the report highlights how human judgment and AI systems can interact in ways that reinforce, rather than reduce, risk. It is intended to inform policymakers and stakeholders by surfacing these dynamics at the intersection of nuclear weapons and emerging disruptive technologies, with the goal of deepening understanding and supporting more informed debate.

Key workshop insights included:

Human biases: Seven recurrent biases were identified as particularly relevant to nuclear crises: illusion of control, inherent bad faith, peaceful defensive images, perceptual bias, interpersonal bias, overconfidence, and worst-case thinking. Each was shown to distort strategic reasoning and escalation choices in distinctive ways.

Technology as a bias modulator: AI was neither purely corrective nor purely distorting. Instead, it functioned as a ‘bias modulator’, sometimes reinforcing overconfidence and premature certainty, while at other times exposing hidden assumptions or broadening consideration of alternatives. Trust in AI proved highly selective: participants embraced it when it reinforced their existing views and dismissed it when it contradicted them.

Group dynamics: Authority bias, groupthink, and sycophancy were recognised as pervasive, with participants noting how hierarchical cultures can silence dissent. Decision outcomes were often shaped more by those who spoke with confidence than by data.

Design principles for bias mitigation: The workshop highlighted that effective nuclear-related decision environments must be designed to expose bias rather than conceal it.


The European Leadership Network itself, as an institution, holds no formal policy positions. The opinions articulated in this report represent the views of the authors rather than those of the European Leadership Network or its members. The ELN aims to encourage debates that will help develop Europe’s capacity to address the pressing foreign, defence, and security policy challenges of our time, in furtherance of its charitable purposes.

Image: Alamy, Kiyoshi Takahase Segundo