
Nuclear and New Technologies
In 2020, the European Leadership Network (ELN) in cooperation with partners set out on a journey to unpack technological complexity as it impacts nuclear decision-making and propose practical policy approaches to deal with related risks.
The challenge we want to address
Nuclear decision-making is complex. Disruptive technologies present both risks and opportunities for nuclear decision-making, which need to be better explained, understood, gamed, and mitigated. The project focuses on the so far under-examined implications of the technological complexity that emerges when nuclear decision-making is affected by a plethora of new technologies, all evolving rapidly and simultaneously. Building on existing work that looks at the impact of individual technologies on nuclear policy, this project assesses the impact of these technologies in the aggregate, seeks to overcome the risks they create, and explores the opportunities technologies offer to mitigate those risks.
Leveraging the ELN’s deep expertise, convening power, and network of seasoned, high-level practitioners from multiple countries, and drawing on the strengths of the ELN’s partner organisations, we have embarked on a path to study, analyse, describe, train, and advise decision-makers on the nuclear policy challenges of technological complexity.
The project will develop, test-drive, propose and promote practical policy approaches that governments might pursue to begin to responsibly regulate and steer the weaponisation of potentially disruptive technologies and their use in nuclear decision-making.
The objectives of this multi-year project are to reduce risk in nuclear decision-making, identify mitigation strategies and de-escalation solutions, and manage potential unintended escalation. We also strive to engage younger-generation experts and amplify their voice in the discussion.
To commence work, the ELN, in partnership with the German Federal Foreign Office, organised and hosted a “Rethinking Arms Control” workshop in March 2021. This closed-door meeting brought together a diverse group of scholars, practitioners, former nuclear weapons decision-makers, and emerging leaders to ideate and analyse the challenges, opportunities, and pitfalls of technological complexity. The summary of the proceedings and major takeaways from the workshop are highlighted in the following report: New Technologies, Complexity, Nuclear Decision Making and Arms Control: Workshop Report, June 2021
How we want to achieve the goal
The project is built upon four strands which – like four legs of a stool – support the main goal. These are:
- Baselining Exercise
- Big Data Analysis of Emerging and Disruptive Technologies
- Methodologies to Deal with Multi-tech Complexities
- Mitigation Strategies & Arms Control
We begin by asking what the science (strand 1), practitioners (strand 2), and current policies and tools (strand 3) tell us about the impact of, and ways of dealing with, technological complexity in nuclear decision-making. We then craft policy approaches that governments might pursue (strands 3 and 4).
This comprehensive approach allows us to unpack technological complexity by harnessing the brightest minds around the world, test policy approaches with people who “have been there and done it” and use our networks to develop and promote solutions with current decision-makers.
Funding from the German Federal Foreign Office, the MacArthur Foundation, the Carnegie Corporation of New York, the Nuclear Threat Initiative, the Heinrich Böll Foundation and in-kind contributions from project partners make this work possible.
Nuclear and New Technology Publications

From nuclear stability to AI safety: Why nuclear policy experts must help shape AI’s future
Artificial intelligence, much like nuclear technologies, has the capacity to transform our world for the better, offering breakthroughs in several fields whilst simultaneously posing catastrophic risks. Nuclear policy experts, skilled in managing existential threats, are well-suited to guide AI governance. ELN Network and Communications Manager Andrew Jones argues that urgent, coordinated international action and further collaboration between experts in the nuclear and AI fields is needed before AI outpaces our ability to control it.

Technological complexity and risk reduction: Using digital twins to navigate uncertainty in nuclear weapons decision-making and EDT landscapes
This policy brief explores the integration of digital twin technologies into nuclear decision-making processes, assessing their potential to reduce risks stemming from emerging disruptive technologies (EDTs). It argues for international dialogue, transparency, and responsible innovation to prevent misuse, enhance NC3 resilience, and strengthen strategic stability through informed, scenario-based crisis simulations.

From crisis to strategy: The OSCE and arms control in a divided Europe
Since Russia’s full-scale invasion of Ukraine in February 2022, the OSCE has faced a deep crisis. Russia and Belarus have violated key norms of the 1975 Helsinki Final Act, undermining the OSCE’s role in crisis management. Alexander Graef argues that breaking the impasse requires decisive political leadership and multi-level diplomacy. He also argues that growing military activities in Europe highlight the need for military-to-military contacts for managing escalation risks, in which the OSCE can facilitate necessary dialogues and support future monitoring activities as it has in the past.

How to save the world: Influencing policy on the biggest risks to humanity
A new report from the European Leadership Network’s New European Voices on Existential Risk (NEVER) project calls for a systemic international approach to addressing man-made existential risk. The risks from nuclear weapons, climate change, biological threats, and AI are interconnected, and cross-cutting lessons should be drawn.

Impact case study: NEVER mentoring
In 2024, the European Leadership Network (ELN) delivered an intergenerational networking programme designed to support emerging leaders in the fields of security, foreign policy, and existential risk. The programme brought together 25 participants from across the New European Voices on Existential Risks (NEVER) Network and the Younger Generation Leadership Network (YGLN).

Assessing the implications of integrating AI in nuclear decision-making systems
This policy brief analyses the integration of AI into nuclear command, control and communications (NC3) systems, exploring potential benefits and significant risks. Former ELN policy fellow and Non-Resident Expert on AI at CNS Alice Saltini highlights the need for a better assessment of risks and the establishment of thresholds for integration to prevent miscalculations and nuclear escalation. It proposes that the EU lead international dialogue on AI risks in the nuclear domain in relevant international forums.
Project partners
- European Leadership Network (ELN)
- Federal Foreign Office
- The Arms Control Association (ACA)
- The Council on Strategic Risks (CSR)
- The Center for Global Security Research (CGSR) at the Lawrence Livermore National Laboratory (LLNL)
- The Oracle Partnership
- Professor Andrew Futter, University of Leicester
- The British American Security Information Council (BASIC)
- The Heinrich-Böll-Stiftung (HBS)
- The Younger Generation Leaders Network on Euro-Atlantic Security (YGLN)
- Dr Vladimir Kozin (Analytical Agency “Strategic Stability”)