Nuclear and New Technologies

In 2020, the European Leadership Network (ELN) in cooperation with partners set out on a journey to unpack technological complexity as it impacts nuclear decision-making and propose practical policy approaches to deal with related risks.

Project Team

The challenge we want to address

Nuclear decision-making is complex. Disruptive technologies present both risks and opportunities for nuclear decision-making, which need to be better explained, understood, gamed, and mitigated. The project focuses on the – so far under-examined – implications of the technological complexity that emerges when nuclear decision-making is affected by a plethora of new technologies that are all evolving rapidly and simultaneously. Building on existing work that looks at the impact of individual technologies on nuclear policy, this project assesses the impact of these technologies in the aggregate, seeks to overcome related risks, and explores opportunities offered by technologies to mitigate these risks.

Leveraging the ELN’s deep expertise, convening power, and network of seasoned, high-level practitioners from multiple countries, and drawing on the strengths of the ELN’s partner organizations, we have embarked on a path to study, analyze, describe, train, and advise decision-makers on the nuclear policy challenges of technological complexity.

The project will develop, test-drive, propose, and promote practical policy approaches that governments might pursue to begin responsibly regulating and steering the weaponization of potentially disruptive technologies and their use in nuclear decision-making.

The objectives of this multi-year project are to reduce risk in nuclear decision-making, identify mitigation strategies and de-escalation solutions, and manage potential unintended escalation. We also strive to engage younger-generation experts and raise their voices in the discussion.

To commence work, the ELN, in partnership with the German Federal Foreign Office, organized and hosted a “Rethinking Arms Control” workshop in March 2021. This closed-door meeting brought together a diverse group of scholars, practitioners, former nuclear weapons decision-makers, and emerging leaders to ideate on and analyse the challenges, opportunities, and pitfalls of technological complexity. The summary of the proceedings and major takeaways from the workshop are highlighted in the following report: New Technologies, Complexity, Nuclear Decision Making and Arms Control: Workshop Report, June 2021

How we want to achieve the goal

The project is built upon four strands which – like four legs of a stool – support the main goal. These are:

  1. Baselining Exercise
  2. Big Data Analysis of Emerging and Disruptive Technologies
  3. Methodologies to Deal with Multi-tech Complexities
  4. Mitigation Strategies & Arms Control

We begin by asking what the science (strand 1), practitioners (strand 2), and current policies and tools (strand 3) tell us about the impact of, and ways of dealing with, technological complexity in nuclear decision-making. We then craft policy approaches that governments might pursue (strands 3 and 4).

This comprehensive approach allows us to unpack technological complexity by harnessing the brightest minds around the world, test policy approaches with people who “have been there and done it” and use our networks to develop and promote solutions with current decision-makers.

Funding from the German Federal Foreign Office, the MacArthur Foundation, the Carnegie Corporation of New York, the Nuclear Threat Initiative, the Heinrich Böll Foundation and in-kind contributions from project partners make this work possible.

The Four Strands

Strand 1: Baselining Exercise

What can expert literature tell us about the nexus between technological complexity and nuclear weapons decision-making in peacetime, crisis and war? How does expert thinking in the West, Russia and China differ?

Drawing on 75 open-source English-language literature sources available as of the end of 2020, the Center for Global Security Research at the Lawrence Livermore National Laboratory (LLNL) conducted a baselining study on emerging and disruptive technologies and the complexity challenge:

We are working on a similar study capturing Russian and Chinese language expert literature.

Strand 2: Big Data Analysis of Emerging and Disruptive Technologies

To facilitate a better understanding of technological complexity in nuclear weapons decision-making practice, the ELN, in cooperation with the Oracle Partnership – a world-leading strategic foresight firm – hosted a pilot workshop. It aimed to develop and initiate a process in which scenario design and big data interact with high-level practitioners to generate insight into the unprecedented complexity that emerging and disruptive technologies, operating in aggregate, increasingly present at the interface with nuclear decision-making.

Based upon a “worst case scenario” built upon a comprehensive technological trend analysis using state-of-the-art artificial intelligence (AI) tools, 20 former high-level, experienced nuclear practitioners (NATO SecGens, Joint Chiefs of Staff, SACEUR, MoD, MFA) explored the challenges that an emerging technological environment would pose to a nuclear decision-maker.

The process validated assumptions that technological complexity will create additional uncertainties, compress decision-making time, and generate unforeseen risks, but also offer new opportunities. This high-level group of experts underscored decision-makers’ lack of sufficient understanding of the technologies and their implications, wide concern about autonomous decision-making, a strong desire to keep a human in the loop, and a conservative approach to change within the decision-making process itself.

Believing that the nuclear decision-makers of tomorrow are with us today, the ELN partnered with the Oracle Partnership, the British American Security Information Council (BASIC), the Emerging Voices Network (EVN), the Younger Generation Leaders Network on Euro-Atlantic Security (YGLN) and the Heinrich-Böll-Stiftung to engage a younger generation of experts in a similar conversation about the impact, resulting risks, and opportunities of technological complexity for nuclear decision-making. The younger generation brought different perspectives on, and expectations of, emerging and disruptive technologies and, unburdened by deep involvement in nuclear decision-making, offered a unique perspective on the challenge we face.

Strand 3: Methodologies to Deal with Multi-Tech Complexities

In this strand we look at how technological complexity will affect the nuclear decision-maker and develop methodologies that could help deal with technological complexity. The Council of Strategic Risks is leading this work.

Strand 4: Mitigation Strategies and Arms Control

This strand of work analyzes risks to nuclear stability posed by individual technologies in interaction with other technologies, in particular offensive cyber capabilities, hypersonic weapon systems, space weapons, artificial intelligence, drones, and lethal autonomous weapon systems (LAWS). It will develop practical policy options that governments might pursue to mitigate related risks and offer possible “arms control” mechanisms for the regulation and use of emerging and disruptive technologies. The Arms Control Association is leading this work.

Nuclear and New Technology Publications

Commentary

Nuclear vs cyber deterrence: why the UK should invest more in its cyber capabilities and less in nuclear deterrence

The threats the UK faces today are more nuanced and diverse than in the Cold War era, ranging from state-sponsored cyber-attacks to sophisticated disinformation campaigns. ELN Policy Fellow Nikita Gryazin argues that these challenges require a shift in focus from traditional nuclear deterrence to modern defensive and offensive cyber capabilities.

23 September 2024 | Nikita Gryazin
Commentary

Ok, Doomer! The NEVER podcast – How to save the world

Listen to the final episode of the NEVER podcast – Ok, Doomer! This episode takes a step back to assess what we’ve learned about existential and global catastrophic risks in previous episodes, and what comes next. Featuring a discussion of how ordinary people can get involved in existential risk mitigation, what ongoing efforts will prove most successful in creating a framework to deal with these topics, and how the example of the Nuclear Non-Proliferation Treaty demonstrates that global cooperation to deal with the biggest threats we all face is possible, even in a tense geopolitical climate.

Commentary

Nuclear posture and cyber threats: Why deterrence by punishment is not credible – and what to do about it

The United Kingdom’s latest nuclear doctrine suggests that severe cyber-attacks on its national or critical infrastructure could provoke a nuclear response. Despite this, cyber-attacks against the UK have surged over the past decade. Eva-Nour Repussard, YGLN member and Policy Fellow at BASIC, writes that instead of deterrence by punishment, the UK should seek to increase its resilience to cyber-attacks and focus on a strategy of deterrence by denial regarding cyber threats.

19 September 2024 | Eva-Nour Repussard
Report

Workshop report: The OSCE and its role in strengthening European security architecture

In July 2024, the ELN convened a group of international experts to explore the historic, current, and future role of the Organisation for Security and Co-operation in Europe (OSCE) and its toolbox in maintaining and strengthening European security architecture. This workshop report details the participants’ discussions on the OSCE’s history, its operations in Ukraine since Russia’s invasion of Crimea in 2014, the role of the OSCE in conflict management and post-conflict normalisation, organisational successes and failures, and its unique status as a platform for dialogue between the West and Russia.

15 August 2024 | Maria Branea
Commentary

Ok, Doomer! The NEVER podcast – Fake brains and killer robots

Listen to the fifth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore artificial intelligence and its relationship with existential risk. Featuring an introduction to the topic, why we should be especially wary when integrating AI with nuclear weapons systems, the role of AI in arms control, how best to regulate AI on the global level and what international institutions are best placed to do so, as well as historical perspectives on technological change and its impact on our cultural understandings of existential risk.

Commentary

The EU’s Artificial Intelligence Act – a golden opportunity for global AI regulation

Today, the European Union’s Artificial Intelligence Act comes into force. NEVER member and Digital Policy Consultant Julie Lübken argues that the EU’s AI Act, pre-dating similar efforts to regulate AI in the UK and the US, could lay the groundwork for global AI governance. She writes that the legislation should also mark the start of a discussion with other countries, such as China, for a worldwide effort, engaging with policymakers, businesses, and civil society, to better understand AI, harness its potential and mitigate risk.

1 August 2024 | Julie Lübken

Project Partners

The project team would like to thank the following organisations and people for making this programme possible:

Interested in the project?

For more information on the Nuclear and New Technologies project, please contact Project Lead Katarzyna Kubiak