Commentary

Deterrence without destruction: Rethinking responses to biological threats

Scientific advances have renewed discussion of the possibility of devastating biological attacks. Eva Siegmann writes that nuclear deterrence is inadequate to deter biological threats. Instead, the threat of biological weapons should be addressed through international efforts rooted in transparency and cooperation. Leveraging the mechanisms of the Biological Weapons Convention and implementing deterrence-by-denial strategies can effectively mitigate risks.

28 November 2024 | Eva Siegmann
Podcast

Ok, Doomer! The NEVER podcast – Nukes and new tech

In this special bonus episode of the NEVER podcast – Ok, Doomer!, we take a deep dive into the ELN’s Nuclear and New Technologies project, which aims to identify the impacts of emerging and disruptive technologies (EDTs) on nuclear decision-making and present practical steps to mitigate their potentially disruptive effects. Featuring discussions on why it’s best to examine the aggregate impact of EDTs as well as each individually, an introduction to the ELN’s Guardrails and Self-Assessment (GSA) Framework for EDTs, and the history of nuclear fail-safe reviews.

Commentary

Ok, Doomer! The NEVER podcast – How to save the world

Listen to the final episode of the NEVER podcast – Ok, Doomer! This episode takes a step back to assess what we’ve learned about existential and global catastrophic risks in previous episodes, and what comes next. Featuring a discussion of how ordinary people can get involved in existential risk mitigation, which ongoing efforts will prove most successful in creating a framework to address these risks, and how the example of the Nuclear Non-Proliferation Treaty demonstrates that global cooperation to deal with the biggest threats we all face is possible, even in a tense geopolitical climate.

Commentary

Ok, Doomer! The NEVER podcast – Fake brains and killer robots

Listen to the fifth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore artificial intelligence and its relationship with existential risk. Featuring an introduction to the topic, why we should be especially wary when integrating AI with nuclear weapons systems, the role of AI in arms control, how best to regulate AI on the global level and what international institutions are best placed to do so, as well as historical perspectives on technological change and its impact on our cultural understandings of existential risk.

Commentary

The EU’s Artificial Intelligence Act: A golden opportunity for global AI regulation

Today, the European Union’s Artificial Intelligence Act comes into force. NEVER member and Digital Policy Consultant Julie Lübken argues that the EU’s AI Act, pre-dating similar efforts to regulate AI in the UK and the US, could lay the groundwork for global AI governance. She writes that the legislation should also mark the start of a discussion with other countries, such as China, toward a worldwide effort, engaging policymakers, businesses, and civil society, to better understand AI, harness its potential, and mitigate risk.

1 August 2024 | Julie Lübken
Commentary

Ok, Doomer! The NEVER podcast – Biological threats: Going viral

Listen to the fourth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore biological existential and global catastrophic risks. Featuring an introduction to the topic, an exploration of some of the technological mitigation techniques available for biological risks, a panel discussion on the current state of the international governance of biological risks, and a look at whether humanity’s recent brush with COVID-19 has left us better or worse prepared for future pandemics, be they man-made or natural.