Commentary | 2 August 2024

Ok, Doomer! The NEVER podcast – Fake brains and killer robots


Konrad Seifert | Co-CEO of the Simon Institute for Longterm Governance


Nicolò Miotto | Project Assistant at the Organisation for Security and Co-operation in Europe (OSCE)



Welcome to “Fake Brains & Killer Robots”, the fifth episode of “Ok, Doomer!”, the podcast series by the European Leadership Network’s (ELN) New European Voices on Existential Risk (NEVER) network. Hosted by the ELN’s Policy and Impact Director, Jane Kinninmont, and the ELN’s Project and Communications Coordinator, Edan Simpson, this episode focuses on the potential existential risks associated with artificial intelligence.

Jane kicks off the episode with “What’s the Problem?” We hear from Alice Saltini, a Policy Fellow at the European Leadership Network whose work focuses on the interactions between AI and nuclear command and control systems.

Alice discusses the immediate threats of AI, such as hallucinations and cyber vulnerabilities in nuclear command and control systems, emphasising the need for caution, regulation and international cooperation to mitigate the risks associated with AI and nuclear weapons.

Edan’s “How To Fix It” panel features Dr Ganna Pogrebna, Executive Director of the Artificial Intelligence and Cyber Futures Institute at Charles Sturt University in Australia. Ganna is also the Organiser of the Behavioural Data Science strand at the Alan Turing Institute, the United Kingdom’s national centre of excellence for AI and Data Science in London, where she serves as a fellow.

She’s joined by NEVER member Konrad Seifert. Konrad is Co-CEO of the Simon Institute for Longterm Governance, which works to improve the international regime complex for governing rapid technological change and to represent future generations in institutional design and policy processes. Previously, he co-founded Effective Altruism Switzerland.

Our third and final guest is NEVER member Nicolò Miotto, who currently works at the Organisation for Security and Co-operation in Europe (OSCE) Conflict Prevention Centre. Nicolò’s research foci include arms control, disarmament and non-proliferation, emerging disruptive technologies, and terrorism and violent extremism.

The panel discusses how best to govern, regulate, and limit the risks of AI, and what that means in practice; the role of multilateral institutions such as the UN in implementing these efforts; the opportunities and setbacks new forms of AI could bring for arms control, especially regarding WMD proliferation; and the extent to which AI developers are aware of the possible misuses of new technologies and how best to safeguard against them.

Moving on to “Turn Back the Clock,” we look back to a time in history when humanity faced a potential existential threat but pulled back from the brink of destruction. In this episode, Jane is joined by Dr Jochen Hung, Associate Professor of Cultural History at Utrecht University in the Netherlands. They discuss historical perspectives on technological change and its impact on society, drawing parallels between the anxieties and hopes that people in the 1920s attached to modern technologies and those of the present day.

Finally, as always, the episode is wrapped up in “The Debrief,” where Jane and Edan review the episode to make sense of everything they’ve covered.

Catch up on previous episodes, and make sure to subscribe to future episodes of “Ok Doomer!”

Ok, Doomer! – Podcast page 

The opinions articulated above represent the views of the author(s) and do not necessarily reflect the position of the European Leadership Network or any of its members. The ELN’s aim is to encourage debates that will help develop Europe’s capacity to address the pressing foreign, defence, and security policy challenges of our time.