Commentary

Ok, Doomer! The NEVER podcast – How to save the world

Listen to the final episode of the NEVER podcast – Ok, Doomer! This episode takes a step back to assess what we’ve learned about existential and global catastrophic risks in previous episodes, and what comes next. Featuring a discussion of how ordinary people can get involved in existential risk mitigation, which ongoing efforts will prove most successful in creating a framework to deal with these risks, and how the example of the Nuclear Non-Proliferation Treaty demonstrates that global cooperation to deal with the biggest threats we all face is possible, even in a tense geopolitical climate.

Commentary

The fast and the deadly: When Artificial Intelligence meets Weapons of Mass Destruction

Ahead of the German Federal Foreign Office’s Artificial Intelligence and Weapons of Mass Destruction Conference 2024, the ELN’s Policy and Research Director, Oliver Meier, argues that governments should build guardrails around the integration of AI in the WMD sphere, and slow down the incorporation of AI into research, development, production, and planning for nuclear, biological, and chemical weapons.

27 June 2024 | Oliver Meier
Commentary

Ok, Doomer! The NEVER podcast – Biological threats: Going viral

Listen to the fourth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore biological existential and global catastrophic risks. Featuring an introduction to the topic, an exploration of some of the technological mitigation techniques available for biological risks, a panel discussion on the current state of the international governance of biological risks, and a look at whether humanity’s recent brush with COVID-19 has left us better or worse prepared for future pandemics, be they man-made or natural.

Commentary

The potential terrorist use of large language models for chemical and biological terrorism

In our latest New European Voices on Existential Risk (NEVER) commentary, Nicolò Miotto explores the potential existential risks stemming from the terrorist use of large language models (LLMs) and AI to manufacture chemical, biological, radiological and nuclear (CBRN) weapons. In the commentary he explores how LLMs and AI have enabled terrorist groups to enhance their capabilities so far, and what governments, the private sector, and NGOs need to do to mitigate future risks.

5 April 2024 | Nicolò Miotto
Commentary

Sounding the alarm on AI-enhanced bioweapons

In our latest commentary produced from our New European Voices on Existential Risk (NEVER) network, Rebecca Donaldson explores how to harness new technologies in AI and the life sciences for security whilst minimising their potential for harm. She proposes increased funding for the Biological Weapons Convention, the creation of an Emerging Technology Utilisation and Response Unit (ETURU), and the fostering of a culture of AI assurance and responsible democratisation of biotechnologies.

26 February 2024 | Rebecca Donaldson
Commentary

3D printing and WMD terrorism: a threat in the making?

In our latest commentary from the ELN’s New European Voices on Existential Risk (NEVER) network, Nicolò Miotto examines developments in 3D printing technology. He considers how advances in its efficacy and accessibility, together with its relationship to other emerging and disruptive technologies, have changed the threat landscape regarding terrorists potentially obtaining WMDs, and what governments and the private sector need to do to tackle these emerging threats.

10 January 2024 | Nicolò Miotto