Commentary

Ok, Doomer! The NEVER podcast – Fake brains and killer robots

Listen to the fifth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore artificial intelligence and its relationship with existential risk. The episode features an introduction to the topic; why we should be especially wary of integrating AI into nuclear weapons systems; the role of AI in arms control; how best to regulate AI at the global level and which international institutions are best placed to do so; and historical perspectives on technological change and its impact on our cultural understandings of existential risk.

Commentary

The EU’s Artificial Intelligence Act – a golden opportunity for global AI regulation

Today, the European Union’s Artificial Intelligence Act comes into force. NEVER member and Digital Policy Consultant Julie Lübken argues that the EU’s AI Act, pre-dating similar efforts to regulate AI in the UK and the US, could lay the groundwork for global AI governance. She writes that the legislation should also mark the start of a discussion with other countries, such as China, for a worldwide effort, engaging with policymakers, businesses, and civil society, to better understand AI, harness its potential and mitigate risk.

1 August 2024 | Julie Lübken
Commentary

Ok, Doomer! The NEVER podcast – Biological threats: Going viral

Listen to the fourth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore biological existential and global catastrophic risks. The episode features an introduction to the topic, an exploration of some of the technological mitigation techniques available for biological risks, and a panel discussion on the current state of the international governance of biological risks. We also delve into whether humanity’s recent brush with COVID-19 has left us better or worse prepared for future pandemics, be they man-made or natural.

Commentary

Unstable systems: Why geoengineering will solve neither climate change nor climate geopolitics

As more attention is paid to geoengineering technologies that claim to mitigate the existential risks posed by climate change, Jakob Gomolka, from our New European Voices on Existential Risk (NEVER) network, argues that policymakers need to understand the geopolitical implications of these technologies, as well as their climatic side-effects, and calls for greater alignment in the international governance of geoengineering technologies.

6 June 2024 | Jakob Gomolka
Commentary

The potential terrorist use of large language models for chemical and biological terrorism

In our latest New European Voices on Existential Risk (NEVER) commentary, Nicolò Miotto explores the potential existential risks stemming from the terrorist use of large language models (LLMs) and AI to manufacture chemical, biological, radiological and nuclear (CBRN) weapons. In the commentary, he examines how LLMs and AI have so far enabled terrorist groups to enhance their capabilities, and what governments, the private sector and NGOs need to do to mitigate future risks.

5 April 2024 | Nicolò Miotto
Commentary

Ok, Doomer! The NEVER Podcast – Climate change: A hot topic

Listen to the third episode of the NEVER podcast – Ok, Doomer! In this episode, we explore climate change, the existential risk the general public is most familiar with. The episode features an exploration of how the climate crisis affects politics at the local, national and international levels; climate change as a “polycrisis”; and how the world has previously managed to unite around global environmental policies, such as those that closed the hole in the ozone layer.