Event

‘Technological Complexity and Risk Reduction: A guardrails and checklist framework for EDTs in nuclear weapons decision-making’

On 10-11 April 2024, the ELN convened a diverse group of experts for a workshop at the German Federal Foreign Office to consider the core ingredients of a guardrails and checklist framework to help policymakers anticipate and address challenges arising from Emerging and Disruptive Technologies (EDTs) and their aggregate effects on nuclear command, control, and communications (NC3) and nuclear weapons decision-making.

10 April 2024
Commentary

The potential terrorist use of large language models for chemical and biological terrorism

In our latest New European Voices on Existential Risk (NEVER) commentary, Nicolò Miotto explores the potential existential risks stemming from the terrorist use of large language models (LLMs) and AI to manufacture chemical, biological, radiological and nuclear (CBRN) weapons. He examines how LLMs and AI have enabled terrorist groups to enhance their capabilities so far, and what governments, the private sector, and NGOs need to do to mitigate future risks.

5 April 2024 | Nicolò Miotto
Commentary

Ok, Doomer! The NEVER Podcast – Climate change: A hot topic

Listen to the third episode of the NEVER podcast – Ok, Doomer! In this episode, we explore climate change, the existential risk with which the general public is most familiar. The episode examines how the climate crisis affects politics at the local, national and international levels, climate change as a “polycrisis”, and how the world has previously managed to unite around environmental policies, such as those that closed the hole in the ozone layer.

Commentary

Navigating cyber vulnerabilities in AI-enabled military systems

As countries continue incorporating AI into conventional military systems, they should prepare for the risk that adversaries are likely already working to exploit weaknesses in AI models by targeting the datasets at their core. To address this, Alice Saltini writes that states should develop metrics to assess how cyber vulnerabilities could affect AI integration.

19 March 2024 | Alice Saltini
Commentary

Sounding the alarm on AI-enhanced bioweapons

In our latest commentary from our New European Voices on Existential Risk (NEVER) network, Rebecca Donaldson explores how to harness new technologies in AI and the life sciences for security whilst minimising their potential for harm. She proposes increased funding for the Biological Weapons Convention, the creation of an Emerging Technology Utilisation and Response Unit (ETURU), and the fostering of a culture of AI assurance and responsible democratisation of biotechnologies.

26 February 2024 | Rebecca Donaldson