Nicolò Miotto

Associate Expert at UNOOSA

Nicolò Miotto currently works at the United Nations Office for Outer Space Affairs (UNOOSA) and is also pursuing a PhD at the University of Vienna. Prior to joining UNOOSA, he worked at the Organisation for Security and Co-operation in Europe (OSCE) Conflict Prevention Centre (CPC) and at Macquarie University in Sydney, Australia.

Nicolò is the winner of the 2021 OSCE–IFSH Essay Competition on “Conventional Arms Control and Confidence- and Security-Building Measures.” His essay explores the potential application of blockchain technology to improve conventional arms control. He is also one of the winners of the fourth edition of the IAI (Istituto Affari Internazionali) Prize “Young Talents for Italy, Europe and the World.”

He holds a Bachelor’s degree in International and Diplomatic Sciences from the University of Trieste, Italy. He received a Master’s degree in Intelligence, Security and Strategic Studies (IMSISS) from the University of Glasgow (United Kingdom), Dublin City University (Ireland), and Charles University (Czech Republic).

Nicolò’s research interests include space security, arms control, disarmament and non-proliferation, and terrorism and violent extremism. His research has been published in the Journal for Deradicalisation, the Journal for Peace and Nuclear Disarmament, and Critical Sociology.

Content by Nicolò Miotto

Report

How to save the world: Influencing policy on the biggest risks to humanity

A new report published by the European Leadership Network's New European Voices on Existential Risk (NEVER) project calls for a systemic international approach to addressing man-made existential risk. The risks from nuclear weapons, climate change, biological threats, and AI are interconnected, and cross-cutting lessons should be drawn.

Commentary

Ok, Doomer! The NEVER podcast – Fake brains and killer robots

Listen to the fifth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore artificial intelligence and its relationship with existential risk. The episode features an introduction to the topic; why we should be especially wary of integrating AI with nuclear weapons systems; the role of AI in arms control; how best to regulate AI at the global level and which international institutions are best placed to do so; and historical perspectives on technological change and its impact on our cultural understandings of existential risk.