
Nicolò Miotto

Project Assistant at the Organisation for Security and Co-operation in Europe (OSCE)

Nicolò Miotto currently works at the Organisation for Security and Co-operation in Europe (OSCE) Conflict Prevention Centre (CPC) and is also pursuing a PhD at the University of Vienna. Prior to joining the OSCE, he worked as a visiting scholar at Macquarie University in Sydney (Australia), researching terrorism and violent extremism. He also conducted research on cybersecurity and terrorism at the Centre for Security Studies (CSS) in Sarajevo (Bosnia and Herzegovina).

Nicolò is the winner of the 2021 OSCE-IFSH Essay Competition on “Conventional Arms Control and Confidence- and Security-Building Measures.” His essay explores the potential application of blockchain technology to improve conventional arms control. He is also one of the winners of the fourth edition of the IAI (Istituto Affari Internazionali) Prize “Young Talents for Italy, Europe and the World.”

He holds a Bachelor’s degree in International and Diplomatic Sciences from the University of Trieste (Italy). He received a Master’s degree in Intelligence, Security and Strategic Studies (IMSISS) from the University of Glasgow (United Kingdom), Dublin City University (Ireland) and Charles University (Czech Republic).

Nicolò’s research interests include arms control, disarmament and non-proliferation, and terrorism and violent extremism. His research has been published in the Journal for Deradicalisation and the Journal for Peace and Nuclear Disarmament, and by the Geneva Centre for Security Policy (GCSP).

Content by Nicolò Miotto

Report

How to save the world: Influencing policy on the biggest risks to humanity

A new report from the European Leadership Network’s New European Voices on Existential Risk (NEVER) project calls for a systemic international approach to addressing man-made existential risk. The risks from nuclear weapons, climate change, biological threats, and AI are interconnected, and cross-cutting lessons should be drawn across them.

Commentary

Ok, Doomer! The NEVER podcast – Fake brains and killer robots

Listen to the fifth episode of the NEVER podcast – Ok, Doomer! In this episode, we explore artificial intelligence and its relationship with existential risk: an introduction to the topic, why we should be especially wary of integrating AI with nuclear weapons systems, the role of AI in arms control, how best to regulate AI at the global level and which international institutions are best placed to do so, as well as historical perspectives on technological change and its impact on our cultural understandings of existential risk.