Published Date: 31/10/2025
On October 16, the inaugural United Nations Youth4Disarmament Forum, supported by the Government of the Republic of Korea, convened young leaders, nuclear experts, and diplomats for an in-depth expert panel on the “Destabilizing Effects of Artificial Intelligence and Information and Communications Technology on Nuclear Stability.”
The Expert Panel provided a unique platform for youth participants to engage interactively with experts on the complex intersection of nuclear weapons and emerging technologies. The session aimed to refine the participants' draft outcome document, which will be circulated to UN General Assembly First Committee delegations in December 2025.
Moderator Ms. Umi Ariga-Maisy, a Research Fellow at the Japan Institute of International Affairs, set the stage by framing the discussion around the urgent need to understand the risks posed by the integration of AI and cyber technologies into nuclear command, control, and communications (NC3) systems. The panel of experts and forum participants then delved into the challenges posed by, and the opportunities for addressing, the destabilizing effects of AI and ICTs on nuclear stability.
Ms. Dina Tawfik, a Fellow at the Center for Energy and Security Studies, highlighted recent studies showing that AI systems tend to choose more escalatory options. She also pointed out other risks, such as algorithmic opacity, automation bias, and misperceptions surrounding nuclear weapons.
Mr. Giacomo Persi Paoli, Head of the Security and Technology Programme at the United Nations Institute for Disarmament Research (UNIDIR), emphasized that developments in AI are often underestimated, predicting that the next generation of AI, expected around 2027, could far outperform current capabilities. Even where direct AI integration into NC3 systems can be avoided, he noted, its incorporation into the wider nuclear decision-making process could introduce vulnerabilities from both security and cybersecurity perspectives.
Ms. Beyza Unal, Head of the Science and Technology Unit at the United Nations Office for Disarmament Affairs, outlined the numerous vulnerabilities at the intersection of AI, cyber, and nuclear systems. She pointed out that many NC3 and production infrastructures still rely on outdated technologies and fragile supply chains, leaving them susceptible to cybersecurity threats. She also stressed the security challenges arising from the mismatch between the rapid advancement of cyber technology and the slow pace at which new technologies are integrated into existing nuclear weapons systems.
Ms. Patricia Jaworek, Director of the Global Nuclear Policy Program at the Nuclear Threat Initiative (NTI), explained how AI-related risks have started to feature in Nuclear Non-Proliferation Treaty (NPT) discussions and related initiatives. She highlighted the Stockholm Initiative’s contributions at the third Preparatory Committee and recent multilateral efforts, such as the 2024 Biden–Xi Joint Statement. She emphasized that the AI-nuclear nexus presents not only technological and structural challenges but also political and institutional ones.
Ms. Erika Campos, Second Secretary at the Permanent Mission of Brazil to the United Nations, discussed two resolutions before this year’s First Committee that address aspects of AI and emerging technologies. She encouraged engagement through mechanisms such as Nuclear-Weapon-Free Zones and the Group of Governmental Experts on Nuclear Disarmament Verification to address issues arising from the AI-nuclear nexus.
Forum Participant Mr. Leonard Günzel, an Alumnus of the Responsible Innovation in AI for Peace and Security program at UNODA, underscored the lack of education in responsible and ethical engineering and called for greater engagement between the policy and engineering communities on societal issues.
During the interactive Q&A session, participants explored verification challenges, governance frameworks, and ways to bridge the gap between the policy and technical communities. Experts called for shared taxonomies, industry incentives for responsible AI, and the expansion of fellowship and training opportunities to foster interdisciplinary understanding. They suggested that discussions on AI be formally integrated into future NPT deliberations on transparency and accountability, stressing that lasting solutions will depend on the innovation and engagement of young leaders.
In closing, participants and experts underscored the importance of continued dialogue among youth, policymakers, and the scientific community to ensure that AI does not undermine nuclear stability. They emphasized that bridging the gap between rapidly emerging technologies and the bodies that govern them will be crucial to preventing miscalculation and maintaining global security.
Q: What was the main focus of the United Nations Youth4Disarmament Forum?
A: The main focus was the destabilizing effects of Artificial Intelligence and Information and Communications Technology on nuclear stability, with particular emphasis on the risks and challenges posed by integrating these technologies into nuclear command, control, and communications (NC3) systems.
Q: Who moderated the expert panel?
A: The panel was moderated by Ms. Umi Ariga-Maisy, a Research Fellow at the Japan Institute of International Affairs.
Q: What did Ms. Dina Tawfik highlight in her discussion?
A: Ms. Dina Tawfik, a Fellow at the Center for Energy and Security Studies, highlighted that recent studies show AI systems tend to choose more escalatory options and pointed out risks related to algorithmic opacity, automation bias, and misperceptions surrounding nuclear weapons.
Q: What are the key vulnerabilities at the intersection of AI, cyber, and nuclear systems?
A: Key vulnerabilities include the outdated technologies and fragile supply chains on which many NC3 and production infrastructures still rely, leaving them susceptible to cybersecurity threats. The mismatch between the rapid advancement of cyber technology and the slow integration of new technologies into existing nuclear weapons systems also poses significant challenges.
Q: What role do young leaders play in addressing these challenges?
A: Young leaders play a crucial role in bridging the gap between rapidly emerging technologies and the bodies that govern them. They are encouraged to build interdisciplinary understanding and contribute to discussions on AI and nuclear stability, helping ensure that solutions are innovative and effective.