Battlefield Trust for Human-Machine Teaming: Evidence from the US Military

Event description

Part of the AI, Automated Systems, and the Future of War Seminar Series

Experts agree that future warfare will be characterized by countries’ use of military technologies enhanced with Artificial Intelligence (AI). These AI-enhanced capabilities are thought to help countries maintain lethal overmatch over adversaries, especially when used in concert with humans. Yet it is unclear what shapes servicemembers’ trust in human-machine teaming, in which they partner with AI-enhanced military technologies to optimize battlefield performance. In October 2023, Dr Lushenko administered a conjoint survey at the US Army and Naval War Colleges to assess how varying features of AI-enhanced military technologies shape servicemembers’ trust in human-machine teaming. He finds that trust in AI-enhanced military technologies is shaped by a tightly calibrated set of considerations: technical specifications, namely non-lethal purpose, heightened precision, and human oversight; perceived effectiveness in terms of civilian protection, force protection, and mission accomplishment; and international oversight. These results provide the first experimental evidence of military attitudes toward manned-unmanned teaming, with implications for research, policy, and modernization.


About the speaker
Lieutenant Colonel Paul Lushenko, PhD, is an Assistant Professor and Director of Special Operations at the US Army War College. In addition, he is a Council on Foreign Relations Term Member, Senior Fellow at Cornell University's Tech Policy Institute, Non-Resident Expert at RegulatingAI, and Adjunct Research Lecturer at Charles Sturt University. He is the co-editor of Drones and Global Order: Implications of Remote Warfare for International Society (2022), the first book to systematically study the implications of drone warfare for global politics. He is also the co-author of The Legitimacy of Drone Warfare: Evaluating Public Perceptions (2024), which examines public perceptions of the legitimacy of drones and how these perceptions affect countries’ policies on drone warfare and its global governance.

About the chair
Emily Hitchman is the Research Officer on the Anticipating the Future of War: AI, Automated Systems, and Resort-to-Force Decision Making project. Emily is a PhD scholar at the Strategic and Defence Studies Centre, focussing on the history of the Glomar (‘neither confirm nor deny’) response in the national security context. She is also a 2023 Sir Roland Wilson Scholar and has appeared on the National Security Podcast to discuss her research, as well as on a panel at the 2022 Australian Crisis Simulation Summit on the future of intelligence. Emily has worked professionally across the national security and criminal justice public policy space, including in law enforcement and cyber policy, and holds a Bachelor of Philosophy from The Australian National University.



This seminar series is part of a two-year (2023-2025) research project on Anticipating the Future of War: AI, Automated Systems, and Resort-to-Force Decision Making, generously funded by the Australian Department of Defence and led by Professor Toni Erskine from the Coral Bell School of Asia Pacific Affairs.

If you require accessibility accommodations or a visitor Personal Emergency Evacuation Plan please contact bell.marketing@anu.edu.au.







