EXPERT PERSPECTIVE: The presence of drones/UAVs in armed conflicts as varied as Syria, Nagorno-Karabakh, and Ukraine is not only influencing military tactics and strategy but also raising questions about proliferation, arms control, and the legal and ethical limits on these weapons.
In the first part of our series, Expanding Horizons for Drone Technology, we looked at how the drone has become the iconic weapon in the War in Ukraine. In today's look at drone/UAV technology, we turn to questions of policy.
Drones are relatively easy to acquire and modify, difficult to detect, and many can be operated without specialized training. Consequently, the unique capabilities of unmanned and autonomous devices – whether used on the battlefield or in civil applications – pose thought-provoking challenges for governments, manufacturers, and consumers.
National and international mechanisms exist to manage and capitalize on the promise of drone/UAV technology. Commercial and personal applications also introduce unique concerns, especially with regard to privacy rights and humanitarian considerations.
The Cipher Brief tapped three experts with different backgrounds and perspectives to bring these challenges into focus.
Sarah Kreps, Director of the Cornell Tech Policy Lab, Cornell University
Sarah Kreps is the John L. Wetherill Professor in the Department of Government, Adjunct Professor of Law, and the Director of the Cornell Tech Policy Lab at Cornell University. Her teaching and research focus is on the intersection of international politics, technology, and national security.
Zachary Kallenborn, Policy Fellow at the Schar School of Policy and Government, George Mason University
Zachary Kallenborn is a Policy Fellow at the Schar School of Policy and Government, a Research Affiliate with the Unconventional Weapons and Technology Division of the National Consortium for the Study of Terrorism and Responses to Terrorism (START), an officially proclaimed U.S. Army "Mad Scientist," and a Senior Consultant at ABS Group.
Col. Christopher Reid, Military Fellow, International Security Program, Center for Strategic and International Studies (CSIS)
Colonel Christopher K. Reid is an active-duty Air Force officer with more than 20 years of experience as a command and control operator, military operational planner, and air battle manager. Col. Reid has worked extensively with coalition air forces, providing air and missile defense expertise to the Royal Jordanian Air Force and more recently as the deputy chief of the Combat Plans Division for U.S. Air Forces Central Command during the height of hostilities against the Islamic State.
The Cipher Brief: How do you assess the scope and effectiveness of current international and bilateral protocols and standards governing drones/UAVs?
Kreps: The current protocol consists of the Missile Technology Control Regime (MTCR), which was established during the Cold War to regulate the transfer of potential nuclear-delivery vehicles. Drones were placed in the same category as cruise missiles, as possible nuclear-delivery vehicles that should not proliferate widely. The MTCR stipulated that countries should control the transfer of any drones with a payload of at least 500 kg and a range of at least 300 km. This would include the Predator and Reaper that the United States has used widely in conflict, and for the most part the U.S. was restrictive with its sales of these medium-altitude, long-endurance drones and turned down requests from countries like the UAE and Saudi Arabia.
The MTCR is not a treaty, however, and is not binding. Plus, it doesn't include major drone manufacturers like China and Israel. And Turkey, which has become a major drone exporter, does not appear to be acting as though it is constrained by the regime.
I would say that the MTCR initially slowed the spread of the most lethal drones, but this technology is now proliferating widely, and it's not clear that any of the major drone-producing and -exporting countries have an incentive to develop new protocols or regimes, because doing so would come at a cost to their home industries.
Kallenborn: The main concern [of the MTCR] is drones capable of carrying nuclear weapons; large, weaponized drones aren't much of a worry so long as they don't meet those high thresholds. The U.N. Register of Conventional Arms also asks for transparency over UAV arms transfers, but it does not cover drones operating in other domains, like the unmanned surface and ground vehicles that are increasingly being built and fielded.
I don’t know much about bilateral protocols, but I can comfortably say it’s a matter of open debate. The Trump Administration loosened restrictions on exporting drones generally. Turkey’s trade of the Bayraktar TB-2 shows clearly that armed drones can be quite a useful tool of foreign policy for strengthening allies and building relationships.
Two big issues that need to be addressed are autonomy and swarming. Activists have been calling for bans on autonomous weapons generally. Drones aren’t necessarily autonomous, but they are obvious vehicles for advancing autonomy in weapons generally. I believe a comprehensive ban on autonomous weapons is a mistake; I prefer a risk-based approach. Drone swarming, in which inter-drone communication integrates drones into a single, connected weapon, also needs to be considered seriously. The combination of error risk and mass harm creates risks akin to those of traditional weapons of mass destruction.
Reid: First, I believe technological advancement in sensors and excitement over current and future uses of artificial intelligence and automated processing will start to shift this conversation toward responsible and moral applications rather than a focus on delivery platforms. The Law of War will continue to be guided by historical lessons from the Geneva Conventions of 1949 and the additional protocols ratified over the years, as well as the DoD Law of War Program, codified in DoD Directive 2311.01. All flight activity, to include UAV use in peacetime, will continue to be limited, or graded, by adherence to internationally recognized borders. The Department of Defense and U.S. Intelligence Community face challenges in leading the world with responsible policies that determine appropriate use of these technologies in the “gray zone” and in the competition phase, against countries whose policies and objectives are aimed at defeating and undermining the United States, its allies and partners, and the international world order.
This conundrum is not new; it has long played out in domains with largely opaque norms, such as space and cyberspace. The war that Russia initiated in sovereign Ukraine has resurfaced criticism of U.S. export policy for UAVs. Lt. Gen. (ret.) David Deptula recently argued in Forbes that the Missile Technology Control Regime (MTCR) is outdated, as it ties UAVs to systems capable of nuclear strike. In reality, U.S. agencies likely feared unregulated uses of UAVs by other nations once the systems proliferated – for a myriad of potential applications other than nuclear delivery – since current UAVs are not capable of delivering such payloads, as Deptula astutely clarifies. But while U.S. policy has been paralyzed by this fear, the reality is that UAVs are easy and cheap to build, and other nations like China are capitalizing on the opportunity. Ukraine continues to send delegations to Washington arguing for a re-examination of export policy.
This is a perfect example of where the debate centers, or should center – application of sensors, use of automated processing, and ethical standards for artificial intelligence (AI) in all of its forms. DoD Directive 3000.09 establishes Department policy on autonomy in weapons systems. It is broad and suitable enough not to require a rewrite for U.S. military application. Deputy Defense Secretary Kathleen Hicks released a memo on responsible use of AI in May 2021, complementing the DoD AI ethics principles, and this is a leading step toward defining international norms for the application of these technologies.
The debate over governing protocols, ethical limits, and export policies for UAVs really centers on what is still a largely undefined space internationally – the responsible use of sensor technology – rather than on the delivery platform itself. Policymakers will need to decide how specific they wish to be, balanced against the freedom-of-action luxuries that suit national interests and a realization that UAV proliferation is occurring today – a reality that will determine whether the U.S. leads in defining responsible use of sensors (with our allies and partners) or lags peer competitors due to policy paralysis born of extremely tight restrictions and limited risk acceptance.