Legal Limbo Leaves Killer Robots Off-Leash

Photo: iStock.com/Stefanocar75

The UN’s Convention on Conventional Weapons (CCW) Group of Governmental Experts (GGE) met last week to discuss lethal autonomous weapons systems. But while most member states called for a legally binding process to ensure that some form of meaningful human control is maintained over these prospective weapons systems, a sense of distrust among states could fuel an artificial intelligence and robotics arms race.

  • A total of 86 countries participated in the meetings, alongside the UN Institute for Disarmament Research (UNIDIR), the International Committee of the Red Cross (ICRC) and the Campaign to Stop Killer Robots, a coalition of over 60 civil society groups across 26 countries coordinated by Human Rights Watch. The Campaign’s goal is broadly to prohibit the development, production and use of lethal autonomous weapons systems in order to retain meaningful human control over life-and-death decisions in battle, policing and other situations.
  • The Pentagon, in a 2012 directive, laid out the U.S. policy that such weapons systems should allow “appropriate levels of human judgment” over the use of lethal force. But what constitutes “appropriate” remains unclear, and national policy could change depending on the evolving capabilities of other states, particularly China and Russia.
  • One of the major contentions in last week’s GGE discussions was the definition of what a fully autonomous weapons system actually is. The Netherlands proposed this definition: “a weapon that, without human intervention, selects and engages targets matching certain predefined criteria, following a human decision to deploy the weapon on the understanding that an attack, once launched, cannot be stopped by human intervention.” No formal agreements on a definition emerged from the GGE discussions.
  • States will determine steps forward at the annual CCW meeting on Friday, November 24, but a few prominent states, including the U.S. and Russia, believe it is too soon to begin negotiating new arms control measures for weapons that do not yet exist.

Doug Wise, former Deputy Director, Defense Intelligence Agency

“The United States, as part of the community of nations, ought to support the UN effort. What we cannot do is allow our participation in that as it evolves to govern our own ability to defend our citizens. I recognize that at some point we have to moderate our behavior, but self-preservation has to trump diplomacy – it has to be that way.”

Notably, autonomous systems are not the same as automatic systems, though automatic systems are the natural precursor to futuristic lethal autonomous weapons. While the distinction appears conceptually clear, agreeing on where the line between autonomous and automatic weapons systems lies is a main focus of discussions at the United Nations.

Paul Scharre, former Special Assistant to the Under Secretary of Defense for Policy

“It’s generally hopeless trying to clearly distinguish between automatic, automated and autonomous systems. We use those words to refer to different points along a spectrum of complexity and sophistication of systems. They mean slightly different things, but there aren’t clear dividing lines between them. One person’s “automated” system is another person’s “autonomous” system. I think it is more fruitful to think about which functions are automated/autonomous.”

  • Systems such as the Phalanx Close-In Weapon System, Israel’s Iron Dome missile defense system and South Korea’s Samsung SGR-A1 border sentries all represent automatic weapons systems – primarily static defensive capabilities preprogrammed to target inanimate objects, such as incoming munitions, in situations where reaction times would overwhelm human operators.
  • Autonomous systems, while not yet developed, would be mobile weapons platforms equipped with onboard sensors and computing systems capable of decision-making through algorithms to seek, identify, track and attack a variety of targets by adapting to their surroundings after being activated. The United Kingdom’s Taranis, France’s nEUROn, and the American X-47B are all examples of a push toward autonomy in unmanned aerial combat vehicles.

Doug Wise, former Deputy Director, Defense Intelligence Agency

“There are human beings that actually fly the MQ-9 drone – people are actually observing and make the decisions to either continue to observe or use whatever is the lethality that is inherent in the platform. There are human beings at every stage. Now let’s assume that at some point the human beings release the platform to act on its own recognizance, which is based on the basic information on the payload that it carries and the information that it continues to be updated with. Then it is allowed to behave in a timescale to take data, process it, and make decisions and act on those decisions. As the platforms become more sophisticated, our ability to let it go will become earlier and earlier.”

The prospective development of autonomous weapons is largely driven by states that believe they need these weapons capabilities for self-preservation out of fear that adversarial states also will develop them – a spiraling cycle that breeds a global arms race. “Artificial intelligence is the future, not only for Russia, but for all humankind,” Russian President Vladimir Putin said at the beginning of September when speaking to students. “It comes with colossal opportunities, but also threats that are difficult to predict. Whoever becomes the leader in this sphere will become the ruler of the world.”

  • Weapons autonomy removes the need for a communication link to remotely controlled weapons systems such as unmanned aerial systems (UAS) – a link that can introduce command delays and is vulnerable to electromagnetic disruption, such as spoofing or jamming, as well as to capture that could reveal the drone’s location and sensor feeds. Operating in contested airspace, as has become apparent over eastern Ukraine and is a central component of China’s anti-access/area denial (A2/AD) strategy in the South China Sea, will require some level of autonomy if unmanned aircraft are to remain a viable operational tool.
  • Autonomous weapons close the time gap between action and reaction by enabling decision-making at speeds likely incomprehensible to humans – giving states with autonomous weapons capabilities a clear tactical advantage on the battlefield.
  • Unlike humans, autonomous systems conceptually know no fear, stress or fatigue, nor are they prone to emotional overreaction or self-preservation instincts. Such qualities, in the right hands, might render warfare more humane by preventing militaries from committing atrocities of war. Greater restraint, coupled with enhanced discrimination between civilians and lawful combatants, could result in uses of force more aligned with international humanitarian law.

Robert Bunker, Adjunct Research Professor, Strategic Studies Institute, U.S. Army War College

“More advanced expert and AI based lethal autonomous systems will ultimately be both platform specific as well as network/cloud residing. I can readily see a capital warship at some point in the future with its own AI battle management system being used for automated shipboard defenses. It is going to have to go in this direction as human decision making, even at the human on the loop level, will simply be too slow from an OODA (Observe-Orient-Decide-Act) loop perspective to respond to attacks coordinated by offensive AI systems.”

While there are qualities that could drive militaries to develop lethal autonomous weapons, some of those same qualities give credence to groups seeking to apply preventive arms control measures. Many of the operational advantages militaries could gain from the development of autonomous weapons also present operational risks.

  • The high-tempo and adaptive decision making of autonomous systems could lead to unpredictable outcomes similar to flash crashes in the financial trading sector. The possibility of high-speed friendly fire by autonomous weapons could mean militaries will retain a human mechanism as a fail-safe.
  • Autonomous systems could also be destabilizing by engaging in quick, disarming surprise attacks, where a swarm of small, cheap, 3-D printed drones could fly under the radar and target strategic command-and-control systems or even nuclear capabilities – disturbing a fragile strategic balance of power and perhaps escalating a diplomatic crisis to all-out war without human intervention.
  • Given the proliferation of drone technologies, whether through national development or commercial and military export, it is likely that autonomous weapons will eventually find their way into the hands of nefarious actors. Just as autonomous weapons could enable more precise targeting to differentiate friend from foe, in the hands of oppressive states or terrorist groups these systems could wreak havoc on an unsuspecting civilian population.

Robert Bunker, Adjunct Research Professor, Strategic Studies Institute, U.S. Army War College

“Semi- and fully autonomous lethal [weapons] are not necessarily democracy-enhancing and may become an autocratic despot’s best friend – think machine mercenaries that you don’t have to pay for their loyalty.”

  • Should autonomous weapons engage in unlawful acts, accountability becomes a challenge – are humans liable for the harm caused by a weapon acting independently? While decisions to use lethal force are already decentralized in kinetic drone strikes, accountability would be even more diluted for autonomous weapons – and developers and weapons manufacturers are often granted immunity. This “accountability gap” could weaken deterrence of unlawful acts and leave victims without any avenue of recourse.

Despite ongoing discussions at the multilateral level within the CCW, it remains uncertain whether states will take meaningful action to prevent a global arms race toward autonomous weapons – a prospect that could be a defining characteristic of global security for years to come.

Levi Maxey is a cyber and technology analyst at The Cipher Brief. Follow him on Twitter @lemax13.
