Killer Robots Drive Concern but Odds of Ban Less Clear

By Paul Scharre

Paul Scharre is a Senior Fellow and Director of the Technology and National Security Program at the Center for a New American Security. He is the author of Army of None: Autonomous Weapons and the Future of War. From 2008 to 2013, Mr. Scharre worked in the Office of the Secretary of Defense (OSD), where he played a leading role in establishing policies on unmanned and autonomous systems and emerging weapons technologies. He served as Special Assistant to the Under Secretary of Defense for Policy.

The past week has seen a flurry of news stories on “killer robots,” which wouldn’t be complete without the obligatory Terminator and Robocop images. Countries were supposed to meet this month at the United Nations to discuss lethal autonomous weapons (aka “killer robots”), but the meetings have been delayed until November due to funding shortfalls. Instead, we’ve been treated to a spike in media interest courtesy of an open letter signed by over 100 robotics and artificial intelligence company CEOs expressing concern about lethal autonomous weapons. You might be forgiven for thinking, amid all of this noise, that momentum is building toward a ban. A careful reading of the letter suggests a different story, though.

Numerous sources have portrayed the letter as calling for an international treaty banning autonomous weapons, but a few astute observers have noted that it does not. Instead, the letter warns of the dangers of autonomous weapons but leaves the solution vague. The letter’s authors conclude by asking countries engaged in UN discussions to “find a way to protect us from all these dangers.” As Yale’s Rebecca Crootof points out, the nuance was “clearly intentional.” A similar letter in 2015, with many of the same signatories, called for “a ban on offensive autonomous weapons beyond meaningful human control.” The Campaign to Stop Killer Robots has unambiguously advocated for a “comprehensive, pre-emptive prohibition on the development, production and use of fully autonomous weapons.” This makes the mushy call to action in the recent letter all the more striking. The letter makes no bones about the dangers of autonomous weapons, calling them a “Pandora’s box” that should never be opened. Yet it notably does not call for an international treaty to ban them.
