Imagine a weapon that strikes on its own, without the slightest hint of human judgment or compassion. It’s a chilling prospect, and not so far in the future unless the world calls a halt to such weapons’ development.
Since the Campaign to Stop Killer Robots launched in April 2013, it has been encouraging countries to affirm the need for meaningful human control of weapons systems by drawing a line to preemptively ban fully autonomous weapons. Retaining human control over the use of force is a moral imperative and essential to promoting compliance with international law and ensuring accountability.
Nations have been discussing the issue for three years under the auspices of the Convention on Conventional Weapons. A breakthrough came at the end of 2016, when countries taking part in the treaty's Fifth Review Conference agreed to formalize their deliberations on lethal autonomous weapons systems. Toward that end, the conference's final document establishes a Group of Governmental Experts chaired by Ambassador Amandeep Gill of India. Approximately 90 countries are expected to participate in its meetings this year at the United Nations in Geneva, along with representatives from UN agencies, the International Committee of the Red Cross, and the Campaign to Stop Killer Robots.
Russia at first strenuously objected to forming the new group as “premature,” because countries have not agreed on a working definition of lethal autonomous weapons systems. At the last minute, though, Moscow said it would not block consensus on formalizing the process, clearing the way for approval.
Another noteworthy development in 2016 was China’s December publication of its first policy position on fully autonomous weapons. The paper finds “uncertainties” in the adequacy of international law to address fully autonomous weapons and recommends the development of a legally binding instrument, citing the precedent of the 1995 Convention on Conventional Weapons protocol that preemptively banned blinding lasers. China is the first permanent member of the UN Security Council to find that new international law is needed on fully autonomous weapons.
Advancing to the next diplomatic level gives the public reason to hope that nations are serious about mounting a timely response to the threat of fully autonomous weapons. Under the Group of Governmental Experts, countries can begin negotiating new international law on the weapons. The tools are in place for getting the job done with a complete ban. The question now is whether nations can muster the will before it's too late.
Autonomous weapons systems are in development in more than a dozen countries, including the United States, China, Israel, South Korea, Russia, and the United Kingdom. The concern is that the human role in selecting and firing on targets will steadily diminish until humans are no longer involved and machines take over these critical functions.
As a recent Human Rights Watch report found, use of such weapons would cross a moral threshold, and their humanitarian and security risks would outweigh any possible military benefit. Critics who dismiss these concerns rely on speculative arguments about the future of technology and on the false presumption that technical advances can address the many dangers these weapons would pose.
Even though there was widespread support for formalizing the process to discuss concerns over killer robots, countries such as France, the United Kingdom, and the United States have set the bar far too low. They have not called for new international law, but instead have proposed focusing on sharing best practices and greater transparency in the development and acquisition of new weapons systems. That is not enough to stop the development of these weapons before it’s too late.
The Group of Governmental Experts is expected to drill down further into substantive concerns, including the notion of meaningful, appropriate, or adequate human control; approaches to defining fully autonomous weapons; and the options for action. Gill, a veteran arms control negotiator and an electrical engineer by training, says that India's objective is to work with all stakeholders to further "strengthen" the framework convention so that it can keep "proving its resilience as a dynamic instrument of international humanitarian law."
The treaty does not have to be the only international tool to address this issue, as the work of the UN Special Rapporteur on extrajudicial, summary or arbitrary executions shows. The UN Human Rights Council can play an important role, especially given the concern that fully autonomous weapons would probably be used not only in warfare but also in policing, border control, and other circumstances.
Over the coming year the Campaign to Stop Killer Robots will work to build a better understanding of the potential danger from fully autonomous weapons and to increase support for a preemptive ban. During 2016, the group of nations calling for a preemptive ban on fully autonomous weapons more than doubled, expanding from nine to 19 states. At the April meeting of treaty countries on the topic, Algeria, Chile, Costa Rica, Mexico, and Nicaragua called for a ban, while Argentina, Guatemala, Panama, Peru, and Venezuela endorsed the ban call during the Review Conference. In the United States in December, nine House Democrats led by Representative Jim McGovern (D-MA) called for a ban on fully autonomous weapons in a letter that urged U.S. support for the process set up under the Convention on Conventional Weapons.
It looks like 2017 could be a year for substantial progress in tackling these weapons systems. The Campaign to Stop Killer Robots aims to expand its outreach in national capitals so that diplomats have the necessary instructions to participate substantively in the international process and to ensure they work for a preemptive ban on fully autonomous weapons. Such a prohibition is now firmly within reach.