SUBSCRIBER+EXCLUSIVE INTERVIEW — While the use of lethal drones is escalating in Ukraine, Russia, the Middle East and elsewhere, the global proliferation of the technology and advances in artificial intelligence have raised concerns about the threat posed by so-called autonomous weapons – drones that destroy targets with limited human intervention, relying on algorithms and onboard sensors.
As human control or oversight is reduced, fundamental questions arise: Can these autonomous systems be trusted to make spur-of-the-moment judgments, differentiate between friend and foe, and minimize collateral damage? There is particular concern about so-called “drone swarms,” in which large numbers of cheap autonomous weapons could be deployed against adversaries to devastating effect.
Army Gen. Erik Kurilla, the head of U.S. Central Command, warned the Senate Armed Services Committee in early March that such drones are “one of the top threats” facing the U.S., and that drone swarms are an even “bigger concern.” Kurilla said the U.S. should invest in defenses against mass drone attacks, such as high-powered microwave weapons. The Pentagon has been doing just that, having signed at least five deals with contractors that explicitly reference drone swarming, according to a report on procurement contracts from the Center for Security and Emerging Technology.
Diplomatic activity aimed at regulating autonomous weapons systems is also quickening. The United Nations General Assembly (UNGA) adopted Resolution 78/241 in December 2023, which called for a rigorous study on challenges from lethal autonomous weapons systems. The U.S. convened a meeting in March to discuss the issue further, building on a political declaration on ethical usage of military AI. The UNGA will debate the topic again in the fall.
Some top experts believe that the fears of “killer robots” with “minds of their own” may be overblown, or at least premature. Gen. Philip Breedlove, former Supreme Allied Commander of NATO, told The Cipher Brief that he believes “there are no truly autonomous weapons,” in the sense that there is always a human involved at some stage of the deployment of such weapons, whether it be in the targeting or training phases.
But Gen. Breedlove also worries that America's adversaries may approach the technology with minimal care. “One of the things that I think is going to be telling in the future is how hard we (the U.S.) fight to use these sorts of weapons and to avoid collateral damage," he said, "whereas our opponents are not going to be concerned with that."
Gen. Breedlove was interviewed by Cipher Brief reporter Ken Hughes.
THE CONTEXT
- The U.S. released a Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy on November 1, 2023. The declaration, which has 53 signatories, aims to establish a framework to regulate military AI.
- The UN General Assembly adopted Resolution 78/241 in December 2023, which urges the UN secretary-general to seek views of countries and other stakeholders on challenges from lethal autonomous weapons systems. The resolution received 152 votes in favor. Only Belarus, India, Mali and Russia voted no. Another 12 states abstained.
- The Israeli Defense Forces’ Lavender AI system has been used to identify tens of thousands of targets for strikes in Gaza. Civilians have reportedly been killed in resulting strikes.
THE INTERVIEW
This interview has been lightly edited for length and clarity.
Gen. Breedlove: I think that the term autonomous weapons is really gray. It is my opinion that there are no truly autonomous weapons. There are always humans in the loop in some places, and even those that claim to be fully AI are normally what we call machine learning applications, not artificial intelligence. So one of the things that I do when I talk to people on the subject is, I just like to clarify language – and so you tell me what you mean by autonomous weapons.
The Cipher Brief: Let's say aerial drones, for example, that are sent out by a human operator, but from the point really that they begin their mission, they select their own targets and through different types of capabilities, identify the target and conduct an attack. So from the point of departure, the weapon does not have a human operator in the loop.
Gen. Breedlove: Sort of a human chooses the target area, right?
The Cipher Brief: That's correct.
Gen. Breedlove: A human chooses the target. A human teaches the machine how to recognize the target and a human sets, what we would call for humans, rules of engagement. But a human sets the logic inside the machine for what, when and where to attack. I do not believe there are truly autonomous weapons.
I do believe there are man-configured, man-trained and man-launched and man-designated weapons that then go out into man-designated areas and situations and kill targets that they autonomously recognize as the target.
The Cipher Brief: What do you think our adversaries, our rivals are up to in this field?
Gen. Breedlove: The first big thing that I would point out is that there are going to be people who build these weapons, and they are going to be nearly completely unconcerned with collateral damage, and they are going to use them to their advantage no matter the possible consequences. We are deeply concerned with controlling the application of force to minimize collateral damage. China, Russia, North Korea, they're not going to care.
These weapons are so much better than humans. We can build multi-spectral sensors that can see way better than we can and interpret and discern way better than we can. But what those weapons are not good at is seeing the bus full of school children about to roll by the target. And that's why we have men in the loop for so many things that we do. And when we make a mistake, as we did during the retreat from Afghanistan, and end up killing the wrong target, we are incredibly critical of ourselves and look at how that came about. We look at something like that for months and agonize over how it happened. Our enemies will do it multiple times in a day and never blink an eye. And so as we look at the way these weapons are going to be used on the battlefield, we are going to expect that western nations with western values are going to be far more limited in the application than our opponents, who will be far less concerned with collateral damage in the way these weapons will be used.
Secondarily, we will build the weapons that we use in these cases probably in a more sophisticated way because we're going to want to try to be able to discern when not to strike and when to strike. Our opponents will build cheaper weapons because they're not going to be concerned about the things we are concerned about.
The Cipher Brief: When you look at China and Russia in particular, do they seem to be oriented toward a particular type of weapons development? The most familiar one, at least to the public, is aerial drones, but I'm reading more and more about maritime weapons systems, and I was wondering if any of those fields seems more likely than another for our adversaries' deployments.
Gen. Breedlove: I think right now because of what's going on in Ukraine, that Russia is most concerned about relatively unsophisticated kamikaze aerial drones. And we see on the flip side, Ukraine being very concerned with the kind of drones that they could use under sea, on the sea, but they are rarely autonomous. They are targeted. The more sophisticated ones are targeted by humans but then become autonomous in their attack, if that makes sense.
The human is a big part of saying, “Lock onto that.” And then the machine uses predetermined, evasive kinds of maneuvers to survive contact with the enemy. You don't have to watch too many of these latest kamikaze boat things that Ukraine has used to basically take control of the Northern Black Sea (to understand that) they don't have a single capital ship. They're doing it completely with drones. Basically Ukraine does not have a Navy and they're creating a naval power with drones.
The Cipher Brief: How about China, and China's endeavors here?
Gen. Breedlove: China is very, very much into rocketry - missiles to hold the U.S. Air Force and U.S. Navy at bay in the South China Sea, their famous carrier killer missiles and so forth. These are another form of basically autonomous (weapons). They're fired and aimed by a human, but then they take on their own characteristics in flight to get to the target.
The Cipher Brief: One of the things I've read about the Chinese in particular is their investment in developing aerial drones that can operate in concert. In other words, the drone swarm.
Gen. Breedlove: I've heard the same – and I know that we're doing the same thing. So I think they're much in line with the kind of research that we're doing. I would tell you that no one that I know has fought a (drone) swarm yet except maybe Ukraine. And you should look at the sinking of the (Russian ship) Moskva, because in the very beginning of that conflict on the Northern Black Sea, Ukraine used a swarm of aerial drones to distract the people protecting the Moskva, and they were looking up when they got hit from the surface. So the Ukrainians used a swarm to distract, and then they used naval drones to kill. They used the aerial drones to enable the sea drones.
The Cipher Brief: If you could imagine sort of a worst-case scenario from the U.S. perspective, what would you fear that our adversaries could develop in autonomous platforms?
Gen. Breedlove: Well, the bottom line is we are going to have to completely rethink the way we defend against drone armies – on land, air, sea, and undersea. No one is going to compete with some of our capital assets. It will be decades, many decades before the Chinese catch up in stealth and in long-range bomber capability; they eventually will, but it will be way out there. They will probably never catch up truly in carrier technology, et cetera. But these large capital assets are, I think, more vulnerable than we know to drones and new swarming and drone attack profiles. And so we are going to have to really shift our emphasis to those types of threat vectors. And I think we have a ways to go yet.
The Cipher Brief: That's an excellent point – being prepared for sort of almost a radical change in the nature of warfare.
Gen. Breedlove: Warfare is all about new types of weapons and then it's the reaction to those weapons and then it's the reaction to the reaction of those weapons. This is just the next iteration and there'll be iterations after this. And that's what modern military business is about. It's about adapting and then readapting.
The Cipher Brief: There are different types of conventions, treaties and agreements that are being done, unilaterally or in small groups and at the UN, to impose some sort of limitations or norms on autonomous weapons development. Do you have any view of how successful those might be or how aggressively the US should pursue such an international understanding?
Gen. Breedlove: I believe those conversations are ongoing. I've been asked more than once to be the lone person explaining killer robots to groups who are advocating for limiting killer robots. And I never really advocate one way or the other. What I try to do is clean up misconceptions and bad or imprecise language. But I would also point out that often when the West agrees to these kinds of limitations, the people who we are fighting do not adhere to the agreements. So as I said before, one of the things that is going to be telling in the future is how hard we fight to use these weapons and to avoid collateral damage, whereas our opponents are not going to be concerned with that.
I am pretty aware of the landscape and I don't think there's any significant progress that anybody's willing to sign up to yet as far as trying to develop conventions. And then even if we do develop those conventions, who adheres to those things is going to be an interesting next concern.