Who Should Give the Kill Command?

The Department of Defense is seeking a significant budget increase to further develop lethal autonomous weapons systems (LAWS).  While weapons technology is advancing rapidly, policymakers are grappling with the question of which decisions machines should be making on their own.  Should an autonomous weapon make life-and-death decisions?  How much input should be reserved for humans?

After protests in Silicon Valley over the way machine learning is being used in lethal systems, the Pentagon is now seeking input.  It has tasked the Defense Innovation Board, made up largely of Silicon Valley executives, with providing guidelines for the application of machine learning in future wars that will likely rely heavily on machines making decisions.
