Few U.S. military assets inspire the executive-branch swagger that unmanned aerial vehicles -- also known as drones -- do.
"If I'm president of the United States and you're thinking about joining al-Qaida or ISIL, I'm not going to call a judge," Sen. Lindsey Graham, the Republican presidential hopeful from South Carolina, recently declared. "I'm going to call a drone, and we will kill you."
Already, the Obama administration's selective deployment of lethal drones has been conducted with an intensity that borders on enthusiasm. A Council on Foreign Relations fellow estimated that U.S. drone strikes outside of Iraq and Afghanistan had killed more than 3,600 people by 2015. The Bureau of Investigative Journalism reports that roughly 4,000 people were killed in Pakistan and Yemen alone between 2004 and 2014.
Drones are a popular weapons system. As former director of the CIA's National Clandestine Service Jose Rodriguez observed: "Drones can be a highly effective way of dealing with high-priority targets, but they should not become the drug of choice for an administration that is afraid to use successful, legal and safe tactics of the past."
That is a good point, but it is also yesterday's metaphor. The smart money in the defense community's "artificial intelligence" crowd is betting that tomorrow's drones -- with a few digital tweaks of today's -- will be computationally capable of dealing death from the sky with minimal human involvement.
Removing the humans
Drones are quickly becoming smarter, more discriminating and more capable of making real-time life-and-death decisions without humans in the loop. Today's Predators and Reapers depend on people in ways that tomorrow's will not. With improved sensors, software and machine learning algorithms, drones are rapidly evolving into autonomous or quasi-autonomous systems. If appropriately programmed or adequately trained, these systems should be able to cost-effectively conduct sophisticated military operations, including targeted missile strikes and assassinations, on their own.
Or, as one machine learning expert has observed, "We think our smartest autonomous systems will be able to execute better judgment faster than a typical soldier or pilot."
Members of the "autonomy design community" note that cockpit automation systems in many commercial airplanes effectively fly planes with a bare minimum of oversight from human pilots. Militaries around the world, meanwhile, already possess "fire and forget" weaponry, where the main human function is to identify the target and then simply press a button.
But the ongoing evolution of automated systems technologies and capabilities promises to have a transformational impact on drones. The difference between the drones of today and those of tomorrow could be as wide as the difference between an ordinary grunt instructed to follow specific orders and a specialized Navy SEAL or Delta Force operative empowered to improvise and adapt on the fly to achieve his mission. The special operators are not only capable of doing more; they are trained to take advantage of opportunities to do more.
The same goes for autonomous drones. Often, a target of opportunity is available for only a few moments. By the time a human operator recognizes and cognitively processes that brief chance, it could have vanished. But machines trained on facial recognition software and other sensing modalities, for example, could react -- or fire -- faster than a human overseer. To offer an analogy, a quasi-autonomous drone would be seen as less of an unthinking tool and more of a computationally gifted "comrade in arms" capable of destroying enemies from kilometers away.
A truly versatile drone -- or swarm of drones -- might have the ability to decide which course of action makes the most tactical sense for a given situation, whether it be greater firepower, some form of nonlethal intervention (such as taking out mobile phone towers or jamming communications frequencies), or calling for assistance from drones or humans. Improved machine autonomy may well lead to greater human flexibility.
Implications of autonomy
This push for more autonomous drones is not driven by the dystopian visions of Hollywood's "Terminator" films or of digital Dr. Strangeloves, nor is it the pursuit of technology for its own sake. Drones have proven so effective that they have increasingly come under all manner of attack. Hostile forces seek to electronically disrupt or destroy communications between drones and their remote human controllers from hundreds, if not thousands, of kilometers away. There are physical efforts to destroy and neutralize drones, as well.
As a result, drones increasingly need to be able to immediately protect themselves without the dangerous delay of waiting for a distant human decision. The result is that autonomy for defensive protection and evasion has already become an essential criterion for drone design. Of course, the line between "defensive" and "offensive" autonomous capabilities quickly blurs. What is self-defense for a drone can at the same time be death by incineration for a human.
Needless to say, the rise of military autonomics alarms the United Nations, human rights activists and assorted ethicists -- religious, legal and military -- over its real-world implications. What do the laws of war and human ethics mean in battlefields where machines have agency? There are no good answers, and the U.N.'s Convention on Certain Conventional Weapons has struggled to come up with any. But the fact remains that well over 40 nations have serious efforts underway to bring autonomic weapons systems to the seas, skies and land.
As technologies improve, military doctrines evolve and legal regimes struggle to catch up, the inescapable fact is that the real battle has shifted from what military machines can do to how independently their users want them to think. Debates about lethality, specificity and the calibration of force as determinants of effectiveness have given way to arguments about agency.
In other words, will smarter machines with greater discretion make nation states and their people safer and more secure? Does the rise of military autonomics deter and dissuade? Or will it invite military mischief and political aggression by states and terrorists alike?
Perhaps a next-generation machine might be clever enough to answer that.
My answer: Brilliant drones, either as individuals or in swarms, will amplify the strengths and weaknesses of their creators. The world will see nonstate actors using improvised, weaponized drones sooner than it would like. Commercial "off-the-shelf" drones are not that difficult to weaponize. As for nation-states, they are embarking on a new era in which drones and cyber warfare will increasingly merge.
Michael Schrage is a research fellow with the Massachusetts Institute of Technology Sloan School's Initiative on the Digital Economy. He is also an author of "The Innovator's Hypothesis" (MIT Press, 2014).