
AI arms race risks rise of uncontrollable killer robots

US, China and Russia poised to field unmanned weapons before global rules take shape

A Chinese military vehicle carries a Pterodactyl I aerial drone at a parade in Beijing. Parents' reluctance to send their only children into combat has the military searching for less dangerous options.   © Reuters

TOKYO -- Drones swarming like bees to attack an aircraft carrier, or taking flight after a missile strike to show soldiers the quickest way to fix a destroyed runway, are no longer the stuff of science fiction. They are in the plans of the U.S., China and Russia as they race to develop weapons controlled by artificial intelligence.

But with the pace of progress in military AI systems now outstripping discussion of international rules to govern them, these transformative weapons are likely to make their way onto the battlefield with little oversight.

Unmanned systems are crucial to China, where the decadeslong one-child policy has left the country with many only children whose parents have no desire to send them into combat. The People's Liberation Army is working on turning old tanks due for decommissioning into remote-controlled robots, and with AI, these would have no need for human involvement at all.

This technology also appeals to Japan's Self-Defense Forces, which are struggling to recruit from a graying population.

But the more sophisticated AI-controlled weapons are, the greater the fallout could be. At a United Nations meeting on the topic this past March, Japan called for international restrictions on lethal autonomous weapons systems, which are controlled completely by AI and can use deadly force without human intervention.

A Russian Uran-9 remote-controlled tank. Moscow has joined the rush for autonomous weapons systems.   © Reuters

If designed or used without sufficient care, autonomous weapons risk becoming so-called killer robots with little regard for the humanitarian principles on which the laws of war are based. Militaries or governments could invest too much confidence in the capabilities of AI systems, making them more willing to start conflicts or prone to overlook opportunities to end them.

The potential perils of AI weapons have prompted protests by civic groups. Companies working with the military on AI projects, such as Google, have faced pushback from employees.

Former Google CEO Eric Schmidt has predicted that humans will be capable of controlling runaway killer robots. And some in the U.S. military worry that unless they make human involvement a condition for using AI weapons, they may have trouble securing cooperation from engineers in the private sector, where most AI research is now taking place.

But fears about killer robots remain unabated as Washington, Beijing and Moscow compete for supremacy in the field.

If, for instance, the U.S. mandated human involvement in its AI weapons out of concern for public opinion, a China that opted for fully autonomous systems could gain a decisive edge in decision-making speed. That prospect could tempt Washington to leave all the decisions to AI in its own systems as well.

Triton drones developed by U.S. defense contractor Northrop Grumman.   © Reuters

The use of AI in warfare is often described as a major breakthrough on a par with the development of nuclear weapons. But some observers -- including Henry Kissinger, who served as U.S. secretary of state in the 1970s during the Cold War between the U.S. and the Soviet Union -- worry that AI weapons may prove even harder to control.

For all their destructive force, nuclear arms cannot improve their own capabilities. Some fear that autonomous systems could eventually modify and enhance themselves without outside input, or build copies of themselves with the same capabilities.

And in contrast to nuclear missiles, which require large, conspicuous facilities to produce on a large scale, AI weapons can be developed by small teams that are far easier to conceal.

The natural human desire for greater safety and fear of letting the enemy get ahead drive the development of new weapons. This gave the world nuclear weapons, and it is poised to usher in new technology that could prove even more problematic.

Atomic bombs have been used only twice in the 74 years since their creation because those two occasions showed the world the horror of nuclear weapons. A similar shared understanding of the threat posed by AI weapons, and the difficulty of controlling them, would represent a first step toward keeping them in check. This technology will test the wisdom and foresight of humanity like never before.

