Expert Opinion

UN meeting urged to ban killer robots

I am today at the UN in Geneva as a member of the Threshold group on Unconventional Weapons, working with the International Committee for Robot Arms Control (ICRAC) and other NGOs, such as Human Rights Watch and Amnesty International, to outlaw robot weapons systems which by themselves make decisions to kill humans.
We are at a meeting of the Convention on Certain Conventional Weapons (CCW), which reviews the legality of weapons within the frameworks of International Humanitarian Law (IHL) and the Geneva Conventions on proportionality, discrimination and unnecessary suffering – the so-called Martens Clause.

It is a significant gathering in that we are endeavouring to ban weapons which do not yet exist. Why? Well, we are not against autonomous robotic systems per se, since they can bring many future benefits. But we do see many concerns about weapons which operate “without meaningful human control.”

In a memorandum to country delegates, ICRAC outlined ten problems for global security, namely:

1. Proliferation

When a weapon creates any military advantage, nations will rush to acquire it. Without an international muzzle on the development, testing and production of Lethal Autonomous Weapons Systems (LAWS), we are likely to see mass proliferation of these weapons and counter-weapons. Not all nations will have the ability to carry out the weapons reviews of LAWS required under international law, and it is not difficult to create LAWS when insufficient consideration is given to IHL.

2. Lowered threshold for armed conflicts

LAWS could lead to more action short of warfare by minimising human military forces in conflict zones. This could enable states to initiate the use of violent force without the consultation procedures required to deploy troops on the ground. By removing these inhibitors of violent action, some could be seduced into more armed conflicts – at the expense of civilian populations.

3. Continuous global battlefield

One reason for the development of LAWS is the persistence of unmanned vehicles. Some current remotely piloted aerial vehicles have been employed to monitor and attack targets over long time periods. Without the requirement for a remote operator to maintain vigilance, LAWS could be left behind – like landmines – to patrol post-conflict zones and could thus create a continuous global battlefield. The result could have devastating psycho-social consequences.

4. Unpredictability of interaction

As more countries employ LAWS and autonomous counter defences, these weapons as well as command and control systems will inevitably interact. When any mobile device controlled by software programs interacts with a competing hostile device controlled by unknown software, the result of the interaction is scientifically impossible to predict.

5. Accelerating the pace of battle

One of the most common reasons given to support the use of LAWS is that the pace of battle is increasing to a point where human decision-making is not fast enough. New prototypes in the unmanned systems domain are increasingly being tested at supersonic and hypersonic speeds. This will require even faster autonomous response devices that in turn will require ever-faster weapons. It is not hard to see that such a ‘pace race’ will equate to humans having little control over the battle-space.

6. Accidental conflict

If the development and proliferation of LAWS is allowed to continue, supersonic or hypersonic (defence) systems of one state could interact with equally fast LAWS from another state. The speed of their unpredictable interaction could trigger unintended armed conflicts before humans had the opportunity to react.

7. Militarisation of the civilian world

We are already seeing the use of new unmanned war technologies in civilian settings. Law enforcement and border control agencies are using unmanned systems for surveillance. Some companies are even arming them with Tasers, pepper spray and other so-called ‘less-than-lethal’ munitions. With autonomous targeting technology, this could lead to violations of human and civil rights by police and private security forces, with little possibility of accountability.

8. Automated oppression

LAWS would be an attractive tool for the oppression of populations and the suppression of peaceful protest and political change. While soldiers can in principle refuse to turn their weapons on their own people, LAWS will kill mercilessly on the basis of their coded instructions.

9. Non-state actors

We are currently witnessing an unprecedented diffusion of technology. The cost of robotics development is falling, with the required off-the-shelf hardware now widely available. If autonomous weapons development is allowed to continue it will not be long before we see crude copies or grey market exports in the hands of non-state armed actors.

10. Cyber vulnerability

Humans need to be in control of weapon systems to counter many of the potential dangers with entirely computerised and autonomous weapons. The risks of software coding errors, malfunctions, degradation of communications, and especially enemy cyber-attacks, infiltrations into the industrial supply chain, jamming, and spoofing make LAWS inherently insecure.


We are at a critical juncture in the evolution of weapons. The end point of increasing weapons’ automation is full autonomy, where human beings have little control over the course of conflicts and events in battle. At this point in time, it is still within our power to stop the automation of the kill decision, by ensuring that every weapon remains meaningfully controlled by humans.

Both humans and computer systems have their strengths and weaknesses, and the aim of designing effective supervisory systems for weapons control must be to exploit the strengths of both. This way, it is possible not only to gain better legal compliance, but also to ensure that the partnership between human and machine best ensures the protection of civilians, their human dignity and our wider global security.

The meetings, which began yesterday, garnered significant international support from country delegations. Our discussions have already raised public awareness of these issues, yielding headlines across the world. Ongoing news stories – including in The Guardian, New Scientist and the BBC – are highlighting the problems not just of military escalation but also of autonomous weapons being used for oppression and for riot and border management, if kill decisions are made by non-human agencies.

Today we will be discussing what would happen if autonomous killing capacity fell into the hands of non-governmental actors such as ISIS. Would they herald a new era of assassination as a government and non-government service? Such machines are seen as a potentially destabilising factor in global security. Because of such discussions, they are not inevitable. But no one doubts the vested interest of the burgeoning military, security, university, police and media-entertainment complex in making money out of trying to technically fix future social, political and environmental crises.

Dr Steve Wright

Dr Steve Wright is a former Reader in the School of Applied Global Ethics. He studied for his BSc in Liberal Studies in Science at the University of Manchester. His PhD thesis at Lancaster University was on New Police Technologies & Sub-State Conflict Control.