GENEVA: The international community must impose a moratorium on robot weapons that can make their own decisions to kill, a UN expert told the world body’s top human rights forum Thursday, warning that they could enable war crimes to go unpunished.
“Human rights requires that human beings should in one way or another retain meaningful control over weapons of war,” Christof Heyns said in a debate at the UN Human Rights Council on lethal autonomous robots, or LARs.
As a result, nations should “declare and implement national moratoria on the production, assembly, transfer, acquisition, deployment and use of LARs until a framework on the future of LARs has been established”.
“Their deployment may be unacceptable because no adequate system of legal accountability can be devised,” said Heyns, the UN’s special rapporteur on extrajudicial, summary or arbitrary executions.
“Do we want a world in which we can be killed either as combatants or as collateral damage by robots with an algorithm which takes the decision?”
“War without reflection is mechanical slaughter,” he warned.
Such weapons are a step up from drones, the unmanned aircraft whose human controllers pull the trigger from a far-distant base, and which have stirred controversy over civilian casualties in US strikes on alleged militants.
Drones targeting suspected al Qaeda and Taliban militants in Pakistan are estimated to have killed up to 3,587 people since 2004, including as many as 884 civilians, sowing deep anger there.
US President Barack Obama has announced new drone criteria, with lethal force only to be used when capture is not feasible and the target poses an imminent threat.
Pakistan’s council delegate Mariam Aftab called for tough measures against next-generation weapons, beyond a moratorium.
“The international community should consider a complete ban on LARs,” she said.
“We believe the experience with drones shows that once such weapons are in use, it is impossible to stop them,” she added.
Heyns said the “genie was out of the bottle” for drones, and governments should seize the chance to set preemptive global rules for LARs.
Supporters of LARs say they offer life-saving potential in warfare, being able to get closer than troops to assess a threat properly.
Heyns, a leading South African human rights lawyer, said it was important to consider that aspect.
“But it’s one thing to get information, it’s another to interpret it correctly,” he told reporters after his speech, insisting the problem comes when LARs decide to strike.
“It’s this issue of diminishing human responsibility that concerns me,” he explained.
No state uses fully autonomous weapons that could be classified as LARs, but the technology is already available, Heyns noted.
The United States and Britain lead the field.
“We agree that lethal autonomous weapons may present important policy and ethical issues, and we call on all states to proceed in a lawful, prudent and responsible manner when considering whether to incorporate automated and autonomous capabilities in weapons systems,” said US delegate Stephen Townley.
Last November, Washington imposed a 10-year human-control requirement, which Heyns commended.
Britain’s council delegate said existing legal provisions were sufficient.
But Heyns argued that unclear rules raised the risk of breaches of the laws of war.
“These include questions of whether robots will make it easier for states to go to war, and the extent to which they can be programmed to comply with the requirements of international humanitarian law, especially distinction and proportionality,” he said.
“LARs can also potentially be used by repressive governments to suppress internal domestic opponents,” he warned.
Activists also want to stop them.