Human rights experts, activists push for ban on ‘killer robots’
“Campaign to Stop Killer Robots.” That may sound like a clique of conspiracy theorists or the title of a summer B movie, but it’s actually an alliance of human rights groups raising legal and ethical concerns about people’s willingness to cede life-and-death decisions to computers.
Who is responsible if an armed robot fails to distinguish between civilians and combatants when unleashing lethal force against a target that meets its programmed criteria?
And how, skeptics wonder, can a “fully autonomous weapon” be taught to recognize soldiers attempting to surrender or those already wounded and no longer a threat?
If national military forces can rely on machines to take on the front-line hazards of armed combat, will that reduced risk of human casualties remove an important deterrent to waging war?
The Campaign to Stop Killer Robots was joined Thursday by a diverse array of peace advocates and diplomats at a session of the U.N. Human Rights Council in calling for reflection on the wisdom of creating lethal technology that operates without human oversight, and for agreed rules governing its use.
“Their deployment may be unacceptable because no adequate system of legal accountability can be devised and because robots should not have the power of life and death over human beings,” the United Nations’ watchdog on extrajudicial killings, Christof Heyns, told the council.
In calling for U.N. member nations to freeze development of robotic weapons “while the genie is still in the bottle,” Heyns warned of the risk of rapidly advancing technology outpacing political and moral consideration of unintended consequences.
In a 22-page report submitted to the U.N. rights forum, Heyns detailed the precursors to “fully autonomous weapons” already in operation:
-- Soldier-robots patrol the demilitarized zone between North and South Korea, and though remotely commanded by humans now, the programmed sentinels from Samsung Techwin are equipped with an automatic option.
-- The U.S. Navy launched an unmanned jet this month, the X-47B stealth drone developed by Northrop Grumman. Like generations of aerial drones that came before it, the X-47B is being billed as a surveillance tool. But it also has the capacity to carry more than 4,000 pounds of munitions.
-- Israel’s Harpy combat drone is designed to detect, attack and destroy radar emitters and suppress enemy air defenses.
-- Britain’s BAE Systems has developed its Taranis superdrone, which can autonomously search for, locate and identify enemy targets. The device requires human authorization to fire, but it has the technological capability to determine on its own when to attack or respond.
Existing drone technology has stirred plenty of controversy and strained relations between the United States, its foremost developer and user, and countries such as Pakistan, Afghanistan and Yemen, where airstrikes and targeted killings have inflicted “collateral damage,” the military euphemism for civilian casualties.
Getting the international community united on ground rules for fully autonomous weapons is likely to pose at least as much of a challenge as balancing the pros and cons of drone use, but legal experts contend it isn’t beyond the realm of possibility.
There is already significant recognition among technologically advanced countries that there should be limits on the degree to which computerized systems can act without human involvement, said Bonnie Docherty, a Harvard Law School lecturer and senior instructor at its International Human Rights Clinic. Late last year, the clinic co-wrote a report with Human Rights Watch, “Losing Humanity: The Case Against Killer Robots,” on the hazards of leaving battlefield decisions to machines.
Docherty pointed to the Pentagon’s November directive banning fully autonomous weapons for the foreseeable future, except those applying non-lethal or non-physical force, such as some forms of electronic attack.
Steve Goose, arms division director at Human Rights Watch, told journalists covering the U.N. meeting in Geneva this week that several governments have expressed willingness to take the lead in getting a global moratorium on lethal robotics in place.
The burgeoning alliance against “killer robots” is hopeful that world leaders can be brought together on the need for keeping humans in control.
“There is a good chance of success because we are trying to act preemptively, to prevent states from investing so much in this technology that they don’t want to give it up,” said Docherty.
M. Ryan Calo, a University of Washington law professor with expertise in robotics and data security, notes that there are upsides to robotic warfare, like the speed at which computers can make decisions and their ability to approach problem-solving in ways that are beyond humans.
“There’s a reason why 75% of trading is now by high-speed algorithms,” Calo said, referring to stock and commodity markets. “But humans tend to disproportionately trust the recommendations of computers.”
James Cavallaro, a law professor and director of Stanford’s International Human Rights and Conflict Resolution Clinic, disputes the notion that technology moves too fast to be bridled by the often glacial pace of international treaty drafting and ratification.
“People say that weaponized drones are already in use, that the cat’s out of the bag. But in the First World War, chemical and biological weapons were used and it wasn’t until afterwards that the international community decided they were too dangerous and banned them,” Cavallaro said.
The same power of retroactive evaluation has brought about a global covenant renouncing the use of land mines, which have inflicted horrendous civilian casualties in conflict-ravaged regions worldwide, he added.
“It’s a progression, in terms of weapons systems that reduce human engagement,” Cavallaro said. “And obviously robotics and weaponized drone aircraft that can make decisions about what is a danger are the final step on a very dangerous continuum.”
A foreign correspondent for 25 years, Carol J. Williams traveled to and reported from more than 80 countries in Europe, Asia, the Middle East and Latin America.