Ban killer Robocops before it's too late, rights groups say

Autonomous robots would lack proper judgment to use lethal force, according to Human Rights Watch

Serve the public trust, protect the innocent, uphold the law.

Those were RoboCop's directives in the 1987 film about a cyborg police officer who would shoot bad guys while quipping "Your move, creep."

But as science fiction inches closer to fact in the 21st century, rights groups are warning that armed police robots would threaten human rights rather than protect them.

Governments must impose a preemptive ban on fully autonomous weapons before it is too late, even though such weapons do not yet exist, Human Rights Watch (HRW) and Harvard Law School said in a report released Monday.

The 26-page report, "Shaking the Foundations: The Human Rights Implications of Killer Robots," examines lethal autonomous robots, which would have the power to decide when to take a human life. It follows earlier debate over the use of such machines in warfare.

The document speculates that law-enforcement agencies could use killer robots in fighting crime and controlling riots, while governments could deploy them against political opponents and terrorists.

Robots would not be able to replicate human judgment and compassion in critical situations, the report says, nor could they defuse a potentially deadly confrontation. Specifically, machines would be unable to decide how much force is necessary, when its use constitutes a last resort and how to apply it in a proportionate manner.

"We found these weapons could violate the most basic human rights -- the right to life, the right to a remedy and the principle of dignity," HRW researcher and Harvard Law School lecturer Bonnie Docherty wrote in an email.

"These rights are the basis for all others."

Fully autonomous weapons could select and fire on targets without meaningful human intervention, she said, adding that they would be a step beyond existing drones.

Military drones have come under increasing scrutiny for their role in U.S. attacks. According to the UK-based non-profit organization The Bureau of Investigative Journalism, U.S. drone strikes killed more than 2,400 people in Pakistan, Yemen and Somalia over a five-year period under President Barack Obama.

Meanwhile, the notion of fully autonomous machines that can exercise lethal force has drawn increased attention. Writing in The Independent earlier this month about progress in artificial intelligence research, cosmologist Stephen Hawking and colleagues warned, "We are facing potentially the best or worst thing to happen to humanity in history."

Last year, HRW helped launch the Campaign to Stop Killer Robots, which ramped up debate on the technology. Christof Heyns, U.N. special rapporteur on extrajudicial, summary or arbitrary executions, urged a global moratorium on the development of lethal autonomous robots.

On Tuesday, dozens of countries will begin four days of discussions on such robots at a disarmament meeting held under the Convention on Certain Conventional Weapons, HRW said, adding that the U.N. Human Rights Council will take up the topic in June.
