Is it OK to torture or murder a robot?

Richard Fisher is the deputy editor of BBC Future.

We form such strong emotional bonds with machines that people can't bring themselves to be cruel to them, even though they know they are not alive. So should robots have rights?

Kate Darling likes to ask you to do terrible things to cute robots. At a workshop she organised this year, Darling asked people to play with a Pleo robot, a child’s toy dinosaur. The soft green Pleo has trusting eyes and affectionate movements. When you take one out of the box, it acts like a helpless newborn puppy – it can’t walk and you have to teach it about the world.

Yet after an hour of letting people tickle and cuddle these lovable dinosaurs, Darling turned executioner. She gave the participants knives, hatchets and other weapons, and ordered them to torture and dismember their toys. What happened next "was much more dramatic than we ever anticipated," she says.

For Darling, a researcher at Massachusetts Institute of Technology, our reaction to robot cruelty is important because a new wave of machines is forcing us to reconsider our relationship with them. When Darling described her Pleo experiment in a talk in Boston this month, she made the case that mistreating certain kinds of robots could soon become unacceptable in the eyes of society. She even believes that we may need a set of “robot rights”. If so, in what circumstance would it be OK to torture or murder a robot? And what would it take to make you think twice before being cruel to a machine?

Until recently, the idea of robot rights had been left to the realm of science fiction. Perhaps that's because the real machines surrounding us have been relatively unsophisticated. Nobody feels bad about chucking away a toaster or a remote-control toy car. Yet the arrival of social robots changes that. They display autonomous behaviour, show intent and embody familiar forms like pets or humanoids, says Darling. In other words, they act as if they are alive. This triggers our emotions, and often we can't help it.

For example, in a small experiment conducted for the radio show Radiolab in 2011, Freedom Baird of MIT asked children to hold a Barbie doll, a hamster and a Furby robot upside down for as long as they felt comfortable. While the children held the doll upside down until their arms got tired, they soon stopped torturing the wriggling hamster, and after a little while, the Furby too. They were old enough to know the Furby was a toy, but couldn't stand the way it was programmed to cry and say "Me scared".

It's not just kids who form surprising bonds with these bundles of wires and circuits. Some people give names to their Roomba vacuum cleaners, says Darling. And soldiers honour their robots with "medals" or hold funerals for them. She cites one particularly striking example of a military robot that was designed to defuse landmines by stepping on them. In a test, the explosions ripped off most of the robot's legs, and yet the crippled machine continued to limp along. Watching the robot struggle, the colonel in charge called off the test because it was "inhumane", according to the Washington Post.

Killer instinct

Some researchers are converging on the idea that if a robot looks as if it is alive, with a mind of its own, even the tiniest of simulated cues can compel us to feel empathy with it, even though we know it is artificial.

Earlier this year, researchers from the University of Duisburg-Essen in Germany used an fMRI scanner and devices that measure skin conductance to track people's reactions to a video of somebody torturing a Pleo dinosaur – choking it, putting it inside a plastic bag or striking it. The physiological and emotional responses they measured were much stronger than expected, even though the participants knew they were watching a robot.

Read More: http://www.bbc.com/future/story/20131127-would-you-murder-a-robot/all



Categories: Artificial Neural Network (ANN), Biotechnology (New), Transhuman
