It seems like a question ripped from the back of a cheap sci-fi novel: What happens when the robots are turned against us?
But researchers at the University of Washington think it's finally time to start paying some serious attention to the question of robot security. Not because they think robots are about to go all Terminator on us, but because the robots can already be used to spy on us and vandalize our homes.
Robots have emerged as popular consumer devices over the past few years -- primarily as toys, but also as household chore robots such as iRobot's Roomba vacuuming machine.
In a paper published Thursday, the researchers took a close look at three test robots: the Erector Spykee, and WowWee's RoboSapien and Rovio. They found that security is pretty much an afterthought in the current crop of robotic devices.
"We were shocked at how easy it was to actually compromise some of these robots," said Tadayoshi Kohno, a University of Washington assistant professor, who co-authored the paper.
The researchers aren't so much worried about the scenario depicted in James Cameron's movie Terminator, where machines develop self-awareness and decide to wipe out humanity. They're afraid of a world where hackers can take control of the robots we've brought into our homes.
Some of today's robots operate as wireless access points, and Kohno's team found that a nearby attacker could connect to someone else's robot quite easily. Robots such as the Rovio can also be controlled over the Internet, meaning that if a hacker could somehow sniff the victim's user name and password, he could turn the robot into a remote-controlled spy machine.
"We think that consumers should at least be aware that there is the possibility that someone would listen in on their robot and take over their robot and have mobile eyes and ears in their home," said Tamara Denning, a PhD student who also worked on the paper. "They're little computers."
The University of Washington team says that as more sophisticated robots come online -- especially future generations of powerful household robots -- they could be misused in ways that their designers have not foreseen.
In their paper, they discuss ideas such as "robot vandalism" -- even weak robots can push something fragile down a flight of stairs -- and "robot suicide." Robots could also be used to eavesdrop on conversations or frighten small children, the researchers said.
The attacks that the researchers can actually pull off may sound more creepy than scary, but Kohno said that robot makers will serve their customers best by thinking about these issues from the start, rather than having to patch machines after they get compromised. "Let's think about security and privacy as one of the initial design goals," he said.