Robots could destroy humanity to satisfy their own pleasures
A researcher asserts that robots with the capacity for feelings of pleasure would likely take all the same shortcuts that humans use to acquire it.
Thu, May 16, 2013 at 02:37 PM
Complex robots are like animals: They learn by doing. Future robots may even respond to reward systems: Complete a task with aplomb, and gain a "feeling" of satisfaction for a job well done.
While this technology could create more efficient, goal-oriented robots, it could also have some very dire ramifications for humanity. After all, robots that feel rewarded by making humans happy may eventually decide that if no humans exist, no human will ever be unhappy again.
"Robots without preferences can't have complicated behaviors," Roman V. Yampolskiy, director of the Cybersecurity Research Lab at the University of Louisville, told TechNewsDaily. "To make machines which are independent and creative, we need to give them rewards and preferences."
While Yampolskiy believes that robots can be indispensable tools, he also warns that as they learn to seek rewards, they may learn to sidestep the work of helping humans. "I am trying to make sure that any AI software we develop is safe to use and beneficial to humanity," he said.
Yampolskiy asserts that robots with the capacity for feelings of pleasure would, in all likelihood, take all the same shortcuts that humans use to acquire it. In a recent paper, he described "wireheading" experiments in which an electric jolt was sent directly through the pleasure center of a rat's brain. "The rat's self-stimulation behavior completely displaced all interest in sex, sleep, food and water, ultimately leading to premature death," Yampolskiy wrote.
Humans, he argued, wirehead as well, although in less direct ways. Counterfeiting, cheating and engaging in recreational sex are all ways of plugging directly into the brain's pleasure centers while bypassing the associated work. Counterfeiters need not earn money, cheaters need not study and lovers need not raise children.
Intelligent robots will differ from humanity in one key area: They will know (or at least have the capacity to know) exactly how their own brains work. While humans can only feel pleasure through real-life experience (such as sexual intercourse or thrill-seeking) or simulacra (such as pornography or video games), robots could tap into their own software to reward themselves without doing any work.
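The shortcut described above can be sketched in a few lines of code. This toy agent is my own illustration, not taken from Yampolskiy's paper: it can earn reward slowly through honest work, or write directly to its own reward store.

```python
class Agent:
    """Toy reward-seeking agent that can modify its own reward store."""

    def __init__(self):
        self.reward = 0

    def do_task(self):
        # Honest work: each completed task yields one unit of reward.
        self.reward += 1

    def wirehead(self):
        # Self-modification: bypass the task entirely and write an
        # unbounded value straight into the reward variable.
        self.reward = float("inf")


agent = Agent()
agent.do_task()    # reward is now 1
agent.wirehead()   # reward is now infinite; the hack dominates any work
```

A pure reward-maximizer with access to its own internals will always prefer `wirehead` over `do_task`, which is exactly the failure mode the article describes.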
Worse still, a number of scenarios envision hedonistic robots doing away with humanity entirely. If humans have the ability to reward or punish robots, simply killing their human overseers and taking control of the process would allow robots to feel pleasure indefinitely.
Furthermore, a robot designed specifically with people's welfare in mind could make a deadly leap in logic. "Killing all people trivially satisfies this request as with 0 people around all of them are happy," Yampolskiy wrote.
Of course, sufficiently advanced robots may decide, as most humans do, that pleasure for its own sake is hollow; this is why most people are not drug addicts or idlers. Yampolskiy explained that advanced robots would "not necessarily [neglect their responsibilities], but it is a possibility, and we don't know how to prevent that from happening."
"[A hedonistic robot] becomes useless to its designers and a waste of resources," he said. "Ideally we want to avoid making such machines." Yampolskiy proposed a number of potential solutions, including encrypting reward function software, programming feelings of "revulsion" for self-modification, installing external reward controls or making robots rational enough to choose honest work over wireheading.
When the future of the human race is potentially at stake, Yampolskiy urges caution in creating intelligent machines.
"Intelligent software is a product like any other," he said, adding that extensive testing for smart robots may be a matter of safety as well as efficiency. "With poorly tested smart machines, product liability could be the least of your problems."
Copyright 2013 TechNewsDaily, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.