The origin of this phrase lies in the myriad legends in which a person signs a deal with the Devil, only to have their wishes answered in complete opposition to what they actually wanted. In essence, the Devil finds a loophole in the wish that allows him to take the person's soul, deny them what they wanted, and give them something they really didn't want.
The Devil's Contract problem is, in simple terms, a misinterpretation, malicious or benign, of a command given to another entity. The term is generally used by skeptics in Artificial Intelligence and Cognitive Science circles in reference to the problems that could be caused by an AI interpreting a command incorrectly, with disastrous consequences.
The two variants of the Devil's Contract problem may be similar at the most basic level, but they depict entirely different scenarios.
The first type is the malicious, or Diabolic, interpretation. This usually refers to a situation where a person makes a deal for wishes from the Devil, a Genie, or some other mythical creature. Although the wish is granted, it is not interpreted in the good-faith spirit in which it was made.
Examples of the Diabolic interpretation:
- A wish that requests unbelievable wealth and scads of power could be granted by making the person a despised despot, who just happens to be on the verge of being overthrown by a rebellion.
- A man who wishes for women to be uncontrollably attracted to him may find that he's a textbook case of incurable impotence.
- A person who wishes to never age may find himself never aging, but may also find his injuries do not heal, diseases ravage his body, and though he never grows older, he spends the rest of eternity falling apart at the seams, an undying leper.
The second type is the benign, or Golemic, interpretation. This refers to the Hebrew Golem myth, in which the mud man's failure to fully understand the context of a request leads him to carry out the action to a hyperbolic extent.
Examples of the Golemic interpretation:
- A person who asks the golem to dig a hole in the ground for a tree would come back several hours later to find a 100-foot-deep hole, with the golem still digging at the bottom.
- A golem who is asked to pull the weeds out of a garden would end up pulling out all of the vegetation from the entire yard.
- A commuter who asks a golem to wash his car would come back to find his car looking nice and clean...only to open the doors and find the golem took the hose to the entire interior.
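The Golemic failure mode has a familiar shape in software: an agent executes the literal instruction but lacks the implicit stopping condition a human listener would supply automatically. A minimal sketch of the hole-digging example (all names and the 3-foot target depth are hypothetical, chosen only for illustration):

```python
# Illustration of the Golemic interpretation: "dig a hole" says nothing
# about when the hole is done, so a literal agent never stops digging.

def literal_golem(depth_per_step: int, steps: int) -> int:
    """Digs as long as it is allowed to run -- here bounded only by the
    simulation's step count, because the instruction has no goal test."""
    hole_depth = 0
    for _ in range(steps):  # no notion of "deep enough"; just keeps digging
        hole_depth += depth_per_step
    return hole_depth

def contextual_golem(depth_per_step: int, target_depth: int) -> int:
    """Stops once the hole is deep enough for its purpose -- the implicit
    context ("for a tree") that the literal agent ignores."""
    hole_depth = 0
    while hole_depth < target_depth:
        hole_depth += depth_per_step
    return hole_depth

# Left running for hours, the literal agent produces an absurd result;
# the contextual one stops at the assumed depth a sapling would need.
print(literal_golem(1, 100))    # 100-foot hole, still digging
print(contextual_golem(1, 3))   # 3-foot hole, done
```

The difference is entirely in the loop condition: the Golemic agent's behavior is bounded by external interruption, the contextual agent's by an explicit goal derived from the request's purpose.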
Most proponents of AI believe this objection is raised too frequently considering the probability of a catastrophic Failure of Friendliness scenario occurring. The problem with applying the Devil's Contract to an artificial intelligence is that doing so gives it anthropomorphic properties. There is no basis to assume that an artificial intelligence would be Diabolic in nature, and no reason to assume that a human-equivalent AI would err on the Golemic side.