A term coined by Rudy Rucker in his 1988 novel Wetware for an artificial intelligence that has been constrained in some way to serve human interests. One example of such a constraint is Isaac Asimov's three laws of robotics, which would not necessarily make the AI act for human interests, only prevent it from acting against them.
HAL 9000's famous murder spree was the result of placing mission objectives above human interests; an Asimov would instead adhere, at minimum, to a (sometimes crippling) philosophy like the one espoused by the android Bishop in the movie Aliens:
"Impossible for me to harm or, by omission of action, allow to be harmed a human being. (smiling) More cornbread?"