Semantically, one rich source of jargon constructions is the hackish
tendency to anthropomorphize hardware and software. English purists
and academic computer scientists frequently look down on this habit,
considering it characteristic of naive misunderstanding. But most
hackers anthropomorphize freely, frequently describing program
behavior in terms of wants and desires.

Thus it is common to hear hardware or software talked about as though
it has homunculi talking to each other inside it, with intentions and
desires. One hears, for example, "The protocol handler got confused",
or that programs "are trying" to do things, or one may say of a
routine that "its goal in life is to X". Or: "You can't run those two
cards on the same bus; they fight over interrupt 9."

One even hears explanations like "... and its poor little brain
couldn't understand X, and it died." Sometimes modelling things this
way actually seems to make them easier to understand, perhaps because
it's instinctively natural to think of anything with a really complex
behavioral repertoire as `like a person' rather than `like a thing'.

At first glance, to anyone who understands how these programs actually
work, this seems like an absurdity. As hackers are among the people
who know best how these phenomena work, it seems odd that they would
use language that seems to ascribe consciousness to them. The
mind-set behind this tendency thus demands examination.

The key to understanding this kind of usage is that it isn't done in a
naive way; hackers don't personalize their stuff in the sense of
feeling empathy with it, nor do they mystically believe that the
things they work on every day are `alive'. To the contrary: hackers
who anthropomorphize are expressing not a vitalistic view of program
behavior but a mechanistic view of human behavior.

Almost all hackers subscribe to the mechanistic, materialistic
ontology of science (this is in practice true even of most of the
minority with contrary religious theories). In this view, people
are biological machines - consciousness is an interesting and
valuable epiphenomenon, but mind is implemented in machinery which
is not fundamentally different in information-processing capacity
from computers.

Hackers tend to take this a step further and argue that the difference
between a substrate of CHON atoms and water and a substrate of silicon
and metal is a relatively unimportant one; what matters, what makes a
thing `alive', is information and richness of pattern. This is animism
from the flip side; it implies that humans and computers and dolphins
and rocks are all machines exhibiting a continuum of modes of
`consciousness' according to their information-processing capacity.

Because hackers accept that a human machine can have intentions, it
is therefore easy for them to ascribe consciousness and intention to
complex patterned systems such as computers. If consciousness is
mechanical, it is neither more nor less absurd to say that "The
program wants to go into an infinite loop" than it is to say that "I
want to go eat some chocolate" - and even defensible to say that
"The stone, once dropped, wants to move towards the center of the
earth."

This viewpoint has respectable company in academic philosophy. Daniel
Dennett organizes explanations of behavior using three stances: the
"physical stance" (thing-to-be-explained as a physical object), the
"design stance" (thing-to-be-explained as an artifact), and the
"intentional stance" (thing-to-be-explained as an agent with desires
and intentions). Which stances are appropriate is a matter not of
truth but of utility. Hackers typically view simple programs from the
design stance, but more complex ones are often modelled using the
intentional stance.

It has also been argued that the anthropomorphization of software and
hardware reflects a blurring of the boundary between programmers and
their artifacts - the human qualities belong to the programmer, and
the code merely expresses these qualities as the programmer's proxy.
On this view, a hacker saying a piece of code 'got confused' is really
saying that the programmer was confused about exactly what he or she
wanted the computer to do, that the code naturally incorporated this
confusion, and that the code expressed the programmer's confusion when
executed by crashing or otherwise misbehaving.

Note that by displacing from "I got confused" to "It got confused",
the programmer is not avoiding responsibility, but rather getting some
analytical distance in order to be able to consider the bug
dispassionately.

Both explanations accurately model hacker psychology, and should be
considered complementary rather than competing.

--The Jargon File, version 4.3.1, ed. ESR.