
§ 1. Phreaks & Hacks
In our serious dealings with the world, we generate and possess more information than ever. But reality itself gets ever more buried under all the information we have about it. In consumption we are getting so adjusted to the light fare of more or less virtual experiences and emotions that the reality of persons and things seems heavily offensive and crude. 1
By the late 1970s and early 80s it had become quite clear that computers had moved well beyond the realm of narrow business applications, government use and universities: their information-processing abilities were beginning to be applied in libraries, schools and homes, particularly in North America. Modems, while not standard issue with most home computers, had dropped enough in price for many businesses and organizations to use them to transfer and exchange information among distant branches. One of the first serious outgrowths of this development was the birth of the hacker community in parts of the United States and Europe, composed predominantly of computer-savvy and well-educated young men who could exploit the intricacies of phone and computer telecommunications systems to gain access to corporate or government information stored in distant computer banks.

There had been widespread controversy in the US as early as the mid-1960s about the security of increasingly interwoven information systems, a debate that finally reached the level of government when the Congressional Sub-Committee on Invasion of Privacy was called to solicit scientific and commercial expertise on the subject. The Data Encryption Standard (DES), a 56-bit key-length encryption scheme, won favor with many law enforcement groups and the intelligence community because it was seen as 'strong' enough to deter pranksters and individual hackers, yet still 'weak' enough to be decrypted by the computing power which government agencies could bring to bear if necessary. Many experts in the field of computing, usually connected with universities and tending toward distrust of Big Government (particularly after the Watergate abuses), felt this level of encryption to be woefully inadequate. They said as much in the hearings, and when their scholarly opinions were ignored (and the DES was adopted) they returned to their universities and set themselves to the problem. 2 Commercial vendors were a good deal more enthusiastic about the possibilities offered by computing and skeptical of 'information abuse', leading one powerful executive, Dr. Harry C. Jordan (founder of Credit Data Corporation), to conclude confidently that the emerging digital environment created far "more opportunity for control than it does for hazard." 3 The statement was both ominous in its generality (whose control, exactly?) and ironic: Credit Data Corp. was soon afterwards purchased by TRW, Inc., whose glaring insecurity was exposed in the 1970s when Christopher Boyce (whose story was later told in The Falcon and the Snowman) passed its satellite specifications to the Soviets, and again in the early 1980s when the hacker Kevin Mitnick routinely invaded its corporate databanks. 4
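To put such key lengths in perspective, a rough calculation, a sketch only, using an assumed search rate rather than any figure from the hearings, shows why a short key deters a prankster but not a well-funded agency. In Python:

    # Rough brute-force estimates for several symmetric key lengths.
    # The search rate is an assumption for illustration only, not a
    # historical figure.
    KEYS_PER_SECOND = 100_000_000          # assumed attacker capability
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    for bits in (40, 56, 128):
        keyspace = 2 ** bits               # total number of possible keys
        avg_tries = keyspace // 2          # on average, half the keys are tried
        seconds = avg_tries / KEYS_PER_SECOND
        if seconds < SECONDS_PER_YEAR:
            print(f"{bits}-bit key: roughly {seconds / 3600:.1f} hours on average")
        else:
            print(f"{bits}-bit key: roughly {seconds / SECONDS_PER_YEAR:.1e} years on average")

At the assumed rate a 40-bit key falls in an afternoon, a 56-bit key takes years (a gap that more hardware can close), and a modern 128-bit key is out of reach altogether, which is precisely the asymmetry the academics objected to.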

In 1976, the cryptographers Whitfield Diffie and Martin Hellman published a paper describing public-key cryptography and a practical key-exchange scheme, an approach widely appraised as more flexible, and potentially far stronger, than the new Federal DES system. In 1977, the computer scientists Ron Rivest, Adi Shamir and Leonard Adleman developed this idea into a full public-key encryption scheme and working software dubbed RSA (the initials of its developers). By the 1990s, the RSA encryption engine would be distributed to over 300 million computers world-wide, embedded in the software of Microsoft, Netscape and Novell, as well as in the hardware of IBM machines and Intel processing chips.
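The core of the Diffie-Hellman idea can be demonstrated with a few lines of arithmetic. The sketch below is purely illustrative: the prime is absurdly small and the private values are made up for the example; real systems use numbers hundreds of digits long.

    # Toy Diffie-Hellman key exchange (illustration only).
    p = 23            # public prime modulus, agreed upon in the open
    g = 5             # public generator, also agreed upon in the open

    a_secret = 6      # one party's private value, never transmitted
    b_secret = 15     # the other party's private value, never transmitted

    # Each side publishes g raised to its own secret, modulo p.
    a_public = pow(g, a_secret, p)
    b_public = pow(g, b_secret, p)

    # Each side combines its own secret with the other's public value.
    key_a = pow(b_public, a_secret, p)
    key_b = pow(a_public, b_secret, p)

    assert key_a == key_b          # both sides arrive at the same shared key
    print("shared secret:", key_a)

An eavesdropper who sees only p, g and the two public values is left with the discrete-logarithm problem of recovering either secret, which is computationally infeasible at realistic sizes. RSA rests on a related asymmetry, the difficulty of factoring very large numbers, and extends the idea into a full encryption and signature scheme.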

While these developments were still underway, however, most computer systems around the world were still being structured and maintained along open-access architectures. Just how open they were would soon become apparent. By 1979, there were well over a thousand public computer Bulletin Board Systems in operation around the US, and (like most data resources then available) they often centered on the minutiae of computer maintenance and software exchange. Then, in March of 1980, a Canadian technician working in Silicon Valley at Digital Equipment established something quite new on his own. Using a Digital PDP-8 minicomputer plugged into the wall of his apartment's spare bedroom, Bernard Klatt brought his own public system, 8BBS, on-line, and it became the first public knowledge-exchange for hackers and 'phone phreaks'. Very soon it was being dialed into from across the country, and the system caught the attention of a young LA teenager named Kevin Mitnick and a California hacker named Roscoe. The two met on the BBS, then in the real world, and for the next fifteen years Mitnick would sharpen his talents by infiltrating computer systems at the University of Southern California, TRW Credit Data, NASA, Visa, Digital Equipment Corporation and Bell Telephone, until his final arrest in 1995 and subsequent conviction. 5

These and other widely publicized 'hacks', by individuals like Mitnick or groups with names like the Chaos Computer Club or the Legion of Doom, made headlines in the 1980s and led to a series of crackdowns (the US Secret Service's 1990 "Operation Sun Devil" was the most notorious and contested) throughout the United States and the rest of the world, as business leaders, politicians and legal authorities slowly became aware of the potential for data theft, corporate espionage and system damage. In 1988, a computer science grad student, Robert T. Morris, crippled some 6,000 host computers linked throughout the world when he unleashed the first major Internet 'worm'. In January 1990, AT&T's long-distance network collapsed for the better part of a day, an outage initially blamed on hackers such as 'Acid Phreak' (though it was later traced to a software fault). Many more cases (especially in the banking and credit industries) were routinely kept out of public view in order to maintain consumer confidence.

Herein lies the gravest threat that hackers, viruses and bugs pose to the computer industry and its corporate networks. The loss of money or data, or the expense of down-time, are all secondary (these and other disasters, after all, are why insurance remains an endlessly profitable industry). 6 The primary problem arises when consumers and users begin to waver in their trust and comfort, when they begin to experience unreliability or notice anomalies, when they begin to realize that the system they are presented with is not hack-proof, bug-proof, fool-proof, or, in other words, not reality-proof. Any programmer will explain the horror they feel upon being told that an actual end user, one from the real world, will need to be able to interact with a system. The machine world, with its machine logic, is terribly fragile in the face of real-world interaction, a horrible truth of which most programmers are painfully aware:
I'd like to think that computers are neutral, a tool like any other, a hammer that can be used to build a house or smash a skull. But there is something in the system itself, in the formal logic of programs and data, that recreates the world in its own image...forms an irresistible horizontal country that obliterates the long slow, old cultures of place and custom, law and social life. We think we are creating a system for our own purposes. We believe we are making it in our own image...but the computer is not really like us. It is a projection of a very slim part of ourselves: that portion devoted to logic, order, rule and clarity. It is as if we took the game of chess and declared it the highest order of human existence. 7
However, it is important to understand the reasoning behind hacker culture and its approach to questions of information, especially at the point where agents of the government suddenly begin conducting broad search-and-seizure investigations. 8 The early architecture of computer networks in this era was widely used for the free exchange of information and software by people who cared deeply about what they were producing. It was pride, not the profit motive, which fostered this sensibility. One of the cornerstones of the 'hacker ethic' is the oft-repeated phrase, 'information wants to be free,' and in its own small way even McGill University had a direct hand in fostering this ethic through the technologies it developed at the time:
Network size requires innovative solutions for resource discovery. However, that same size facilitates those innovative solutions. Consider Archie, which collects indexes of files available from anonymous FTP (file transfer protocol) servers...written by a few people at McGill University. Archie logged 120,000 uses in the first seven months of 1991. Most of its usage from 5 PM to midnight EST was from Europe...thus a few people in one country can affect usage patterns for the whole network by providing a service people all over the world want. 9
This sensibility of unrestricted open access and exchange has numerous roots, both inside and outside the computer community. First, most early software development and networking was sponsored by, or took place in, university settings, where peer review has traditionally been considered a vital element of successful testing and reputation-building. From the outset, therefore, most of the material circulated within these circles was treated as 'shareware': experts in the field would distribute the materials they had produced for others to comment upon, modify and improve as they saw fit. The idea of proprietary information in these systems often seemed detrimental to producing a better working application or design (even an operating system like DOS, which drew heavily on work done outside the company, was eventually made a commercial item by Microsoft). 10 Put simply, the notion of 'mine' and 'yours' gets in the way of producing, through cooperation, a better system. The second root appeared precisely when large companies began to closely guard their software source code. The idea of being able to patent strings of ones and zeros, or to restrict access to certain types of data, struck many early computer developers as running contrary to the earlier ideal. 11
Notes:
1 Albert Borgmann, Holding On to Reality: The Nature of Information at the Turn of the Millennium (Chicago: University of Chicago Press, 1999), 218.

2 In 1991, Phil Zimmermann began distributing his PGP (Pretty Good Privacy) software from an anonymous FTP site after he grew increasingly concerned that the US Congress might make 'strong' encryption illegal to use or own for reasons of national security. Zimmermann was soon threatened with prison as federal investigators pursued him under US export laws, and the case dragged on for several years before it was finally dropped. By that time, PGP had become a widely acknowledged benchmark for data communications security. Hundreds of encryption algorithms exist, but only a few have been efficiently incorporated into working software. As a side note on the Congressional findings: in 1994, the self-proclaimed crypto-geek Matt Blaze demonstrated a fatal protocol flaw in the Clipper Chip, the key-escrow scheme the government had proposed as DES's successor, forcing it to acknowledge the system's vulnerability to even modest attacks.* DES itself fared little better: in 1997 a Canadian graduate student at Berkeley, harnessing roughly 250 workstations in the computer faculty working in parallel, cracked a 40-bit export-grade cipher in about three and a half hours, and full 56-bit DES keys were brute-forced publicly soon afterwards.
* Once again, Jetifi has cleared up another technical aspect: in Steven Levy's Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age (New York: Viking, 2001), the collaborative relationship between IBM and the NSA is shown to be very cozy. Levy, incidentally, also wrote Hackers: Heroes of the Computer Revolution (Anchor Press/Doubleday, 1984), an excellent early survey of 70s and 80s computer subculture. There's also Bruce Sterling's classic The Hacker Crackdown.
3 Simson Garfinkel, Database Nation: The Death of Privacy in the 21st Century (Cambridge: O'Reilly, 2000), 23.

4 Katie Hafner and John Markoff, Cyberpunk: Outlaws and Hackers on the Computer Frontier (New York: Simon and Schuster, 1991), 68, and Alan F. Westin, Databanks in a Free Society: Computers, Record-Keeping and Privacy (New York: Quadrangle Books, 1972), 134-138.
Important note of clarification by blaff: DES is a conventional (secret-key) encryption algorithm, whereas DH is not an encryption algorithm at all, but rather the first published key-exchange algorithm. It has to be used in combination with a conventional algorithm, like DES or triple-DES. By contrast, RSA is a true public-key encryption algorithm (not just a key-exchange scheme), but because it is so slow it is typically used in practice as a key-exchange mechanism, wrapping the key for a faster conventional cipher. Also, randombit points out that DES was made a standard in 1976, and therefore predates my chapter title. A very good point.
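The division of labour blaff describes, a slow public-key algorithm used only to move the key for a fast conventional cipher, can be sketched in a few lines. Everything here is illustrative: the RSA numbers are textbook-sized toys, and a repeating XOR stream stands in for DES, which is far too involved to reproduce here.

    import os

    # Toy RSA key pair (real keys use primes hundreds of digits long,
    # and the padding schemes omitted here matter greatly in practice).
    p, q = 61, 53
    n = p * q                              # public modulus (3233)
    e = 17                                 # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

    # Step 1: choose a random key for the fast 'conventional' cipher.
    session_key = os.urandom(8)            # stand-in for a DES-style key

    # Step 2: use the slow public-key operation only to wrap that small
    # key, here one byte at a time for simplicity.
    wrapped = [pow(b, e, n) for b in session_key]

    # Step 3: encrypt the bulk data with the fast conventional cipher;
    # a repeating XOR stream stands in for DES.
    def xor_stream(data: bytes, key: bytes) -> bytes:
        return bytes(c ^ key[i % len(key)] for i, c in enumerate(data))

    ciphertext = xor_stream(b"attack at dawn", session_key)

    # Receiver: unwrap the session key with the private exponent, then
    # decrypt the bulk data with the conventional cipher.
    recovered = bytes(pow(c, d, n) for c in wrapped)
    assert xor_stream(ciphertext, recovered) == b"attack at dawn"
    print("round trip ok")

In a real system DES, triple-DES or a modern cipher plays the conventional role and the public-key wrapping uses proper padding, but the pattern is exactly the one described above.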
5 Mitnick was a prodigious and self-taught UNIX hacker who dived through the dumpsters of companies whose computers he planned to infiltrate, lived off fast food and pulled all-night marathon computer sessions. He was brought before California courts on numerous occasions but was not jailed until his 1988 arrest for stealing the source code of DEC's new operating system (and using hacked computer storage-space at Berkeley to bury his treasure). After he was released on probation, his probation officer's phones were re-routed, the judges who presided over his sentencing had their credit cards cancelled, and all electronic records of Mitnick's criminal career vanished. He went underground and was not caught until 1995; he remained in jail until early 2000, and his release back into society was contingent on his near-total removal from any computerized device beyond a calculator. According to the conditions of his supervised release, Mitnick was not even permitted to use a cellular telephone. See articles such as Wired's "Did Sun Inflate Mitnick Damages?" (http://www.wired.com/news/politics/0,1283,19820,00.html) or "How Much Damage Did Mitnick Do?" (http://www.wired.com/news/politics/0,1283,19488,00.html) for more recent developments in this saga.

6 Paul Pimentel, "Insurers see interest in cyber-policies: as e-business increases, companies vulnerable to viruses, hackers," Globe and Mail, May 10, 2000, sec. Business Focus, B11.

7 Ellen Ullman, Close to the Machine (San Francisco: City Lights, 1997), 89.

8 One of the more influential non-profit Internet organizations in this area is now the Electronic Frontier Foundation (EFF), formed just after some 150 Secret Service agents executed search warrants on the homes of suspected computer hackers around the United States in Operation Sun Devil. The group was kick-started and funded in its early days by Mitch Kapor, founder of the Lotus software company, and includes the noted cyber-enthusiast John Perry Barlow; it helps raise awareness about civil rights, privacy issues and the legal repercussions of a networked world.

9 John S. Quarterman, "Telecomputing in the New Global Networks," Communications in History (1995): 347.

10 MS-DOS sold for $60 in 1981 versus IBM's version which was selling for four times that price. See Joel Shurkin, Engines of the Mind: The Evolution of the Computer from Mainframes to Microprocessors (New York: W.W. Norton, 1996), 315.

11 Now the most widespread manifestation of this counter-movement away from proprietary information is the notion of 'copyleft', established by the international GNU Project (founded by Richard Stallman in 1983), which supplies much of the toolchain used with Linux. Linux itself is the creation of Linus Torvalds, a Finnish programmer who, beginning in 1991, wrote a free kernel modeled on the UNIX operating system developed at Bell Labs in the 1970s; combined with the GNU tools, it is now available in dozens of different 'distros' (distribution versions). Copyleft essentially allows users and developers to take what they want from the public source code, but requires that whatever they build upon or derive from that code be left available to others on the same terms. This is the ethos behind an entire movement in software now called open source. By the mid-1990s one particular piece of open-source software, a web server called Apache, had become the most popular in the world (it was both incredibly stable and totally free), which in turn pushed IBM to begin supporting it as a product in 1998. Since that time, the Linux O/S has continued to gain ground and has put real competitive pressure on Microsoft's operating-system business. Microsoft, by some estimates, could lose as much as 20% of its annual sales to open-source alternatives by 2004. See David Akin, "Open source movement grows: study," National Post, Financial Post section, August 28, 3.
