The computer industry began with monstrously large machines, the first generations of which filled entire rooms. Nowadays, a personal computer many times as powerful fits on a desktop. This scaling down took place over roughly fifty years in three distinct phases: the room-sized mainframes of the 1950s, the minicomputers of the late 1960s and early 1970s, and finally the personal computer of the late 1970s. Each transition was made possible by two changes in manufacturing: lower production costs and improved, miniaturized technology. These factors, along with pressure from consumers and industry, shaped the computer and software industries into the ones we know today.

The first major transition was from mainframe computers to the smaller yet still powerful minicomputers. To understand this transition, it helps to know what differentiates a mainframe from a minicomputer. Mainframes were enormous machines because the technology they used took up a great deal of space. At first, they used bulky vacuum tubes as their logic elements. After the invention of the transistor, machines were built with transistors instead, but they remained large: designers traded the space saved by eliminating the tubes for additional circuitry to improve performance.

The general pattern of use for these mainframes was for the machine to be minded and operated by a trained staff, who would run programs for others and return the results. Most users therefore never interacted directly with the machines. Since computer time was so much more expensive than human time, programming was done offline: programs were designed, written, and carefully examined before the first run was ever attempted. Having to rerun a corrected program because of an error was a waste of expensive resources.

With mainframe computers, the larger the machine, the better it performed relative to its price. This trend was summed up by industry pundit Herb Grosch, who stated that "the performance of a computer varies as the square of its price." This assertion became known as Grosch's law, and it encouraged the purchase of a single large computer to be time-shared among a group of users, since that would provide more performance per dollar than buying a smaller computer for each of them.
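
A quick back-of-the-envelope sketch, using entirely made-up numbers, shows why that square relationship favored one large shared machine. The C snippet below is purely illustrative and assumes nothing beyond the law as Grosch stated it.

```c
/* Illustrative sketch only: hypothetical prices and an assumed
   Grosch-style relationship (performance proportional to the
   square of price). */
#include <stdio.h>

int main(void) {
    const double small_price = 1.0;   /* one budget machine, arbitrary units */
    const double big_price   = 10.0;  /* one machine at ten times the price  */

    double small_perf = small_price * small_price;  /* = 1 unit   */
    double big_perf   = big_price * big_price;      /* = 100 units */

    /* Ten small machines cost as much as the big one,
       yet deliver only a tenth of its performance. */
    printf("ten small machines:       %.0f units of performance\n", 10 * small_perf);
    printf("one shared large machine: %.0f units of performance\n", big_perf);
    return 0;
}
```

Under that assumption, pooling a budget into one time-shared machine always beats splitting it across smaller ones, which is exactly the economics the computer utility model was built on.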

This led to the idea of the computer utility, in which computation would be performed at a central plant and sold on demand, rather than each site owning its own computer. For a while, it seemed this might catch on, and by 1967 there were twenty firms in the business of providing computing services. By 1971, however, the bottom had dropped out of the market, and the majority of these firms were foundering. This was partly a result of software crises among the computer utility providers: as the systems became more powerful, their software became harder to write. IBM experienced this with its time-sharing operating system for the System/360, and GE had the same problem with the Multics operating system. The lessons learned during these projects, some of the first major software engineering efforts ever conducted, proved important to later attempts to develop complicated software.

These software failures contributed to the downfall of the computer utility and the central mainframe model of computing, but the drastic drop in the cost of electronics from 1965 to 1970 also played an important role. This price reduction came about because of the invention of the integrated circuit, which also brought a terrific decrease in size: a computer that would have filled a room in 1965 now took up the space of a table and cost a tenth as much as its 1965 counterpart. These cost reductions undermined Grosch's law, and the market shifted from paying subscription fees to a centralized time-sharing service to buying a small time-sharing computer that provided in-house computing. This became standard practice among universities, businesses, and laboratories, and these smaller time-sharing systems were known as minicomputers.

With minicomputers, institutional attitudes toward computer use changed. Computer use had previously been restricted to sanctioned users only, but minicomputers were so much less expensive than mainframes that computer time was at less of a premium, and more users got to interact directly with the machine. And since the cost of computer time had come far closer to the cost of human labor, users began to work interactively with the computers, doing far less preparation before sitting down at the terminal with their code. Computing time had become a less valuable resource, so using the system to work bugs out of programs was no longer such a waste. The ability of students to use computers in this hands-on fashion produced more career computer scientists and generated interest in computing as a hobby.

Software for minicomputers also changed to some extent. Each mainframe system had required its own proprietary operating system, making it impossible to share applications between systems or to upgrade a system without rewriting its applications. The advent of Unix and C on several platforms made it possible to move code from machine to machine, because Unix and C acted as a layer over the hardware, presenting a uniform environment from the programmer's point of view. This was a great advantage for commercial software developers, since the market for a given program could be any machine running Unix, instead of the one specific mainframe platform for which the application was written.
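
As a rough sketch of what that portability meant in practice, consider a toy C program that touches nothing machine-specific, only the standard C library (the file name here is invented for illustration). The same source could be recompiled unchanged on any Unix system, whatever hardware sat underneath.

```c
/* A minimal sketch of the portability argument: nothing here refers to a
   particular machine, only to the standard C library, so the same source
   compiles on any Unix system with a C compiler. */
#include <stdio.h>

int main(void) {
    FILE *out = fopen("results.txt", "w");  /* hypothetical output file */
    if (out == NULL) {
        perror("fopen");
        return 1;
    }
    fprintf(out, "Same source, different hardware.\n");
    fclose(out);
    return 0;
}
```

The compiler and the operating system absorb the differences between machines, which is the "layer over the hardware" that made a single program marketable across many vendors' systems.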

Business attitudes also changed with the development of minicomputers. Where computer companies in the 1960s had tried to provide total solutions for their customers, the new generation of computer businesses focused on delivering working systems at minimal cost. An example is Digital Equipment Corporation (DEC), whose PDP-8 is considered the first minicomputer. Instead of going after the business market, DEC pursued customers in science and engineering. These customers required less support than business customers, needed less specialized hardware, and were often capable of writing their own software, all of which eliminated a great deal of cost for DEC. In addition, later models of the PDP-8 were built with integrated circuits, reducing manufacturing costs even further. The low cost and high power of the PDP-8 made it a huge commercial success.

The next step in the evolution was the transition from the minicomputer to the microcomputer. Once again, the driving force was a technological advance that substantially decreased the computer's cost and size. That advance was the microprocessor, and the chips that led most directly to the microcomputer revolution were all manufactured by Intel. This time, however, the pressure behind the change came not from industry but from electronics hobbyists.

The microprocessor itself was developed by Intel engineer Ted Hoff as a general-purpose processing unit for a calculator that Intel had been commissioned to build. Its potential as a "computer on a chip" wasn't immediately apparent to Intel, but the company eventually realized what it had on its hands, and after acquiring the rights to the 4004 chip, it began to market it as the first microprocessor. A later descendant of this chip, the Intel 8080, became the heart of the MITS Altair kit, the ancestor of the modern microcomputer.

The Altair was a machine with extremely limited capabilities, and it shipped as a kit of parts to be assembled by the customer. This meant that the only people who ordered it were hobbyists skilled enough to assemble, solder, and test a complicated electronic device. Even so, that turned out to be a larger number of customers than MITS's founder and president, Ed Roberts, had hoped for.

The Altair wasn't much use for performing real computational tasks, but its existence as a computer an individual could afford generated so much interest that it eventually led to the commercial microcomputer. Hobbyists interested in the Altair and in building their own computers began to hold meetings to discuss their interests and share what they had built. The most famous of these gatherings was the Homebrew Computer Club, and it was these meetings that led Steve Jobs and Steve Wozniak to build the Apple Computer.

Based on the interest he saw at the Homebrew meetings, Jobs believed there was a commercial market for a new, better homemade computer. Wozniak had already designed the board that would become the heart of the first Apple Computer, and Jobs convinced him to help market it. Using Jobs' parents' garage as their workshop, the two built and sold two hundred of the machines. After this success, Jobs secured venture capital, and their company put Wozniak's new design on the market as the Apple II. Commodore released a personal computer of its own around the same time the Apple II came to market, and Tandy released machines later that year. IBM eventually released a personal computer as well, and the world had fully entered the microcomputing age.

The Apple II was the most important of the first-generation commercial microcomputers, because it was the first personal computer to see a great deal of practical use rather than serving as just a toy for hobbyists. The key piece of software that made the difference in the acceptance of microcomputers as legitimate business tools was VisiCalc, the first spreadsheet program and a strong selling point for the Apple II among anyone working with financial projections. VisiCalc automated the tedious task of calculating cascading projections by hand, which not only saved labor but also eliminated human-introduced errors. It was most likely the success of VisiCalc and its imitators in the business market that attracted IBM's interest, and once IBM entered the market, corporate buyers accustomed to dealing with IBM felt confident enough to invest in microcomputer technology, securing the platform a lasting place.
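
To make "cascading" concrete, here is a small illustrative sketch, with invented figures, of the kind of projection a VisiCalc user might have set up: each period's value feeds into the next, so changing one assumption forces every later figure to be recalculated.

```c
/* Illustrative sketch of a cascading projection: hypothetical revenue
   figures compounding quarter over quarter, the sort of dependent
   calculation a spreadsheet recomputes automatically. */
#include <stdio.h>

int main(void) {
    double revenue = 10000.0;     /* hypothetical starting revenue      */
    const double growth = 0.05;   /* hypothetical 5% growth per quarter */

    for (int quarter = 1; quarter <= 8; quarter++) {
        revenue *= 1.0 + growth;  /* each quarter depends on the last */
        printf("Q%d projected revenue: %.2f\n", quarter, revenue);
    }
    return 0;
}
```

Done by hand, a change to the growth assumption means redoing every line of the projection; in a spreadsheet, the dependent cells update themselves, which is precisely the labor and error savings described above.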

Once microcomputers hit the market, institutional computing resources became just one way to get access to a computer. Computer time had become cheap enough that cost no longer needed to be a consideration, and large institutions had the option of putting a machine on every desk. In practice, though, it was more common for employees to bring their own computers into work than for the institution to provide them. This fragmentation of computing resources produced the so-called "sneaker network," in which data was transferred between departments by physically carrying a disk from one office to another. The resulting efficiency and organizational problems eventually helped drive the adoption of computer networking. From this point on, users could interact with computers as directly as they wished, since human labor was now the most expensive part of any computing operation.

Microcomputers also created a whole new market for commercial software. With so many more computers in operation, the potential customer base became much larger. The microcomputer software market began with Altair BASIC, developed for the machine by Bill Gates and Paul Allen of Microsoft and sold to MITS on a royalty basis. Many Altair users made and shared copies of the program, as was standard practice for ideas and software among hobbyists. Microsoft, however, was not interested in giving away its hard work for free, and Gates let the community know his stance on the subject in an open letter to the hobbyists. Industry and consumers have both come to accept Gates' position, and today computer software is as big a market as hardware.

The size of computers has changed enormously over the past fifty or so years, from mainframes the size of a room to a microcomputer that takes up only part of a desktop. This scaling down took place in three phases, from room-sized mainframes, to table-sized minicomputers, to the personal computer. Each time, changes in the manufacturing process decreased costs while improving and miniaturizing the technology. These factors, combined with pressure from consumers and from new companies that took advantage of the improved technology, changed the computer market into its current form, transforming attitudes toward computing and software development techniques along the way.
