A device used for computing; specifically, an electronic machine which performs rapid, often complex calculations, or which compiles, correlates, and selects data in inference-like operations. These operations are defined over a set of representational structures (information) and proceed by the manipulation of these structures in accordance with a fixed set of rules (instructions).

See also: digital computer, analog computer
A Civilization advance.
A computer is a device capable of performing a series of arithmetic or logical operations for a specific function. Computers run set routines, called programs, that combine these operations into complex tasks. Significant advances in computer technology take place at an amazing pace.
Prerequisites: Electronics and Mathematics.
Allows for: Space Flight and Robotics.

Computer is a song by Bob Mintzer. This is a great song that I played at PMEA Jazz Band. Most of it is a funk rock song, comparable to Chameleon as played by Maynard Ferguson and composed by Herbie Hancock. In the middle and end, the instrumentalists come in, one section at a time, playing a simple little melody. Eventually, the entire band is playing individual simple melodies, and the effect is like a computer analyzing many things at once.

A person who computes;
who calculates.
Cool and calculating, he concentrates on
the computer screen.
Concerned, he combs through his code,
his cerebral cortex confounded.
It’s certain, the clunky
chaotic
mass
will not be completed tonight.

But then the chap comes up with something
clever. “A-ha! Eureka! Woot!”
His fingers cut through the keys as his consciousness kisses the conundrum
good-bye.



Node your homework

Computers
Or
From steam power to quantum tunneling in 150 years


Any history and description of computers is immediately faced with the task of determining what truly constitutes a computer. A basic definition from typical web sources spits out:

com·put·er (kəm-pyōō'tər) n.

1. A device that computes, especially a programmable electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.
2. One who computes.
This node concerns itself with definition 1, a broad category of machinery: mechanical, electrical, whatever. While most modern computers are purely electrical devices, the history of computers is steeped in mechanical contraptions of ever-increasing complexity. A full history would be excessive here, and is well documented in many places over the web. A brief synopsis, therefore, of that long history is presented here:
  1. ~3000 BCE: The abacus is invented, apparently in Babylonia. Simple to construct, but allows addition and multiplication to be done with relatively high speed.
  2. 1623 CE: Wilhelm Schickard develops a calculating clock to help with multiplying large numbers, the first true mechanical calculating machine. No copies of the machine remain today.
  3. 1801 CE: Joseph-Marie Jacquard constructs an automatic loom, using punch cards (making them the longest enduring computer data storage format), a precursor to today's robotic assembly lines. The loom could create intricate woven patterns based on sequences of punch cards. Riots later erupt over such machinery, blamed for replacing people with machines.
  4. 1823 CE: Charles Babbage is given a grant by the English government to develop a full Difference Engine, a steam-powered polynomial evaluator. Babbage planned to use the Engine to recalculate critical mathematical and navigational tables. Ten years later, Babbage conceived of the Analytical Engine, a true modern computer. It was to have a memory store and an execution unit, and it would operate on formulas (in modern usage, computer programs), allowing it to calculate any expression. Ada Byron, Countess of Lovelace, corresponded with Babbage extensively over the design, documented it thoroughly, and became the first known programmer when she developed formulas for the planned machine. Unfortunately, the design was far too ambitious for the technology of the era, and an Analytical Engine was never built. In the end, neither was the Difference Engine; the half-built machine now resides in the Science Museum in London. Babbage is considered to be the inventor, if not the implementor, of the true computer.
  5. 1854 CE: George Boole publishes "An Investigation of the Laws of Thought", developing Boolean logic, a system of logical algebra. Boole created the algebra to evaluate the truth of logical propositions; his system is now the basis of every digital computer.
  6. 1939-1944 CE: Konrad Zuse, a German engineer, completes the Z2, a machine using electromechanical relays and Boolean logic, but otherwise very similar in basic design to Babbage's Analytical Engine. Conscripted into the army, he led a team in designing the Z3, which had a 64-number memory, with each number being 22 binary digits (bits) long. The machine could perform a multiplication in 5 seconds. The Z3 was finished in 1941, and was the first-ever working general-purpose programmable computer. Unfortunately, it was destroyed in an air raid on Berlin in 1943. Meanwhile, research in the United States followed similar lines: Howard Aiken constructed the Harvard Mark I, using electromechanical relays, with speeds similar to the Z machines. John Atanasoff and Clifford Berry designed and partially constructed a fully electronic machine based on electronic valves, but the Atanasoff-Berry Computer (or ABC) was never fully completed.
  7. 1943 CE: The Colossus is constructed to help break the German Lorenz teleprinter cipher (a system separate from the better-known Enigma). Designed by Tommy Flowers for the codebreakers at Bletchley Park, England (where Alan Turing, one of the pioneers of Computer Science, also worked), the Colossus can be considered the first completely electronic computer, even though it could only perform the specialized task of codebreaking.
  8. 1945 CE: The ENIAC is completed under a U.S. Government grant. Designed by J. Presper Eckert and John Mauchly, it was the first working fully electronic general-purpose computer. It was also quite large, measuring roughly 100 by 10 by 3 feet and weighing roughly 30 tons. It could perform a multiplication in less than 3 milliseconds.
  9. 1954 CE: The TRADIC becomes the first computer to use transistors as a replacement for the valves, setting the stage for the rapid growth in computer complexity predicted by Moore's Law.

Starting in the 1950s, computers have grown in complexity in both software and hardware with enormous speed. Today, a hand-held calculator has more power than the computers that guided the Apollo modules. However, while computing systems have become more and more complex, they are still based on the same principles that Babbage and Boole described.

All modern computers share these traits, listed below:

  • Binary Operation: All computers of any complexity have been based on Boolean logic, where there are only two values: True and False, easily represented in electronic circuitry as two different voltages. A base-two system is therefore a natural platform for a computer built on switches, operating with Boolean logic.
  • A Memory: The memory is used to store information that the computer is processing, or to archive data that might be accessed later. Computer memory has ranged from hard-wired switches, to punch cards, drum memory, bubble memory, floppy disks, hard drives, optical disks, solid-state memory, and experimental media such as holographs and quantum dots. They allow the storage of arbitrary binary numbers, some with easy rewritability, some without. Capacities, access speeds, and portability vary widely across the different formats.
  • Execution Units: These take binary values from the memory, and operate on them, returning the result into the memory. In modern systems, several execution units are placed on each CPU chip. The units are often specialized, some working with integer values, others with real numbers represented in binary. Recent advances include SIMD units (such as AltiVec), pipelined execution, and many other features to increase speed.
  • A Program: A program is a sequence of instructions for the computer, telling it what to do with the data stored in its memory. At the lowest level, a program is encoded in machine code, direct binary values that each encode a single action for the computer, such as reading a value from memory, or adding two values together. As computers have evolved in complexity, high-level programming languages (such as C, Ada, and LISP) have been created to simplify the creation of complex programs. These languages hide the lower-level details from the programmer, using compilers to automatically convert abstract program code into the computer's native machine code. For small, embedded computer systems, programmers sometimes still work in assembly languages, which use simple English mnemonics for each machine code instruction.
  • A Control Unit: The control unit takes in the program, reads in the machine code values (or bytes), and then operates the memory and execution units accordingly. The control is often a state machine of varying complexity. Separating the control and the execution units makes designing computers much easier, since the control only needs to tell the execution units 'do this', and the execution units do not need to deal with the details of the machine code interpretation. The control and execution units, and sometimes limited amounts of memory, are typically on a single silicon chip, known as the Central Processing Unit, or CPU. (A toy sketch of this fetch-and-execute cycle appears just after this list.)
  • A Clock: Almost all modern computers run off very precise quartz crystal clocks, with speeds currently measured in megahertz and, more recently, gigahertz. The clock makes the sequencing of operations in a computer simpler, guaranteeing valid results from one stage of computation to the next. However, synchronous designs, as clocked digital computers are called, are not the only modern method; asynchronous VLSI techniques are also being investigated.
  • A User Interface: The ability to compute would be of little use if there were no way to input new data, or read out the results. User interfaces for computers have also evolved a great deal over the last fifty years, from blinking rows of lights showing memory values, to modern monitors, keyboards, and mice. Additionally, computers are now typically connected to others through computer networks (such as the Internet, of course), allowing for widespread information exchange.
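
To make the interplay between the memory, the control unit, and the execution units more concrete, below is a minimal sketch in C of the fetch-and-execute cycle described in the Control Unit entry. The three-instruction machine code, the opcode values, and the memory layout are invented purely for illustration; no real CPU is anywhere near this simple.

    #include <stdio.h>

    /* A toy machine: a small memory, two registers, and a hypothetical
     * four-instruction machine code. Everything here is invented for
     * illustration; no real CPU works exactly like this. */
    enum { OP_HALT = 0, OP_LOAD0 = 1, OP_LOAD1 = 2, OP_ADD = 3 };

    int main(void)
    {
        /* Program and data share one memory, just as in a real computer. */
        unsigned char memory[16] = {
            OP_LOAD0, 12,   /* copy memory[12] into register 0            */
            OP_LOAD1, 13,   /* copy memory[13] into register 1            */
            OP_ADD,   14,   /* add the registers, store at memory[14]     */
            OP_HALT,
            0, 0, 0, 0, 0,  /* unused                                     */
            5, 3, 0         /* the two operands and a slot for the result */
        };
        unsigned char reg[2] = { 0, 0 };
        int pc = 0;                                 /* program counter */

        for (;;) {                                  /* the fetch-and-execute cycle */
            unsigned char opcode = memory[pc++];    /* fetch the next instruction  */
            if (opcode == OP_HALT)
                break;
            unsigned char operand = memory[pc++];   /* decode: read its operand    */
            switch (opcode) {                       /* execute                     */
            case OP_LOAD0: reg[0] = memory[operand];           break;
            case OP_LOAD1: reg[1] = memory[operand];           break;
            case OP_ADD:   memory[operand] = reg[0] + reg[1];  break;
            }
        }
        printf("result: %d\n", memory[14]);         /* prints "result: 8" */
        return 0;
    }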

A modern computer has a multitude of layers of software and hardware between the user and the actual transistor gates which implement the computer. Let's take a quick plunge through a simple action, and see how it passes through the system:

A user has opened the Windows calculator, and has so far entered '5', '+', '3' on the keyboard. They now press the '=' key.

  1. An electrical switch closes under the equals key on the keyboard, causing a brief surge of current as an electrical wire changes its voltage potential.
  2. The keyboard's small processor, scanning through each row of keys hundreds of times a second, detects the change in voltage levels in its input wires. It decodes the key based on the row and column pressed. Data describing the event is then sent to the main computer.
  3. The computer's interrupt controller, a system for picking up and handling external events, receives the alert from the keyboard. Assuming no higher-priority tasks are executing in the computer, the controller issues an interrupt to the control unit, telling it to immediately switch tasks and pay attention to the keyboard.
  4. The control unit consults an internal interrupt vector table, all stored in nearby memory cells, and issues commands to fetch the next program bytes from the memory location listed for a keyboard interrupt.
  5. The interrupt routine, a small piece of independent code, is loaded into the control unit, which begins its execution. The code fetches the keypress information, and places it in a queue data structure in memory for the rest of the operating system. The routine then ends, and the control unit switches back to whatever it was doing before the interrupt.
  6. Shortly afterward in machine time, and practically instantaneously in human time, Windows executes a basic, low-level routine which checks for new user inputs. Seeing the keypress, the routine determines which running program is the recipient, and places a keypress event on that program's event queue. Each of these actions takes hundreds of machine code steps.
  7. The Windows scheduler, the master controller of which program is allowed to use processing time, passes control to the calculator program.
  8. The calculator program's main loop sees the new keypress event, and reads it in. Its internal program logic determines the significance of pressing the equals key, and it begins to process the actual calculation. It may be interrupted at any point by the scheduler to allow more critical routines to execute (such as interrupt routines).
  9. In C code, the actual calculation might look like this (a fuller sketch of the surrounding event loop appears after this walkthrough):
    ... case EQUALS: result = op1 + op2; break; ...
  10. The calculator program, having been compiled into machine code long before, runs through the computer's control unit step by step. Finally, the calculation itself is reached. The computer must transfer two values from the main memory into an execution unit, which then performs the addition. The result must then be moved back into memory. In assembly language, the code might look like this:
    MOV EDX, 00014B24
    MOV EAX, [EDX]
    MOV EDX, 00014B28
    MOV EBX, [EDX]
    ADD EAX, EBX
    MOV EDX, 00014B2C
    MOV [EDX], EAX
    which, to an assembly programmer, is not hard to read. To most people, it is complete gibberish. The long numbers preceded by zeros are the memory locations of the values used in the calculation, represented in hexadecimal, or base-16. The MOV command instructs the computer to move a piece of data from one place to another. The ADD instruction actually performs the addition.
  11. The machine code equivalent of the ADD on an Intel x86-compatible processor would be 0000 0011 11 000 011, or in hexadecimal notation, 03C3. The control unit in the CPU reads in this value, and interprets it as an addition of two values currently stored in some of the CPU's registers. The control unit sends out signals that transfer the values in the registers to the execution unit, and then signals to the unit to begin addition.
  12. The execution unit contains an adder circuit, a fairly standard piece of electronics. It routes the two values to the inputs of the adder, and waits a clock phase or two.
  13. Inside the adder circuit, the binary digits, all low/high voltage signals, race through transistors and wires. A simple adder repeats the same one-bit circuit for each pair of digits, with two single-digit inputs, a carry in, a carry out, and the sum output (a C sketch of this one-bit adder appears after this walkthrough). The outputs (Sum, Carry Out) are determined from the inputs (A, B, Carry In) according to the following table:
    A   B   Carry In  |  Carry Out   Sum
    0   0      0      ->     0        0
    0   0      1      ->     0        1
    0   1      0      ->     0        1
    0   1      1      ->     1        0
    1   0      0      ->     0        1
    1   0      1      ->     1        0
    1   1      0      ->     1        0
    1   1      1      ->     1        1

    Using Boolean logic, the above can be described as
    Sum = ((A XOR B) XOR Carry In)
    Carry Out = ((A AND B) OR (A AND Carry In) OR (B AND Carry In))
    The circuitry can be implemented in several ways, depending on the underlying technology.
  14. At the lowest level, we have a single transistor, one of typically millions on a single CPU. Most digital logic is done with the CMOS process, which uses MOSFET transistors. Here, a P-MOS transistor that is currently on receives a high voltage at its gate contact. The gate voltage causes the conducting channel between the source and drain to close off, switching the transistor off. A short while later, the voltage drops again, turning the transistor back on. This switching eats up a tiny bit of current each time. Multiplied millionfold, the result is a modern CPU, requiring constant active cooling to keep the chip from frying.
  15. Now the result of the calculation percolates upward through the layers of hardware and software, until finally, the user sees some glowing phosphors on their screen, displaying an '8'. Quite a bit of work for a simple result.
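
As noted in step 9, that single line of C sits inside a larger event-handling loop. The sketch below shows roughly what such a loop might look like in C. The event structure, key codes, and hard-coded event queue are invented for illustration; the real Windows calculator is, of course, far more elaborate.

    #include <stdio.h>

    /* Invented key codes and event structure, for illustration only. */
    enum keycode { KEY_DIGIT, KEY_PLUS, KEY_EQUALS };
    struct key_event { enum keycode code; int digit; };

    int main(void)
    {
        /* Pretend event queue: the user has typed 5, +, 3, = . */
        struct key_event queue[] = {
            { KEY_DIGIT, 5 }, { KEY_PLUS, 0 }, { KEY_DIGIT, 3 }, { KEY_EQUALS, 0 }
        };
        int op1 = 0, op2 = 0, result = 0, entering_op2 = 0;
        int i;

        for (i = 0; i < 4; i++) {              /* the program's main loop      */
            switch (queue[i].code) {           /* dispatch on the keypress     */
            case KEY_DIGIT:                    /* build up the current operand */
                if (entering_op2)
                    op2 = op2 * 10 + queue[i].digit;
                else
                    op1 = op1 * 10 + queue[i].digit;
                break;
            case KEY_PLUS:                     /* switch to the second operand */
                entering_op2 = 1;
                break;
            case KEY_EQUALS:                   /* the line shown in step 9     */
                result = op1 + op2;
                printf("%d\n", result);        /* prints 8                     */
                break;
            }
        }
        return 0;
    }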
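
And as noted in step 13, here is a sketch in C of that one-bit adder, chained into a simple ripple-carry adder for two 8-bit numbers. Real adders are wired from logic gates rather than written in C, and the function and variable names here are invented, but the logic follows the truth table and Boolean expressions above exactly.

    #include <stdio.h>

    /* One-bit full adder: implements the truth table and Boolean
     * expressions from step 13 directly. */
    static void full_add(int a, int b, int carry_in, int *sum, int *carry_out)
    {
        *sum       = (a ^ b) ^ carry_in;                        /* Sum = (A XOR B) XOR Carry In      */
        *carry_out = (a & b) | (a & carry_in) | (b & carry_in); /* Carry Out = majority of the inputs */
    }

    int main(void)
    {
        /* Ripple-carry adder: the carry out of each one-bit stage
         * feeds the carry in of the next. */
        int a = 5, b = 3, carry = 0, result = 0;
        int bit;

        for (bit = 0; bit < 8; bit++) {
            int sum;
            full_add((a >> bit) & 1, (b >> bit) & 1, carry, &sum, &carry);
            result |= sum << bit;
        }
        printf("%d + %d = %d\n", a, b, result);   /* prints "5 + 3 = 8" */
        return 0;
    }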

It should be clear from the above that even a simple task in a modern computer is quite complex below the surface, requiring hundreds of actions to all work together in unison. A single failure anywhere along the path, and the system breaks. Sometimes it is surprising that computers work as well as they do.

Computers are used for millions of applications; they are in everything from toothbrushes to million-dollar factory machinery. And ever-increasingly, they are connected together, allowing information to flow in unprecedented ways. It's hard to say where it'll all lead.

Modern research on computing focuses on several fields. First, many researchers are actively trying to shrink the transistors on silicon chips ever further. Commercial companies are currently trying for 0.09 micron technology, where the smallest feature that can be created is 0.09 microns wide. A second field of research is quantum computers, where computations are performed in ways radically different from those of current digital systems. Computer science research is pushing forward on dozens of fronts, making computers more complex, more powerful, and hopefully more useful daily.

A hundred years ago, computing machines were piles of gears and sprockets, and looked like complex typewriters. Now, they exploit results from quantum mechanics (for example, the tunneling behind Flash ROM) in their operation, and perform billions of calculations per second.





Corrections, nitpicks, etc, are welcome
Created for the Content Rescue Quest

Sources:
www.dictionary.com
www.hofstra.edu/Academics/HCLAS/CSC/ComputingHistory/CompHist_timeline.cfm
www.csc.liv.ac.uk/~ped/teachadmin/histsci/htmlform/lect4.html
developer.intel.com/design/pentium/manuals/24319101.pdf

Com*put"er (?), n.

One who computes.


© Webster 1913.
