The defining feature of a "universal computer" is programmability, which allows the computer to emulate any other calculating machine by changing a stored sequence of instructions. In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark point in programmability.
In 1837 Charles Babbage described his Analytical Engine, the plan of a general-purpose programmable computer, employing punched cards for input and a steam engine for power. While the plans were probably correct, disputes with the artisan who built parts, and the end of government funding, made it impossible to build. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by L. F. Menabrea, and has become closely associated with Babbage. Some claim she was the world's first computer programmer; however, this claim and the value of her other contributions are disputed by many. The Difference Engine II has been built and is operational at the London Science Museum; it works just as Babbage designed it and has disproven the theory that Babbage was incapable of manufacturing parts of the required precision.
In 1890 the United States Census Bureau used punched cards and sorting machines designed by Herman Hollerith to handle the flood of data from the decennial census mandated by the Constitution. Hollerith's company eventually became the core of IBM.
In the twentieth century, electricity was first used for calculating and sorting machines. Earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors. Before World War II, mechanical and electrical analog computers were the 'state of the art', and many thought they were the future of computing. Analog computers use continuously varying physical quantities, such as voltages, currents, or the rotational speed of shafts, to represent the quantities being processed. An ingenious example of such a machine was the Water integrator built in 1936. Unlike modern digital computers, analog computers are not very flexible and must be reconfigured (i.e., reprogrammed) manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems while the earliest attempts at digital computers were quite limited. But as digital computers became faster and gained larger memories (e.g., RAM or internal store), they almost entirely displaced analog computers.
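To make the contrast concrete in modern terms, the following sketch (illustrative Python, not period code; the function and parameter names are invented for this example) shows a digital computer approximating, in discrete arithmetic steps, what an analog integrator computed by letting a physical quantity vary continuously:

```python
# Illustrative sketch: a digital approximation of what an analog
# integrator did physically. An analog machine solving dx/dt = -x would
# let a voltage or shaft speed decay continuously; a digital machine
# approximates the same integral as a sequence of discrete steps.

def digital_integrate(x0, rate, dt, steps):
    """Approximate x(t) for dx/dt = -rate * x with fixed-size time steps."""
    x = x0
    for _ in range(steps):
        x += -rate * x * dt   # each step is one discrete arithmetic operation
    return x

# The exact answer is x0 * e^(-rate * t); for x0 = 1 at t = 1 it is ~0.3679.
print(digital_integrate(x0=1.0, rate=1.0, dt=0.001, steps=1000))  # ~0.3677
```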
The era of modern computing began with a flurry of development before and during World War II, as electronic circuits, vacuum tubes, capacitors, and relays replaced mechanical equivalents and digital calculation replaced analog calculation. The computers designed and constructed then have been called 'first generation' computers. First generation computers were usually built by hand using circuits containing relays or vacuum valves (tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium. Temporary, or working, storage was provided by acoustic delay lines (which use the propagation time of sound in a medium such as wire to store data) or by Williams tubes (which use the ability of a television picture tube to store and retrieve data). By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, and it dominated the field through the mid-1970s. An example of a practical machine of the WWII era was the Torpedo Data Computer (TDC) employed in American submarines, which allowed the operator to input a few pieces of data, such as the sub's speed and heading, and some observed variables about a target vessel; the TDC would then calculate and display the aiming point for firing torpedoes. The TDC contributed substantially to American dominance in submarine warfare in the Pacific.
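The delay-line principle can be illustrated briefly: bits are stored "in flight" and must be re-injected at the input as they emerge at the output, so a word is readable only once per circulation. The following hypothetical Python sketch models that recirculation (it is not a model of any particular machine):

```python
from collections import deque

# Hypothetical sketch of recirculating delay-line storage: bits travel
# through the "medium" (here, a queue) and are fed back into its input,
# so the store must be continuously refreshed and a given word is
# available only once per circulation, unlike random-access memory.

class DelayLine:
    def __init__(self, bits):
        self.line = deque(bits)       # the bits currently "in flight"

    def tick(self):
        bit = self.line.popleft()     # a bit emerges at the output...
        self.line.append(bit)         # ...and is re-injected at the input
        return bit

memory = DelayLine([1, 0, 1, 1, 0, 0, 1, 0])
# Reading the whole store takes one full trip through the medium.
print([memory.tick() for _ in range(8)])   # -> [1, 0, 1, 1, 0, 0, 1, 0]
```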
This era saw numerous electromechanical calculating devices of various capabilities, which had a limited impact on later designs. In 1938 Konrad Zuse started construction of the first of his Z-series machines, electromechanical calculators featuring memory and limited programmability. Zuse was (inadequately) supported by the German Wehrmacht, which used his proto-computers in the production of guided missiles. The Z-series pioneered many advances, such as the use of binary arithmetic and floating point numbers. In 1940, the Complex Number Calculator, a relay-based calculator for complex arithmetic, was completed; it was the first machine ever used remotely over a phone line. In 1938 John Vincent Atanasoff and Clifford E. Berry of Iowa State University began developing the Atanasoff-Berry Computer (ABC), a special-purpose computer for solving systems of linear equations, which used capacitors fixed on a mechanically rotating drum for memory.
During World War II, the British made significant efforts at Bletchley Park to break German military communications. The main German cypher system (the Enigma, in several variants) was attacked with the help of purpose-built 'Bombes', which helped find possible Enigma keys after other techniques had narrowed down the possibilities. The Germans also developed a series of cypher systems (called Fish cyphers by the British) which were quite different from the Enigma. As part of the attack on these cyphers, Professor Max Newman and his colleagues specified the need for an automatic machine; Colossus itself was designed and built by the engineer Tommy Flowers and his team at the Post Office Research Station.
Colossus was the first (to some extent) programmable electronic computer. Since solid-state electronics had yet to be invented, it used vacuum tubes, took its input from paper tape, and offered limited programmability. It was built and used to decrypt German wartime cyphers. Ten Colossus machines were built (in at least two variants), but details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill is said to have personally issued an order for their destruction into pieces no larger than a man's hand. Because of this secrecy, Colossus was not included in many histories of computing. There is an active project to build a copy of one of the Colossus machines.
Turing's pre-war work was a major influence on the design of the modern computer, and after the war he went on to design, build, and program some of the earliest computers at the National Physical Laboratory and at the University of Manchester. His 1936 paper, published in the Proceedings of the London Mathematical Society, included a description of what is now called the Turing machine, a purely theoretical device invented to formalize the notion of algorithm execution. Modern computers are Turing-complete (i.e., their algorithm-execution capability is equivalent to a universal Turing machine's), except for their finite memory. Turing completeness is a threshold capability separating general-purpose computers from their special-purpose predecessors. It is as good a criterion as any for defining "the first computer", but unfortunately even with this restriction there is no simple answer as to which computer was first. Babbage's Analytical Engine was the first design for a Turing-complete machine; Zuse's Z3 was the first Turing-complete working machine (though this was unknown to Zuse and was proved only in 1998, after his death); and the electronic ENIAC was the first working Turing-complete computer designed and used as such. The ABC was not programmable, though in most other respects it was a complete computer in the modern sense. George Stibitz and colleagues at Bell Labs in New York City produced several relay-based 'computers' in the late 1930s and early 1940s, but these were concerned mostly with problems of telephone system control, not computing. Their efforts were nonetheless a clear antecedent for another electromechanical American machine.
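A Turing machine consists of nothing more than a tape, a read/write head, and a finite table of state-transition rules. The minimal simulator below is an illustrative sketch; the rule table, which inverts a string of bits, is invented for this example:

```python
# Minimal Turing machine simulator (illustrative sketch; the rule table
# below is invented, not taken from any historical machine). Rules map
# (state, symbol) -> (symbol to write, head movement, next state).

def run_turing_machine(tape, rules, state="start", halt="halt"):
    cells = dict(enumerate(tape))    # sparse tape; "_" is the blank symbol
    head = 0
    while state != halt:
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return [cells[i] for i in sorted(cells)]

# Example machine: flip every bit, halting at the first blank cell.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(list("10110"), rules))
# -> ['0', '1', '0', '0', '1', '_']
```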
The Harvard Mark I (officially, the Automatic Sequence Controlled Calculator) was a general purpose electro-mechanical computer built with IBM financing and with assistance from some IBM personnel under the direction of Harvard mathematician Howard Aiken. Its design was influenced by the Analytical Engine. It used storage wheels and rotary switches in addition to electromagnetic relays, was programmable by punched paper tape, and contained several calculators working in parallel. Later models contained several paper tape readers and the machine could switch between readers based on a condition. Nevertheless, this does not quite make the machine Turing-complete. Development began in 1939 at IBM's Endicott laboratories; the Mark I was moved to Harvard University to begin operation in May 1944.
The US-built ENIAC (Electronic Numerical Integrator and Computer), the first large-scale general-purpose electronic computer, publicly validated the use of electronics for large-scale computing. This was crucial for the development of modern computing, initially because of the enormous speed advantage, but ultimately because of the potential for miniaturization. Built under the direction of John Mauchly and J. Presper Eckert, it was 1,000 times faster than its contemporaries.
Its development and construction lasted from 1941 to full operation at the end of 1945. When its design was proposed, many researchers believed that the thousands of delicate valves (i.e., vacuum tubes) would burn out often enough that the ENIAC would be so frequently down for repairs as to be useless. It was, however, capable of 5,000 simple calculations a second for hours at a time between valve failures. It was programmable, not only by rewiring as originally designed, but later also with fixed wiring executing stored programs set in the function table memory, using a scheme named after John von Neumann.
By the time the ENIAC was successfully operational, the plans for the EDVAC were already in place. Insights from experience with ENIAC led to the EDVAC design, which had unrivalled influence in the initial stage of the computer revolution. The design team was led by von Neumann.
The essentials of the EDVAC design have come to be known as the von Neumann architecture: programs are stored in the same memory 'space' as the data. Unlike the ENIAC, which processed in parallel, the EDVAC used a single processing unit. This design was simpler than the ENIAC's, was the first to be implemented in each succeeding wave of miniaturization, and increased reliability. The EDVAC design can be seen as the "Eve" from which nearly all current computers derive their architecture.
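The practical consequence of the stored-program idea is that instructions are simply numbers in memory, fetched and decoded like any other data. The toy machine below illustrates this; its two-cell instruction encoding is hypothetical, invented for this sketch:

```python
# Toy stored-program machine: instructions and data share one memory, so
# the program itself is just numbers. The encoding is invented here:
#   1 addr -> ACC = ACC + memory[addr]
#   2 addr -> memory[addr] = ACC
#   0 _    -> halt

memory = [
    1, 8,     # addresses 0-1: add memory[8] to the accumulator
    1, 9,     # addresses 2-3: add memory[9]
    2, 10,    # addresses 4-5: store the accumulator at memory[10]
    0, 0,     # addresses 6-7: halt (second cell unused)
    20, 22,   # addresses 8-9: data, held in the same memory as the code
    0,        # address 10: the result is written here
]

pc, acc = 0, 0                                    # program counter, accumulator
while True:
    opcode, operand = memory[pc], memory[pc + 1]  # fetch from memory
    pc += 2
    if opcode == 0:                               # decode and execute
        break
    elif opcode == 1:
        acc += memory[operand]
    elif opcode == 2:
        memory[operand] = acc

print(memory[10])   # -> 42
```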
The first working von Neumann machine was the Manchester "Baby", built at the University of Manchester in 1948; it was followed in 1949 by the Manchester Mark I, which functioned as a complete system using the Williams tube for memory. This university machine became the prototype for the Ferranti Mark I, the world's first commercially available computer (although some point out that LEO I was the computer used for the world's first regular routine office computer job, in November 1951). The first Ferranti Mark I was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.
Later in 1951, the UNIVAC I (Universal Automatic Computer), delivered to the U.S. Census Bureau, became the first commercial computer to attract widespread U.S. public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as the "IBM UNIVAC". Remington Rand eventually sold 46 machines at more than $1 million each. UNIVAC was the first 'mass-produced' computer; all predecessors had been 'one-off' units. It used 5,200 vacuum tubes and consumed 125 kW of power. For memory it used a mercury delay line capable of storing 1,000 words of 72 bits each. Unlike earlier machines it did not use a punched card system but took its input from metal magnetic tape.
The next major step in the history of computing was the invention of the transistor in 1947. This replaced the fragile and power-hungry valves with a much smaller and more reliable component. Transistorised computers are normally referred to as 'Second Generation' computers and dominated the late 1950s and early 1960s. Despite using transistors and printed circuits, these computers were still large and were primarily used by universities, governments, and large corporations. For comparison, the vacuum-tube IBM 650 of 1954 weighed over 900 kg, its attached power supply weighed around 1,350 kg, and both were held in separate cabinets of roughly 1.5 by 0.9 by 1.8 metres. It cost $500,000, or could be leased for $3,500 a month.
In 1951, Maurice Wilkes invented microprogramming, now almost universally used in the implementation of CPU designs: the CPU's instruction set is defined not by fixed wiring but by microcode, sequences of simple micro-operations held in a fast control memory.
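The idea can be sketched as follows (hypothetical Python; the instruction and micro-operation names are invented for illustration): each visible machine instruction is defined by a short sequence of micro-operations held in a control store, so rewriting the microcode redefines the instruction set without rewiring anything.

```python
# Hypothetical sketch of microprogramming: every machine instruction is
# defined by a microprogram in a control store rather than by fixed
# wiring. All names here are invented for the example.

regs = {"A": 0, "B": 0}

micro_ops = {                                     # the primitive operations
    "load_imm_A":  lambda arg: regs.__setitem__("A", arg),
    "copy_A_to_B": lambda arg: regs.__setitem__("B", regs["A"]),
    "add_B_to_A":  lambda arg: regs.__setitem__("A", regs["A"] + regs["B"]),
}

# The control store: the visible instruction set is *defined* by these
# micro-op sequences; editing them changes the machine's instructions.
control_store = {
    "LOAD":   ["load_imm_A"],
    "DOUBLE": ["copy_A_to_B", "add_B_to_A"],
}

def execute(instruction, arg=None):
    for op in control_store[instruction]:   # step through the microprogram
        micro_ops[op](arg)

execute("LOAD", 21)
execute("DOUBLE")
print(regs["A"])   # -> 42
```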
In 1956, IBM sold its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control). It used fifty 24-inch metal disks, with 100 tracks per side. It could store 5 megabytes of data, at a cost of $10,000 per megabyte.
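Those figures are mutually consistent, as a quick check shows (assuming, for illustration, that the capacity was spread evenly across every track):

```python
# Quick consistency check of the RAMAC figures quoted above (assuming,
# for illustration, the 5 MB were spread evenly over all tracks).
disks, sides_per_disk, tracks_per_side = 50, 2, 100
total_tracks = disks * sides_per_disk * tracks_per_side   # 10,000 tracks
print(5_000_000 / total_tracks)   # -> 500.0 bytes per track
```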
The first high-level general purpose programming language, FORTRAN, was also being developed at IBM around this time.
In 1959 IBM shipped the transistor-based IBM 1401 mainframe, which used punched cards. It proved a popular general-purpose computer: some 12,000 were shipped, making it the most widely produced computer of its time. It used a magnetic core memory of 4,000 characters (later expanded to 16,000 characters). Many aspects of its design were based on the desire to replace the punched card machines that had been in wide use from the 1920s through the early 1970s.
In 1960 IBM shipped the transistor-based IBM 1620 mainframe, which also used punch cards. It proved a popular scientific computer and about 2,000 were shipped. It used a magnetic core memory of up to 60,000 decimal digits.
In 1964 IBM announced the S/360 series, which was the first family of computers that could run the same software at different combinations of speed, capacity and price. It also pioneered the commercial use of microprograms, and an extended instruction set designed for processing many types of data, not just arithmetic. In addition, it unified IBM's product line, which prior to that time had included both a "commercial" product line and a separate "scientific" line. The software provided with System/360 also included major advances, including commercially available multi-programming, new programming languages, and independence of programs from input/output devices. Over 14,000 System/360 systems were shipped by 1968.
In 1965, DEC launched the PDP-8, a much smaller machine intended for use by technical staff in laboratories and for research.
The explosion in the use of computers began with 'Third Generation' computers, which relied on the integrated circuit (or microchip), invented independently by Jack St. Clair Kilby and Robert Noyce. The integrated circuit later made the invention of the microprocessor possible.
Due to the length of this article, third and subsequent generations are discussed in History of computing II.