The word computer used to mean a person who computes. In current language, a computer is any device used to process information according to a well-defined procedure. The word was originally used to describe people employed to do arithmetic calculations, with or without mechanical aids, but was transferred to the machines themselves. Originally, the information processing was almost exclusively related to arithmetical problems, but modern computers are used for many tasks unrelated to mathematics. Within such a definition sit mechanical devices such as the slide rule, the gamut of mechanical calculators from the abacus onwards, as well as all contemporary electronic computers. The first program-controlled computers were Konrad Zuse's Z1 (1936) and Z3 (1941).
Von Neumann's architecture describes a computer with four main sections: the Arithmetic and Logic Unit (ALU), the control circuitry, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by a bundle of wires, a "bus."
In general, memory can be rewritten millions of times - it is a scratchpad rather than a stone tablet.
The size of each cell, and the number of cells, varies greatly from computer to computer, and the technologies used to implement memory have varied greatly - from electromechanical relays, to mercury-filled tubes (and later springs) in which acoustic pulses were formed, to matrices of permanent magnets, to individual transistors, to integrated circuits with millions of capacitors on a single chip.
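The model of memory as a row of numbered cells that can be read and rewritten at will can be sketched in a few lines of Python (purely illustrative; real memory cells hold fixed-size binary values):

```python
# Model memory as a fixed number of numbered cells, each holding a small integer.
memory = [0] * 16      # sixteen cells, all initially zero

memory[3] = 42         # write the value 42 into cell number 3
memory[3] = 7          # overwrite it: a scratchpad, not a stone tablet

print(memory[3])       # read cell number 3 back -> 7
```

The essential points are that every cell is identified by its number (its address) and that writing a new value simply replaces the old one.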
The control unit keeps track of which slot contains the instruction the computer is currently performing. It tells the ALU what operation to perform, retrieves from memory the information the ALU needs, and transfers the result back to the appropriate memory location. Once that occurs, the control unit goes on to the next instruction (typically located in the next slot, unless the instruction is a jump instruction informing the computer that the next instruction is located elsewhere).
Instructions are represented within the computer as numbers - the code for "copy" might be 001, for example. The particular instruction set that a specific computer supports is known as that computer's machine language. In practice, people do not normally write the instructions for computers directly in machine language but rather use a "high-level" programming language, which is then translated into the machine language automatically by special computer programs (interpreters and compilers). Some programming languages map very closely to the machine language, such as assembly language (low-level languages); at the other end, languages like Prolog are based on abstract principles far removed from the details of the machine's actual operation (high-level languages).
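The two ideas above - instructions encoded as numbers, and a control unit stepping through them slot by slot - can be combined into a toy machine. The opcodes below are invented for illustration; real machine languages differ from computer to computer:

```python
# A toy machine whose instructions are numbers, as in a real machine language.
# Hypothetical opcodes:
#   1 = LOAD  addr  -> copy memory[addr] into the accumulator
#   2 = ADD   addr  -> add memory[addr] to the accumulator
#   3 = STORE addr  -> copy the accumulator into memory[addr]
#   4 = JUMP  addr  -> continue with the instruction in slot addr
#   0 = HALT        -> stop

def run(program, memory):
    acc = 0      # the ALU's working value (accumulator)
    pc = 0       # which slot holds the current instruction
    while True:
        opcode = program[pc]
        if opcode == 0:            # HALT
            break
        addr = program[pc + 1]     # the operand sits in the next slot
        if opcode == 1:            # LOAD
            acc = memory[addr]
        elif opcode == 2:          # ADD
            acc += memory[addr]
        elif opcode == 3:          # STORE
            memory[addr] = acc
        elif opcode == 4:          # JUMP: next instruction is elsewhere
            pc = addr
            continue
        pc += 2                    # otherwise, go on to the next slot

# Program: load memory[0], add memory[1], store the result in memory[2], halt.
mem = [5, 7, 0]
run([1, 0, 2, 1, 3, 2, 0], mem)
print(mem[2])   # -> 12
```

A compiler for a high-level language does, in essence, what a programmer of this toy machine must do by hand: translate an intention like `c = a + b` into the right sequence of numeric instructions.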
Some larger computers differ from the above model in one major respect - they have multiple CPUs and control units working simultaneously. Additionally, a few computers, used mainly for research purposes and scientific computing, have differed significantly from the above model, but they have found little commercial application.
The functioning of a computer is therefore in principle quite straightforward. The computer fetches instructions and data from its memory. The instructions are executed, the results are stored, and the next instruction is fetched. This procedure repeats until the computer is turned off.
Nowadays, most computers appear to execute several programs at the same time. This is usually referred to as multitasking. In reality, the CPU executes instructions from one program, then after a short period of time, it switches to a second program and executes some of its instructions. This small interval of time is often referred to as a time slice. This creates the illusion of multiple programs being executed simultaneously by sharing the CPU's time between the programs. This is similar to how a movie is simply a rapid succession of still frames. The operating system is the program that usually controls this time sharing.
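A minimal sketch of this time sharing, assuming a simple round-robin policy (one of several scheduling strategies an operating system might use; the program names and "steps" here are invented for illustration):

```python
# Round-robin time slicing: run a short slice of one program, then switch
# to the next, creating the illusion that all programs run simultaneously.
from collections import deque

def schedule(programs, slice_size=2):
    trace = []                              # record of the execution order
    queue = deque(programs.items())
    while queue:
        name, steps = queue.popleft()
        for step in steps[:slice_size]:     # run one time slice
            trace.append(f"{name}:{step}")
        rest = steps[slice_size:]
        if rest:                            # not finished yet:
            queue.append((name, rest))      # go to the back of the queue
    return trace

trace = schedule({"editor": ["a", "b", "c"], "player": ["x", "y"]})
print(trace)   # steps of the two "programs" come out interleaved
```

Because the slices are short, a human observer sees the interleaved steps as simultaneous activity, just as the still frames of a movie blur into motion.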
The operating system, for example, decides which programs get to run, and when, and what resources (such as memory or I/O) they get to use. The operating system also provides services to other programs, such as code ("drivers") which allow programmers to write programs for a machine without needing to know the intimate details of all attached electronic devices.
Widely used programs are now often included with the operating system simply because this is an economical way to distribute them. It is now commonplace for operating systems to include web browsers, text editors, e-mail programs, network interfaces, movie players and other programs that were once quite exotic special-order programs.
People in governments and large corporations also used computers to automate many of the data collection and processing tasks previously performed by humans - for example, maintaining and updating accounts and inventories. In academia, scientists of all sorts began to use computers for their own analyses. Continual reductions in the costs of computers saw them adopted by ever-smaller organizations. Businesses, organizations, and governments often employ a large number of small computers to accomplish tasks that were previously done by an expensive, large mainframe computer. A collection of such smaller computers in one location is referred to as a server farm.
With the invention of the microprocessor in the 1970s, it became possible to produce very inexpensive computers. Personal computers became popular for many tasks, including keeping books, writing and printing documents, calculating forecasts and other repetitive mathematics with spreadsheets, communicating with e-mail, and using the Internet. However, computers' wide availability and easy customization has seen them used for many other purposes.
At the same time, small computers, usually with fixed programming, began to find their way into other devices such as home appliances, automobiles, aeroplanes, and industrial equipment. These embedded processors controlled the behaviour of such devices more easily, allowing more complex control behaviours (for instance, the development of anti-lock brakes in cars). By the start of the twenty-first century, most electrical devices, most forms of powered transport, and most factory production lines are controlled by computers. Most engineers predict that this trend will continue.
For instance, "computer" was once commonly used to mean a person employed to do arithmetic calculations, with or without mechanical aids. According to the Barnhart Concise Dictionary of Etymology, the word came into use in English in 1646 as a word for a "person who computes" and then by 1897 also for a mechanical calculating machine. During World War II it referred, for example, to U.S. and British servicewomen whose job it was to calculate the trajectories of large artillery shells with such machines.
Charles Babbage designed one of the first computing machines, called the Analytical Engine, but due to technological problems it was not built in his lifetime. Various simple mechanical devices, such as the slide rule, have also been called computers. In some cases they were referred to as "analog computers", as they represented numbers by continuous physical quantities rather than by discrete binary digits. What are now called simply "computers" were once commonly called "digital computers" to distinguish them from these other devices (which are still used in the field of analog signal processing, for example).
In thinking of other words for the computer, it is worth noting that in other languages the word chosen does not always have the same literal meaning as the English word. In French, for example, the word is "ordinateur", which means approximately "organizer", or "sorting machine". The Spanish word is "ordenador", with the same meaning, although in some countries the anglicism computadora is used. In Italian, a computer is a "calcolatore", a calculator, emphasizing its computational uses over logical ones like sorting. In Swedish, a computer is called "dator", from "data"; at least in the 1950s, computers were called "matematikmaskin" ("mathematics machine"). In Chinese, a computer is called "diannao", literally "electric brain". In English, other words and phrases have been used, such as "data processing machine".