People have long recognized the competitive advantages to be gained from more efficient data storage and greater computational ability. From counting on the fingers, to making marks on the walls of caves, to the invention of picture numbers, to the modern cheque or banknote, there has been a steady progression away from directly manipulating the objects computations describe and toward the use of abstractions to represent the originals. Mechanical devices have played an important part in this progression. More than one culture has come up with the idea of placing beads on a string (the abacus), and in some places this is still the preferred calculating device after several thousand years. A skilled operator can total the cost of a large number of purchases on an abacus faster than most people can enter them into a calculator.
Some who have studied the ancient British monument known as Stonehenge have come to the conclusion that it was an enormous calculating device for making astronomical predictions. Other monuments left by the Babylonians, South and Central American Indians, and South Sea Islanders may have had similar purposes. The Scottish mathematician John Napier (1550-1617) devised Napier's bones and published tables of logarithms intended to simplify tedious arithmetic computations. These led directly to the wooden or bamboo slide rule, known and loved by many student generations prior to the development of inexpensive electronic calculators.
To the French mathematician and theologian Blaise Pascal (1623-1662) goes the honour of inventing the first mechanical adding machine (1642). It was based on a system of gears similar to those in a modern automobile odometer and was used for computing taxes. However, parts for the device could not be manufactured with sufficient precision to make it practical, and it never became widely used. About thirty years later, the famous German mathematician and co-inventor (with Newton) of calculus, Gottfried Wilhelm von Leibniz (1646-1716), made a similar but more reliable machine that could not only add and subtract but also multiply, divide, and calculate square roots. Many people improved calculating machines over the next century, and by 1900 they had an important place in government and commerce. But as late as the mid-1960s, electromechanical versions of these calculators could do only basic four-function arithmetic, weighed thirty pounds, and took up half a desktop.
Meanwhile, another idea important to the modern computer was emerging--that of the stored program, or instruction sequence. This idea arose in connection with the development of automatic looms by the French inventor Joseph Marie Jacquard (1752-1834). First shown at the 1801 Paris Exhibition, these looms used a sequence of punched cards to control the weaving process. The machine, with variations, is still used today, though it is now controlled by punched paper cards or tapes, or by direct connection to a microcomputer.
The first computer--a machine combining computational ability with stored programs--was designed by the British mathematician Charles Babbage (1792-1871). He worked on his "Difference Engine" for about eleven years before abandoning the project. Later, he designed a much more ambitious "Analytical Engine" that was intended to be an algebraic analogue of Jacquard's loom. Although Babbage even had a programmer for the engine (Lord Byron's daughter, Ada Augusta, the Countess of Lovelace), this machine was never constructed in his lifetime. Its concepts were not realized until 1944 when the Mark I computer was developed in the United States.
By this time, the punched paper medium had become standardized through the work of Herman Hollerith. He devised a card data storage and sorting system for the U.S. Census Bureau, which was first employed in the 1890 census. Hollerith left the bureau six years later to form his own company, the Tabulating Machine Company, which through later mergers became part of the firm renamed International Business Machines in 1924.
Meanwhile, vacuum-tube technology had developed to the point where an electronic computer could be manufactured. The first of these were the British code-breaking devices Colossus Mark I and Colossus Mark II built in 1943 and 1944 for the British intelligence service at Bletchley Park. The latter attained speeds not matched by other computers for a decade. When the war was over, these machines were dismantled and their parts sold as surplus.
At about the same time, the work of a number of researchers in the United States came to fruition in the construction of the Electronic Numerical Integrator and Computer (ENIAC) by J. P. Eckert and J. W. Mauchly at the University of Pennsylvania. This machine, which contained over 18,000 vacuum tubes, filled a room six meters by twelve meters and was used principally by military ordnance engineers to compute shell trajectories. In subsequent years, many similar computers were developed in various research facilities in the United States and Britain. Such devices, which generally were limited to basic arithmetic, required a large staff to operate, occupied vast areas of floor space, and consumed enormous quantities of electricity.
Eckert and Mauchly were also responsible for the first commercial computer, the Universal Automatic Computer (UNIVAC), which they manufactured after leaving the university. Their company was eventually incorporated into Sperry (now merged with Burroughs to become UNISYS), which still manufactures large industrial computers. Today, those early vacuum-tube monsters are referred to as "first-generation computers," and the machines that are their successors are called "mainframes."
The transistor, developed at Bell Labs in late 1947 and improved during the early 1950s, replaced the vacuum tube, reducing both electrical consumption and heat production. This led to the miniaturization of many electronic devices, and the size of typical computers shrank considerably, even as their power increased. Transistorized machines built between 1959 and 1965 formed the second generation of computers.
Prices were still in the hundreds of thousands to millions of dollars, however, and such machines were at first generally seen only in the headquarters of large research and government organizations. Even by the mid-1960s, not all universities had a computer, and those that did often regarded them as exclusive toys for mathematicians and research scientists. There were occasional courses at the fourth-year level, but freshman introductions to computer science had not yet become popular.
The invention of the integrated circuit dramatically changed the computing world. The first result was another, even more significant size reduction, for what once took up several floors of a large building now occupied a small box. The first of these third-generation computers was the IBM System/360, which was introduced in 1964 and quickly became popular among large businesses and universities. The same size reduction also resulted in the first "pocket" calculators, which appeared on the market in the early 1970s. Even at the initial price of several hundred dollars, these put into the hands of the average person more computing power than the first UNIVAC had possessed. New models proliferated so rapidly, and so many new features were incorporated into the pocket calculator, that one company decided to have a chip designed that could be programmed with new functions, so as to cut down the time needed to bring a new model to market.
The chip, called the 4004, gave way to the 8008, and then to the 8080 and 8080A. The latter became the backbone of the new small-computer industry, as numerous companies developed kits and fully assembled computers around it. In its later incarnations by Zilog as the Z-80, and in other descendants such as the 8085, 8088, 8086, and now the 80186, 80286, 80386, 80486, Pentium, and P6, this invention lives on in millions of microcomputers. Not long after the 8080 became a commercial reality, Motorola developed the 6800 chip, which was cheaper than the 8080 and, for programmers, somewhat easier to work with. It, too, became popular for a time, but soon gave way to other designs.
At about the same time the Z-80 was developed, the 6501 and 6502 chips were derived from the 6800 as low-cost industrial process controllers. In 1976, the 6502 was also used to build a small computer, this one entirely contained on a single board. It was called the Apple, and Apple Computer went on to sell millions of the Apple ][ and its descendants, the ][+, //e, //c and //GS, surpassing all other manufacturers of small computers in the process and becoming a leading source of important advances in small-computer technology for two decades.
In 1977, Radio Shack joined the competition with its Z-80 based machines. In Europe, the equivalent popularizing role was played by Commodore (a Canadian company) and by Sinclair (a British firm). A few years later, IBM came into this market with the 8088-based PC. The mere presence of the giant changed the whole market for a time, with most other manufacturers seeking to make machines compatible with those of IBM. Eventually some of these "clone" makers, such as Compaq, became a larger presence in the market than IBM itself.

By the late 1990s, the machines generating the most attention could store more data and manipulate larger numbers than anything previously seen in the microcomputer market. They were also capable of handling the processing requirements of the graphics user interface (GUI) first realized commercially in the Xerox Star, the Apple Lisa and Macintosh, then in Commodore's Amiga and Atari's machines, and now employed by most computer users. Integration of circuits had by then reached the point where millions of components were being crammed into a single chip. Between 1987 and 1991, major new commitments were made by Apple with the Motorola 68030 and 68040-based Macintosh models and by IBM with its OS/2 machines. With the latter, IBM also followed Apple's lead into graphics-oriented software, helping to ensure continuing acceptance of this style of interface in the marketplace. Graphics user interfaces were also adopted by the makers of scientific workstations, such as those made by Sun Microsystems, and were being attached to other machines running the UNIX operating system.
In the early 1990s, Microsoft, already the dominant supplier of operating systems for Intel 80x86 chips and of applications for both these and Macintosh platforms, began to market a GUI called Windows that was a rough copy of the Macintosh Operating System. The courts ruled, however, that it was not a close enough imitation to fall under copyright law, and Windows (in various flavours) gradually became dominant on Intel-based machines (sometimes now called "Wintel" systems).
By 1995, Apple had formed partnerships with Motorola and IBM to develop new microprocessor technology and was already marketing machines based on the new PowerPC RISC chip, while IBM was porting its operating systems to the new chip as well. The two were readying new operating systems and preparing specifications for a common hardware platform on which to run them. Apple had licensed its operating system, and the first Macintosh clones were appearing on the market--some from very well-known consumer companies such as Motorola. Microcomputers had become powerful enough that the minicomputer category had been all but crowded out of the market on price/performance considerations.
By 2002 Microsoft had moved through Windows 95, 98, and NT to Windows 2000 and ME. The world had also seen the demise of OS/2, and the migration of the Mac OS to a new UNIX-based system (NeXTstep, later rebuilt and renamed OS X) developed at NeXT, the company founded by Steve Jobs--the once-ousted co-founder of Apple. At the same time, Apple had transitioned to the RISC-based PowerPC G4 chip and was offering machines whose raw processing power would once have placed them in the supercomputer category. Meanwhile, in its lower-priced line, Apple had made computers into fashion statements, an innovation others were quick to copy.
While much of the marketing activity and most headlines focused on the microcomputer segment of the industry, larger machines had also undergone startling changes. Fourth-generation supercomputers could be used in situations where the complexity of the calculations or the quantity of data was too great for ordinary mainframe devices. These machines are used by governments, by the military, and in academic research institutions. Still newer generations of computers are on drawing boards in the United States and Japan, and many of the new developments will undoubtedly filter down to consumer-oriented devices in the future. At the same time, however, desktop computers, with their ever-faster chips and larger memories, were encroaching on application domains once thought to belong only to supercomputers.
At the opposite end of the scale, pocket-sized computing devices had also become important. These ranged from DOS- or Windows-based miniaturized versions of their desktop siblings to specialized personal time and communications organizers (Personal Digital Assistants or PDAs). Also called Personal Intelligence Enhancement Appliances (PIEAs), these devices boasted handwriting recognition, wireless communications abilities, and sophisticated time-management functions. Apple's Newton was a key player and innovator in this market, but 3Com's PalmPilot eventually took it over.
For the near future, however, microprocessor-based computing devices will have sufficient power to suit the majority of individual, academic, and business uses. They are inexpensive, easy to link (network) for sharing resources such as storage devices and printers, and they run languages and other programs similar to those found on mainframe computers. Much development work (particularly in programming and publishing) is being done with microcomputers in view, and it is safe to predict that descendants of these machines are the ones most people will be referring to when they speak of computers in the future.
Larger machines will also continue to grow and change, as will the organizations that depend on them. Moreover, computers of the future will be as different from those of today as these are from the machines of the late 1940s. They will be smaller (down to pocket size) and faster, and will have greater storage capacity. They will be integrated with video and communications technology to give immediate access to worldwide databases. They will undoubtedly become easy to use, and at some point the need to offer university-level courses in their operation will cease, for they will have become common technical appliances.
So broad and diverse have the applications of electronic processors become that "computer" seems a misnomer, for the machines in which such devices are embedded spend little time calculating and much more finding, organizing, preparing, and communicating data. In this respect, the Internet, especially the portion known as the World Wide Web (WWW), has become a kind of prototype for the universal distributed library of the future, and most organizations have connections, for e-mail if for nothing else.
Computers have already profoundly changed many of society's institutions (business, banking, education, libraries). They will have even greater effects on institutions in the future. They have also raised or caused new ethical issues, and these will need to be addressed in the interests of social stability. In addition, developments in computing have affected or given rise to other new products and methods in a variety of fields, further demonstrating the interdependence of ideas, society, and technology.
There are microprocessors in stereos, televisions, automobiles, toys, and games. The entertainment and telecommunications industries are heavily dependent on new electronic technologies. Computers themselves are directly attached to research instruments that gather and interpret data in basic physics, chemistry, and biology experiments. The resulting changes and advances in scientific research have in turn had profound effects on society and its institutions. They have raised new social and ethical questions whose very asking could not have been anticipated in the industrial age. These include issues relating to software copyright, data integrity, genetic engineering, artificial intelligence, the displacement of human workers by robots, how to live in and manage an information-based society, and how to repair damage wrought in the industrial age.
Technical trends and possible social and ethical consequences will be examined and extrapolated in more detail in later sections of the book. It is at least possible to conclude at this point that the advent of the fourth civilization (aka "the information age") owes more to the modern computer than to any other single invention of the late industrial period.