The Computer: Despite Our Giddy Fascination with This Ubiquitous Machine, We Still Underestimate Its Ability to Change Our Lives


As the century comes to a close, the technology that obsesses us, captivates us, infuriates us and dominates us is the computer. But ultimately, this most amazing of inventions won't be seen as an artifact of the old millennium but as the defining force of the one just dawning. Do you really think that we're already into the computer age? That's a gross underestimation of what the computer will eventually do to change our world, our lives and perhaps the nature of reality itself.

Underestimation, as it turns out, has been a constant in the brief but dazzling history of this amazing machine. Surprisingly, the tale begins in the 19th century, when Charles Babbage, an English mathematician born in 1791, launched a lifelong quest to build information-processing machines--first a calculator called the Difference Engine and then a more elaborate programmable device dubbed the Analytical Engine. He lacked--among other things--electricity, transistors, keyboards and Bill Gates. Yet in the 1830s he came astonishingly close to producing something very much like the computers that would be celebrated decades after he died. Unfortunately, his skill at innovation was not matched by an ability to generate venture capital, and his plans were tossed into the unforgiving core dump of history.

The idea of a programmable machine that performed humanity's mental labors reappeared in the 1930s. The breakthrough came at the hands of another eccentric English mathematician, Alan Turing, who outlined how it was possible to build something that could perform virtually any mathematical task one could describe. His proof involved an ingenious imaginary device that would come to be known as the Universal Turing Machine--essentially, a machine that could duplicate the work of any other machine, even if the "machine" were a human calculator. Turing knew what the rest of us are still trying to wrap our minds around: such a contraption, a computer, can do anything. It's an invention that breeds invention itself.

But it took a war to bring about the physical devices that would be known as the first real computers. (A small but noisy controversy among computer historians involves whether a device constructed in 1939 by John Atanasoff and his graduate student Clifford Berry at Iowa State University deserves the true mantle of First Electronic Computer.) In England, Turing himself worked on machines that helped crack the secret codes used by the Germans. In Germany, a wizard named Konrad Zuse was working on that country's computing effort but never fully realized his ideas. And in America, a Hungarian genius named John von Neumann--perhaps the premier mathematician of this century--was pondering mechanical devices to help perform the calculations required for the Manhattan Project. A chance meeting on a train platform in 1944 led him to a team of scientists at the University of Pennsylvania who were creating ENIAC (Electronic Numerical Integrator and Computer), which many people consider the true Adam of computers. Designed by J. Presper Eckert and John Mauchly to help crunch numbers for artillery-target estimates, the machine used 18,000 vacuum tubes and cost $400,000.

Von Neumann was fascinated, and he worked with the ENIAC team to take computing to the next level: EDVAC, essentially a blueprint for the machines that followed, with memory, stored programs and a central processor for number crunching. This scheme was sufficiently versatile to launch computers into the commercial realm. But even then, underestimation was as thick as in Babbage's day. Thomas Watson Sr., the head of IBM--the company that was perhaps most prescient of all in embracing the idea--thought it unimaginable that there would ever be a worldwide need for the machine. "I think there is a world market," said Watson, "for maybe five computers. …