History of Computing

Sam Lightstone

Distinguished Engineer at IBM
BIO: Sam Lightstone is a Distinguished Engineer for relational Cloud Data repositories as well as co-founder of IBM's technology incubation initiative. He is the author of several books, papers, and patents on computer science, careers, and data engineering.

The growth of computer processing power

In 1965 Gordon E. Moore predicted that the number of transistors the industry would be able to place on a computer chip would double every year. In 1968 Moore co-founded Intel, serving initially as Executive Vice President. In 1975, he updated his prediction about transistor density to doubling once every two years. This prediction, known as “Moore’s Law”, became one of the best known and most predictable trends in the computing industry. Though the growth of transistor density will eventually drop off, the growth of computing power every two years may continue for decades to come through other technical advancements.

Examining the growth of computational power per unit cost suggests the growth pattern has a reasonable chance of extending beyond the limits of VLSI transistor density. Electronic computing predates the development of the integrated circuit, and mechanical forms of computing go back to the early 1900s. When computational growth per unit cost is graphed over time (see figure below), the startling observation is that the growth trend does not appear to be specific to the VLSI era. This suggests that the growth in computing power may well continue far beyond the time when we have reached the maximum density of transistors in VLSI that the laws of physics allow. While we can’t predict how this will be done, nor can we state with certainty that it will happen, the historical trend suggests it is distinctly possible.

[Figure: growth of computational power per unit cost over time]

If the growth in computational power continues, we can expect a $1000 home computer to have the processing power (computations per second) of:

  • a house fly by the year 2000,
  • a rodent by the year 2005,
  • a human being by 2020,
  • all human minds on the planet by 2060.
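
As a rough illustration of the arithmetic behind these projections, here is a minimal sketch, in Python, that simply compounds a doubling of computations per second per $1000 every two years. The 1990 baseline is an illustrative assumption rather than a figure from this article, and the sketch makes no claim about how much computation a fly, rodent, or human brain actually performs.

    # A minimal sketch of Moore's-Law-style extrapolation: compute per $1000
    # doubling every two years. The 1990 baseline below is an illustrative
    # assumption, not a measured figure.

    BASELINE_YEAR = 1990
    BASELINE_OPS_PER_SEC = 1e7          # assumed compute of a $1000 machine in 1990
    DOUBLING_PERIOD_YEARS = 2.0

    def projected_ops_per_sec(year: int) -> float:
        """Project computations per second for a $1000 computer in a given year."""
        doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
        return BASELINE_OPS_PER_SEC * 2 ** doublings

    if __name__ == "__main__":
        for year in (2000, 2005, 2020, 2060):
            print(f"{year}: ~{projected_ops_per_sec(year):.2e} ops/sec per $1000")

Even under these modest assumptions the projection spans roughly ten orders of magnitude between 1990 and 2060, which is the whole force of the extrapolation above.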

Fortunately for humanity, processing speed is not identical to intelligence, though some popular proponents of Artificial Intelligence (AI) suggest the only difference is a small matter of programming and sensory perception. For a somewhat fluffy but entertaining read on this topic, try Ray Kurzweil’s “The Age of Spiritual Machines”.

A short history of computer technology evolution

1945 The ENIAC computer is developed as part of the WWII war effort to help calculate ballistic firing trajectories for artillery. Size: filled an entire room, 16,200 cubic feet. Weight: thirty tons. Power: two hundred kilowatts, costing roughly $650/hr in electricity even when running idle. Circuitry: 19,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. Input/output: an IBM card reader for input and a card punch for output.
1947 John Bardeen, Walter H. Brattain, and William B. Shockley of Bell Telephone Laboratories invent the transistor, arguably the most important invention of the past 2000 years. The transistor exploits semiconductor solid-state physics to produce an electronic device with no moving mechanical parts that acts as a switch: “ON” when voltage is applied and “OFF” when it is not. The ability to manufacture high-performance electronic “switches” enabled the development of electronic logic gates, electronic memory, and electronic clocking, the three components that make modern computing possible. (A minimal sketch of how such switches compose into logic gates appears after this timeline.)
1951 UNIVAC becomes the first commercial computer.
1952 Admiral Grace Hopper develops the first computer compiler, leading to the creation of user-friendly languages and opening the door to a larger universe of computer applications and users.
1954 Gene Amdahl develops the first computer operating system for the IBM 704.
1955 Reynold Johnson develops the first disk drive.
1958 Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently develop the integrated circuit (IC).
1958 Seymour Cray of Control Data Corp. develops the first transistorized computer.
1961 Silicon chips first appear, making it possible to manufacture many transistors on a single small chip and greatly reducing the size and power requirements of computers.
1963 Douglas Engelbart of SRI conceives the idea of the computer mouse.
1964 IBM releases its System/360 computer, which will result in $100 billion in sales over its life cycle.
1965 Gordon Moore, then at Fairchild Semiconductor and later a co-founder of Intel, suggests that integrated circuits will double in complexity (number of transistors) every year, a period he later revises to every two years.
1971 Intel introduces its popular 4004 4-bit microprocessor, starting the evolution of Intel’s famous line of 386, 486, and Pentium processors.
1973 Xerox develops first Ethernet Local Area Network (LAN) technology.
1975 Bill Gates and Paul Allen found Microsoft.
1976 Steve Jobs and Steve Wozniak form the Apple Computer Company.
1976 Alan Shugart, formerly of IBM, develops the 5.25-inch floppy disk at Shugart Associates.
1980 Philip Estridge, IBM, develops the first hard drive for PCs. It holds 10MB.
1981 IBM introduces the PC.
1984 Apple Computer introduces the Macintosh personal computer on January 24.
1988 RAID levels 1 through 5 are formally defined by David A. Patterson, Garth A. Gibson, and Randy H. Katz in the paper “A Case for Redundant Arrays of Inexpensive Disks (RAID)”, a quantum leap forward in disk storage reliability. (A toy parity example appears after this timeline.)
1991 Tim Berners-Lee develops the World-Wide Web (WWW), which is released by CERN.
1993 Marc Andreessen and Eric Bina at NCSA create Mosaic, the first widely used graphical Web browser.
1994 Netscape Navigator 1.0 is released in December. Because it is given away free, it quickly gains roughly 75% of the worldwide browser market.
1996 Microsoft releases its own Web browser, Internet Explorer 3.0.
2002 64-bit addressability becomes common, making memory addressability a non-issue for most practical programming purposes.
2004 Google publishes the first paper on MapReduce, opening the era of Big Data processing.
2005 Intel introduces its first dual-core CPU, a microprocessor with two processor cores on a single chip, and declares that multi-core chips are the wave of the future.
2005 Doug Cutting and Mike Cafarella create Hadoop.
2006 In November, Intel announces the first x86 quad-core CPU.
2006 Graphics Processing Units (GPUs), originally created for gaming, start extending into several additional domains, offering extreme parallel processing at lower electrical power consumption: machine learning, oil exploration, scientific image processing, linear algebra, statistics, 3D reconstruction, financial modeling, and data analytics.
2011 IBM’s David Ferrucci leads the development of Watson, an artificially intelligent computer system capable of answering questions posed in natural language. Watson competes on Jeopardy! against former winners Brad Rutter and Ken Jennings, winning the game and the $1 million prize.
2013 Cloud computing goes mainstream. 75% of IT companies are using or planning cloud deployments.
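
To illustrate the 1947 transistor entry above, here is a minimal sketch, in Python, of how ideal on/off switches compose into logic gates and then into a one-bit adder. Modeling a transistor as a perfect switch is a deliberate simplification, and the function names are illustrative rather than drawn from any source cited here.

    # A minimal sketch: treat a transistor as an ideal on/off switch, build a
    # NAND gate from switches, then derive every other gate from NAND and
    # compose them into a one-bit half adder.

    def nand(a: bool, b: bool) -> bool:
        """Two idealized switches in series: output is low only if both are on."""
        return not (a and b)

    def not_(a: bool) -> bool:
        return nand(a, a)

    def and_(a: bool, b: bool) -> bool:
        return not_(nand(a, b))

    def or_(a: bool, b: bool) -> bool:
        return nand(not_(a), not_(b))

    def xor(a: bool, b: bool) -> bool:
        return and_(or_(a, b), nand(a, b))

    def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
        """Add two bits, returning (sum, carry), the seed of all binary arithmetic."""
        return xor(a, b), and_(a, b)

    if __name__ == "__main__":
        for a in (False, True):
            for b in (False, True):
                s, c = half_adder(a, b)
                print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")

Running the sketch prints the binary addition table, which is the point of the 1947 entry: once a reliable electronic switch can be mass-produced, gates, adders, memory, and clocking all follow.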
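
As a companion to the 1988 RAID entry, the following toy example shows the XOR-parity idea behind RAID level 5 in its simplest form: the parity block is the XOR of the data blocks, so any single lost block can be rebuilt from the survivors. This is a sketch of the principle only, not the algorithms from the Patterson, Gibson, and Katz paper or any real array controller, which also handle striping, parity rotation, and failure detection.

    from functools import reduce

    # Toy RAID-5-style parity: the parity block is the byte-wise XOR of all
    # data blocks, so any single missing block equals the XOR of the rest.

    def parity(blocks: list[bytes]) -> bytes:
        """Byte-wise XOR across equal-length blocks."""
        return bytes(reduce(lambda x, y: x ^ y, column) for column in zip(*blocks))

    def rebuild(surviving: list[bytes], parity_block: bytes) -> bytes:
        """Recover the single lost data block from the survivors plus parity."""
        return parity(surviving + [parity_block])

    if __name__ == "__main__":
        data = [b"disk-one", b"disk-two", b"disk-3.."]   # equal-length blocks
        p = parity(data)
        lost = data.pop(1)                               # simulate one disk failure
        recovered = rebuild(data, p)
        assert recovered == lost
        print("recovered block:", recovered)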