History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.

History of computers: Apple I computer 1976

  • 2000-present day

Additional resources

The history of computers goes back over 200 years. At first theorized by mathematicians and entrepreneurs, during the 19th century mechanical calculating machines were designed and built to solve the increasingly complex number-crunching challenges. The advancement of technology enabled ever more-complex computers by the early 20th century, and computers became larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine" fails due to the lack of technology at the time, according to the University of Minnesota . 

1848: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called "notes," turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society . "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, " Georg Scheutz and the First Printing Calculator " (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine,  saves the government several years of calculations, and the U.S. taxpayer approximately $5 million, according to Columbia University  Hollerith later establishes a company that will eventually become International Business Machines Corporation ( IBM ).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information on its main memory, and is capable of performing one operation every 15 seconds, according to the book " Birthing the Computer " (Cambridge Scholars Publishing, 2016)

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Presper leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor . They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers ," O'Regan wrote. In November 1949, scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, which stands for COmmon, Business-Oriented Language according to the National Museum of American History . Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT .

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect" includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute . This marks the development of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs .. The team behind UNIX continued to develop the operating system using the C programming language, which they also optimized. 

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases Magnavox Odyssey, the world's first home game console, in September 1972 , according to the Computer Museum of America . Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn with Atari release Pong, the world's first commercially successful video game. 

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1977: The Commodore Personal Electronic Transactor (PET), is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1975: The magazine cover of the January issue of "Popular Electronics" highlights the Altair 8080 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT .

Apple I computer 1976

1977: Radio Shack began its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company took 250,000 orders for the computer, according to the book " How TRS-80 Enthusiasts Helped Spark the PC Revolution " (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book " Track Changes: A Literary History of Word Processing " (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History ( NMAH ), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Superbowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH. 

1985 : As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported . Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research ( CERN ), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web. 

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity" is developed, initially covering a distance of up to 300 feet (91 meters) Wired reported . 

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum . 

2005: Google buys Android, a Linux-based mobile phone operating system

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, scatter windows away by shaking another window, easy-to-access jumplists, easier previews of tiles and more, TechRadar reported .  

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google became the first to demonstrate quantum supremacy — creating a quantum computer that could feasibly outperform the most powerful classical computer — albeit for a very specific problem with no practical real-world application. The described the computer, dubbed "Sycamore" in a paper that same year in the journal Nature . Achieving quantum advantage – in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer —  is still a ways off. 

2022: The first exascale supercomputer, and the world's fastest, Frontier, went online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at the cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. This machine ushered in the era of exascale computing, which refers to systems that can reach more than one exaFLOP of power – used to measure the performance of a system. Only one machine – Frontier – is currently capable of reaching such levels of performance. It is currently being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K . Powered by steam with a hand crank, the machine calculated a series of values and printed the results in a table. 

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second then progressed to incorporate transistor-based computing between the 50s and the 60s. In the 60s and 70s, the third generation gave rise to integrated circuit-based computing. We are now in between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer . The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's  Oak Ridge Leadership Computing Facility (OLCF) 

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been so many through the years – from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credits this app for propelling the Apple II to become the success it was, according to co-creator Dan Bricklin .

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • " A Brief History of Computing " by Gerard O'Regan (Springer, 2021)

Sign up for the Live Science daily newsletter now

Get the world’s most fascinating discoveries delivered straight to your inbox.

Timothy is Editor in Chief of print and digital magazines All About History and History of War . He has previously worked on sister magazine All About Space , as well as photography and creative brands including Digital Photographer and 3D Artist . He has also written for How It Works magazine, several history bookazines and has a degree in English Literature from Bath Spa University . 

China's upgraded light-powered 'AGI chip' is now a million times more efficient than before, researchers say

Quantum compasses closer to replacing GPS after scientists squeeze key refrigerator-sized laser system onto a microchip

Al Naslaa rock: Saudi Arabia's enigmatic sandstone block that's split perfectly down the middle

Most Popular

  • 2 AI 'hallucinations' can lead to catastrophic mistakes, but a new approach makes automated decisions more reliable
  • 3 2,200-year old battering ram from epic battle between Rome and Carthage found in Mediterranean
  • 4 Arctic expedition uncovers deep-sea microbes that may harbor the next generation of antibiotics
  • 5 SpaceX Falcon 9 rocket grounded for 2nd time in 2 months following explosive landing failure

research the history of computer

Encyclopedia Britannica

  • History & Society
  • Science & Tech
  • Biographies
  • Animals & Nature
  • Geography & Travel
  • Arts & Culture
  • Games & Quizzes
  • On This Day
  • One Good Fact
  • New Articles
  • Lifestyles & Social Issues
  • Philosophy & Religion
  • Politics, Law & Government
  • World History
  • Health & Medicine
  • Browse Biographies
  • Birds, Reptiles & Other Vertebrates
  • Bugs, Mollusks & Other Invertebrates
  • Environment
  • Fossils & Geologic Time
  • Entertainment & Pop Culture
  • Sports & Recreation
  • Visual Arts
  • Demystified
  • Image Galleries
  • Infographics
  • Top Questions
  • Britannica Kids
  • Saving Earth
  • Space Next 50
  • Student Center
  • Introduction & Top Questions

Analog computers

Mainframe computer.

  • Supercomputer
  • Minicomputer
  • Microcomputer
  • Laptop computer
  • Embedded processors
  • Central processing unit
  • Main memory
  • Secondary memory
  • Input devices
  • Output devices
  • Communication devices
  • Peripheral interfaces
  • Fabrication
  • Transistor size
  • Power consumption
  • Quantum computing
  • Molecular computing
  • Role of operating systems
  • Multiuser systems
  • Thin systems
  • Reactive systems
  • Operating system design approaches
  • Local area networks
  • Wide area networks
  • Business and personal software
  • Scientific and engineering software
  • Internet and collaborative software
  • Games and entertainment
  • Analog calculators: from Napier’s logarithms to the slide rule
  • Digital calculators: from the Calculating Clock to the Arithmometer
  • The Jacquard loom
  • The Difference Engine
  • The Analytical Engine
  • Ada Lovelace, the first programmer
  • Herman Hollerith’s census tabulator
  • Other early business machine companies
  • Vannevar Bush’s Differential Analyzer
  • Howard Aiken’s digital calculators
  • The Turing machine
  • The Atanasoff-Berry Computer
  • The first computer network
  • Konrad Zuse
  • Bigger brains
  • Von Neumann’s “Preliminary Discussion”
  • The first stored-program machines
  • Machine language
  • Zuse’s Plankalkül
  • Interpreters
  • Grace Murray Hopper
  • IBM develops FORTRAN
  • Control programs
  • The IBM 360
  • Time-sharing from Project MAC to UNIX
  • Minicomputers
  • Integrated circuits
  • The Intel 4004
  • Early computer enthusiasts
  • The hobby market expands
  • From Star Trek to Microsoft
  • Application software
  • Commodore and Tandy enter the field
  • The graphical user interface
  • The IBM Personal Computer
  • Microsoft’s Windows operating system
  • Workstation computers
  • Embedded systems
  • Handheld digital devices
  • The Internet
  • Social networking
  • Ubiquitous computing

A laptop computer

What is a computer?

Who invented the computer, what can computers do, are computers conscious, what is the impact of computer artificial intelligence (ai) on society.

Technical insides of a desktop computer

Our editors will review what you’ve submitted and determine whether to revise the article.

  • University of Rhode Island - College of Arts and Sciences - Department of Computer Science and Statistics - History of Computers
  • LiveScience - History of Computers: A Brief Timeline
  • Computer History Museum - Timeline of Computer history
  • Engineering LibreTexts - What is a computer?
  • Computer Hope - What is a Computer?
  • computer - Children's Encyclopedia (Ages 8-11)
  • computer - Student Encyclopedia (Ages 11 and up)
  • Table Of Contents

A laptop computer

A computer is a machine that can store and process information . Most computers rely on a binary system , which uses two variables, 0 and 1, to complete tasks such as storing data, calculating algorithms, and displaying information. Computers come in many different shapes and sizes, from handheld smartphones to supercomputers weighing more than 300 tons.

Many people throughout history are credited with developing early prototypes that led to the modern computer. During World War II, physicist John Mauchly , engineer J. Presper Eckert, Jr. , and their colleagues at the University of Pennsylvania designed the first programmable general-purpose electronic digital computer, the Electronic Numerical Integrator and Computer (ENIAC).

What is the most powerful computer in the world?

As of November 2021 the most powerful computer in the world is the Japanese supercomputer Fugaku, developed by RIKEN and Fujitsu . It has been used to model COVID-19 simulations.

How do programming languages work?

Popular modern programming languages , such as JavaScript and Python, work through multiple forms of programming paradigms. Functional programming, which uses mathematical functions to give outputs based on data input, is one of the more common ways code is used to provide instructions for a computer.

The most powerful computers can perform extremely complex tasks, such as simulating nuclear weapon experiments and predicting the development of climate change . The development of quantum computers , machines that can handle a large number of calculations through quantum parallelism (derived from superposition ), would be able to do even more-complex tasks.

A computer’s ability to gain consciousness is a widely debated topic. Some argue that consciousness depends on self-awareness and the ability to think , which means that computers are conscious because they recognize their environment and can process data. Others believe that human consciousness can never be replicated by physical processes. Read one researcher’s perspective.

Computer artificial intelligence's impact on society is widely debated. Many argue that AI improves the quality of everyday life by doing routine and even complicated tasks better than humans can, making life simpler, safer, and more efficient. Others argue AI poses dangerous privacy risks, exacerbates racism by standardizing people, and costs workers their jobs leading to greater unemployment. For more on the debate over artificial intelligence, visit ProCon.org .

computer , device for processing, storing, and displaying information.

Computer once meant a person who did computations, but now the term almost universally refers to automated electronic machinery . The first section of this article focuses on modern digital electronic computers and their design, constituent parts, and applications. The second section covers the history of computing. For details on computer architecture , software , and theory, see computer science .

Computing basics

The first computers were used primarily for numerical calculations. However, as any information can be numerically encoded, people soon realized that computers are capable of general-purpose information processing . Their capacity to handle large amounts of data has extended the range and accuracy of weather forecasting . Their speed has allowed them to make decisions about routing telephone connections through a network and to control mechanical systems such as automobiles, nuclear reactors, and robotic surgical tools. They are also cheap enough to be embedded in everyday appliances and to make clothes dryers and rice cookers “smart.” Computers have allowed us to pose and answer questions that were difficult to pursue in the past. These questions might be about DNA sequences in genes, patterns of activity in a consumer market, or all the uses of a word in texts that have been stored in a database . Increasingly, computers can also learn and adapt as they operate by using processes such as machine learning .

Technician operates the system console on the new UNIVAC 1100/83 computer at the Fleet Analysis Center, Corona Annex, Naval Weapons Station, Seal Beach, CA. June 1, 1981. Univac magnetic tape drivers or readers in background. Universal Automatic Computer

Computers also have limitations, some of which are theoretical. For example, there are undecidable propositions whose truth cannot be determined within a given set of rules, such as the logical structure of a computer. Because no universal algorithmic method can exist to identify such propositions, a computer asked to obtain the truth of such a proposition will (unless forcibly interrupted) continue indefinitely—a condition known as the “ halting problem .” ( See Turing machine .) Other limitations reflect current technology . For example, although computers have progressed greatly in terms of processing data and using artificial intelligence algorithms , they are limited by their incapacity to think in a more holistic fashion. Computers may imitate humans—quite effectively, even—but imitation may not replace the human element in social interaction. Ethical concerns also limit computers, because computers rely on data, rather than a moral compass or human conscience , to make decisions.

Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components ( see differential analyzer and integrator ), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.

research the history of computer

One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem. Another advantage is that analog computers can frequently represent and solve a problem in “real time”; that is, the computation proceeds at the same rate as the system being modeled by it. Their main disadvantages are that analog representations are limited in precision—typically a few decimal places but fewer in complex mechanisms—and general-purpose devices are expensive and not easily programmed.

Digital computers

In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s ( binary digits, or bits). The modern era of digital computers began in the late 1930s and early 1940s in the United States , Britain, and Germany . The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage. For historical developments, see the section Invention of the modern computer .

During the 1950s and ’60s, Unisys (maker of the UNIVAC computer), International Business Machines Corporation (IBM), and other companies made large, expensive computers of increasing power . They were used by major corporations and government research laboratories, typically as the sole computer in the organization. In 1959 the IBM 1401 computer rented for $8,000 per month (early IBM machines were almost always leased rather than sold), and in 1964 the largest IBM S/360 computer cost several million dollars.

These computers came to be called mainframes, though the term did not become common until smaller computers were built. Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities. They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer. Other users submitted “batch jobs” to be run one at a time on the mainframe.

Such systems remain important today, though they are no longer the sole, or even primary, central computing resource of an organization, which will typically have hundreds or thousands of personal computers (PCs). Mainframes now provide high-capacity data storage for Internet servers, or, through time-sharing techniques, they allow hundreds or thousands of users to run programs simultaneously. Because of their current roles, these computers are now called servers rather than mainframes.

  • Random article
  • Teaching guide
  • Privacy & cookies

A model of a Babbage-style Difference Engine at the Computer History Museum. Photo by Cory Doctorow.

A brief history of computers

by Chris Woodford . Last updated: January 19, 2023.

C omputers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same.

Read on to learn more about the history of computers—or take a look at our article on how computers work .

Listen instead... or scroll to keep reading

Photo: A model of one of the world's first computers (the Difference Engine invented by Charles Babbage) at the Computer History Museum in Mountain View, California, USA. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Cogs and Calculators

It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French scientist and philosopher Blaise Pascal (1623–1666) invented the first practical mechanical calculator , the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs ( gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for 300 hundred years. The Leibniz machine could do much more than Pascal's: as well as adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature was the first memory store or "register."

Apart from developing one of the world's earliest mechanical calculators, Leibniz is remembered for another important contribution to computing: he was the man who invented binary code, a way of representing any decimal number using only the two digits zero and one. Although Leibniz made no use of binary in his own calculator, it set others thinking. In 1854, a little over a century after Leibniz had died, Englishman George Boole (1815–1864) used the idea to invent a new branch of mathematics called Boolean algebra. [1] In modern computers, binary code and Boolean algebra allow computers to make simple decisions by comparing long strings of zeros and ones. But, in the 19th century, these ideas were still far ahead of their time. It would take another 50–100 years for mathematicians and computer scientists to figure out how to use them (find out more in our articles about calculators and logic gates ).

Artwork: Pascaline: Two details of Blaise Pascal's 17th-century calculator. Left: The "user interface": the part where you dial in numbers you want to calculate. Right: The internal gear mechanism. Picture courtesy of US Library of Congress .

Engines of Calculation

Neither the abacus, nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers. A calculator is a device that makes it quicker and easier for people to do sums—but it needs a human operator. A computer, on the other hand, is a machine that can operate automatically, without any human help, by following a series of stored instructions called a program (a kind of mathematical recipe). Calculators evolved into computers when people devised ways of making entirely automatic, programmable calculators.

Photo: Punched cards: Herman Hollerith perfected the way of using punched cards and paper tape to store information and feed it into a machine. Here's a drawing from his 1889 patent Art of Compiling Statistics (US Patent#395,782), showing how a strip of paper (yellow) is punched with different patterns of holes (orange) that correspond to statistics gathered about people in the US census. Picture courtesy of US Patent and Trademark Office.

The first person to attempt this was a rather obsessive, notoriously grumpy English mathematician named Charles Babbage (1791–1871). Many regard Babbage as the "father of the computer" because his machines had an input (a way of feeding in numbers), a memory (something to store these numbers while complex calculations were taking place), a processor (the number-cruncher that carried out the calculations), and an output (a printing mechanism)—the same basic components shared by all modern computers. During his lifetime, Babbage never completed a single one of the hugely ambitious machines that he tried to build. That was no surprise. Each of his programmable "engines" was designed to use tens of thousands of precision-made gears. It was like a pocket watch scaled up to the size of a steam engine , a Pascal or Leibniz machine magnified a thousand-fold in dimensions, ambition, and complexity. For a time, the British government financed Babbage—to the tune of £17,000, then an enormous sum. But when Babbage pressed the government for more money to build an even more advanced machine, they lost patience and pulled out. Babbage was more fortunate in receiving help from Augusta Ada Byron (1815–1852), Countess of Lovelace, daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to refine Babbage's ideas for making his machine programmable—and this is why she is still, sometimes, referred to as the world's first computer programmer. [2] Little of Babbage's work survived after his death. But when, by chance, his notebooks were rediscovered in the 1930s, computer scientists finally appreciated the brilliance of his ideas. Unfortunately, by then, most of these ideas had already been reinvented by others.

Artwork: Charles Babbage (1791–1871). Picture from The Illustrated London News, 1871, courtesy of US Library of Congress .

Babbage had intended that his machine would take the drudgery out of repetitive calculations. Originally, he imagined it would be used by the army to compile the tables that helped their gunners to fire cannons more accurately. Toward the end of the 19th century, other inventors were more successful in their effort to construct "engines" of calculation. American statistician Herman Hollerith (1860–1929) built one of the world's first practical calculating machines, which he called a tabulator, to help compile census data. Then, as now, a census was taken each decade but, by the 1880s, the population of the United States had grown so much through immigration that a full-scale analysis of the data by hand was taking seven and a half years. The statisticians soon figured out that, if trends continued, they would run out of time to compile one census before the next one fell due. Fortunately, Hollerith's tabulator was an amazing success: it tallied the entire census in only six weeks and completed the full analysis in just two and a half years. Soon afterward, Hollerith realized his machine had other applications, so he set up the Tabulating Machine Company in 1896 to manufacture it commercially. A few years later, it changed its name to the Computing-Tabulating-Recording (C-T-R) company and then, in 1924, acquired its present name: International Business Machines (IBM).

Photo: Keeping count: Herman Hollerith's late-19th-century census machine (blue, left) could process 12 separate bits of statistical data each minute. Its compact 1940 replacement (red, right), invented by Eugene M. La Boiteaux of the Census Bureau, could work almost five times faster. Photo by Harris & Ewing courtesy of US Library of Congress .

Bush and the bomb

Photo: Dr Vannevar Bush (1890–1974). Picture by Harris & Ewing, courtesy of US Library of Congress .

The history of computing remembers colorful characters like Babbage, but others who played important—if supporting—roles are less well known. At the time when C-T-R was becoming IBM, the world's most powerful calculators were being developed by US government scientist Vannevar Bush (1890–1974). In 1925, Bush made the first of a series of unwieldy contraptions with equally cumbersome names: the New Recording Product Integraph Multiplier. Later, he built a machine called the Differential Analyzer, which used gears, belts, levers, and shafts to represent numbers and carry out calculations in a very physical way, like a gigantic mechanical slide rule. Bush's ultimate calculator was an improved machine named the Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200 miles) of wire and 150 electric motors . Machines like these were known as analog calculators—analog because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather than as digits. Although they could carry out incredibly complex calculations, it took several days of wheel cranking and belt turning before the results finally emerged.

Impressive machines like the Differential Analyzer were only one of several outstanding contributions Bush made to 20th-century technology. Another came as the teacher of Claude Shannon (1916–2001), a brilliant mathematician who figured out how electrical circuits could be linked together to process binary code with Boolean algebra (a way of comparing binary numbers using logic) and thus make simple decisions. During World War II, President Franklin D. Roosevelt appointed Bush chairman first of the US National Defense Research Committee and then director of the Office of Scientific Research and Development (OSRD). In this capacity, he was in charge of the Manhattan Project, the secret $2-billion initiative that led to the creation of the atomic bomb. One of Bush's final wartime contributions was to sketch out, in 1945, an idea for a memory-storing and sharing device called Memex that would later inspire Tim Berners-Lee to invent the World Wide Web . [3] Few outside the world of computing remember Vannevar Bush today—but what a legacy! As a father of the digital computer, an overseer of the atom bomb, and an inspiration for the Web, Bush played a pivotal role in three of the 20th-century's most far-reaching technologies.

Photo: "A gigantic mechanical slide rule": A differential analyzer pictured in 1938. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

Turing—tested

The first modern computers.

The World War II years were a crucial period in the history of computing, when powerful gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary computer, in his parents' living room. [4] The following year, American physicist John Atanasoff (1903–1995) and his assistant, electrical engineer Clifford Berry (1918–1963), built a more elaborate binary machine that they named the Atanasoff Berry Computer (ABC). It was a great advance—1000 times more accurate than Bush's Differential Analyzer. These were the first machines that used electrical switches to store numbers: when a switch was "off", it stored the number zero; flipped over to its other, "on", position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits (although binary is much less efficient in this respect than decimal, since it takes up to eight binary digits to store a three-digit decimal number). These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.

The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built by mathematician Howard Aiken (1900–1973). Sponsored by IBM, it was variously known as the Harvard Mark I or the IBM Automatic Sequence Controlled Calculator (ASCC). A giant of a machine, stretching 15m (50ft) in length, it was like a huge mechanical calculator built into a wall. It must have sounded impressive, because it stored and processed numbers using "clickety-clack" electromagnetic relays (electrically operated magnets that automatically switched lines in telephone exchanges)—no fewer than 3304 of them. Impressive they may have been, but relays suffered from several problems: they were large (that's why the Harvard Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and they were slow (it took time for a relay to flip from "off" to "on" or from 0 to 1).

Photo: An analog computer being used in military research in 1949. Picture courtesy of NASA on the Commons (where you can download a larger version.

Most of the machines developed around this time were intended for military purposes. Like Babbage's never-built mechanical engines, they were designed to calculate artillery firing tables and chew through the other complex chores that were then the lot of military mathematicians. During World War II, the military co-opted thousands of the best scientific minds: recognizing that science would win the war, Vannevar Bush's Office of Scientific Research and Development employed 10,000 scientists from the United States alone. Things were very different in Germany. When Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need—and turned him down.

On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians based at Bletchley Park near London, England (including Alan Turing) built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve). The vacuum tube, each one about as big as a person's thumb (earlier ones were very much bigger) and glowing red hot like a tiny electric light bulb, had been invented in 1906 by Lee de Forest (1873–1961), who named it the Audion. This breakthrough earned de Forest his nickname as "the father of radio" because their first major use was in radio receivers , where they amplified weak incoming signals so people could hear them more clearly. [5] In computers such as the ABC and Colossus, vacuum tubes found an alternative use as faster and more compact switches.

Just like the codes it was trying to crack, Colossus was top-secret and its existence wasn't confirmed until after the war ended. As far as most people were concerned, vacuum tubes were pioneered by a more visible computer that appeared in 1946: the Electronic Numerical Integrator And Calculator (ENIAC). The ENIAC's inventors, two scientists from the University of Pennsylvania, John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), were originally inspired by Bush's Differential Analyzer; years later Eckert recalled that ENIAC was the "descendant of Dr Bush's machine." But the machine they constructed was far more ambitious. It contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer. Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.

Photo: Sir Maurice Wilkes (left), his collaborator William Renwick, and the early EDSAC-1 electronic computer they built in Cambridge, pictured around 1947/8. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

ENIAC was just the beginning. Its two inventors formed the Eckert Mauchly Computer Corporation in the late 1940s. Working with a brilliant Hungarian mathematician, John von Neumann (1903–1957), who was based at Princeton University, they then designed a better machine called EDVAC (Electronic Discrete Variable Automatic Computer). In a key piece of work, von Neumann helped to define how the machine stored and processed its programs, laying the foundations for how all modern computers operate. [6] After EDVAC, Eckert and Mauchly developed UNIVAC 1 (UNIVersal Automatic Computer) in 1951. They were helped in this task by a young, largely unknown American mathematician and Naval reserve named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I. Like Herman Hollerith's tabulator over 50 years before, UNIVAC 1 was used for processing data from the US census. It was then manufactured for other users—and became the world's first large-scale commercial computer.

Machines like Colossus, the ENIAC, and the Harvard Mark I compete for significance and recognition in the minds of computer historians. Which one was truly the first great modern computer? All of them and none: these—and several other important machines—evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. Among those other machines were pioneering computers put together by English academics, notably the Manchester/Ferranti Mark I, built at Manchester University by Frederic Williams (1911–1977) and Thomas Kilburn (1921–2001), and the EDSAC (Electronic Delay Storage Automatic Calculator), built by Maurice Wilkes (1913–2010) at Cambridge University. [7]

Photo: Control panel of the UNIVAC 1, the world's first large-scale commercial computer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

The microelectronic revolution

Vacuum tubes were a considerable advance on relay switches, but machines like the ENIAC were notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug." Popular legend has it that this word entered the vocabulary of computer programmers sometime in the 1950s when moths, attracted by the glowing lights of vacuum tubes, flew inside machines like the ENIAC, caused a short circuit, and brought work to a juddering halt. But there were other problems with vacuum tubes too. They consumed enormous amounts of power: the ENIAC used about 2000 times as much electricity as a modern laptop. And they took up huge amounts of space. Military needs were driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now become a real problem. ABC had used 300 vacuum tubes, Colossus had 2000, and the ENIAC had 18,000. The ENIAC's designers had boasted that its calculating speed was "at least 500 times as great as that of any other existing computing machine." But developing computers that were an order of magnitude more powerful still would have needed hundreds of thousands or even millions of vacuum tubes—which would have been far too costly, unwieldy, and unreliable. So a new technology was urgently required.

The solution appeared in 1947 thanks to three physicists working at Bell Telephone Laboratories (Bell Labs). John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so the electrical signals that carried phone calls could be amplified more easily and carried further. Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube. When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor. Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since.

Like vacuum tubes, transistors could be used as amplifiers or as switches. But they had several major advantages. They were a fraction the size of vacuum tubes (typically about as big as a pea), used no power at all unless they were in operation, and were virtually 100 percent reliable. The transistor was one of the most important breakthroughs in the history of computing and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics . By that time, however, the three men had already gone their separate ways. John Bardeen had begun pioneering research into superconductivity , which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs.

William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further. His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (1929–). It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1956, eight of them—including Noyce and Moore—left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road. Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since. [8]

It was in Fairchild's California building that the next breakthrough occurred—although, somewhat curiously, it also happened at exactly the same time in the Dallas laboratories of Texas Instruments. In Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand wired to connect all these components together. That process was laborious, costly, and error prone. Wouldn't it be better, Kilby reflected, if many transistors could be made in a single package? This prompted him to invent the "monolithic" integrated circuit (IC) , a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor. Kilby's invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea at Fairchild in California. Noyce went one better, however: he found a way to include the connections between components in an integrated circuit, thus automating the entire process.

Photo: An integrated circuit from the 1980s. This is an EPROM chip (effectively a forerunner of flash memory , which you could only erase with a blast of ultraviolet light).

Mainframes, minis, and micros

Photo: An IBM 704 mainframe pictured at NASA in 1958. Designed by Gene Amdahl, this scientific number cruncher was the successor to the 701 and helped pave the way to arguably the most important IBM computer of all time, the System/360, which Amdahl also designed. Photo courtesy of NASA .

Photo: The control panel of DEC's classic 1965 PDP-8 minicomputer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Integrated circuits, as much as transistors, helped to shrink computers during the 1960s. In 1943, IBM boss Thomas Watson had reputedly quipped: "I think there is a world market for about five computers." Just two decades later, the company and its competitors had installed around 25,000 large computer systems across the United States. As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then very large-scale integrated (VLSI), when the same chip could contain thousands of components.

The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. In 1968, Robert Noyce and Gordon Moore had left Fairchild to establish a new company of their own. With integration very much in their minds, they called it Integrated Electronics or Intel for short. Originally they had planned to make memory chips, but when the company landed an order to make chips for a range of pocket calculators, history headed in a different direction. A couple of their engineers, Federico Faggin (1941–) and Marcian Edward (Ted) Hoff (1937–), realized that instead of making a range of specialist chips for a range of calculators, they could make a universal chip that could be programmed to work in them all. Thus was born the general-purpose, single chip computer or microprocessor—and that brought about the next phase of the computer revolution.

Personal computers

By 1974, Intel had launched a popular microprocessor known as the 8080 and computer hobbyists were soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts . With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian electronics wizard name Steve Wozniak (1950–) to develop a computer of his own. "Woz" is often described as the hacker's "hacker"—a technically brilliant and highly creative engineer who pushed the boundaries of computing largely for his own amusement. In the mid-1970s, he was working at the Hewlett-Packard computer company in California, and spending his free time tinkering away as a member of the Homebrew Computer Club in the Bay Area.

After seeing the Altair, Woz used a 6502 microprocessor (made by an Intel rival, Mos Technology) to build a better home computer of his own: the Apple I. When he showed off his machine to his colleagues at the club, they all wanted one too. One of his friends, Steve Jobs (1955–2011), persuaded Woz that they should go into business making the machine. Woz agreed so, famously, they set up Apple Computer Corporation in a garage belonging to Jobs' parents. After selling 175 of the Apple I for the devilish price of $666.66, Woz built a much better machine called the Apple ][ (pronounced "Apple Two"). While the Altair 8800 looked like something out of a science lab, and the Apple I was little more than a bare circuit board, the Apple ][ took its inspiration from such things as Sony televisions and stereos: it had a neat and friendly looking cream plastic case. Launched in April 1977, it was the world's first easy-to-use home "microcomputer." Soon home users, schools, and small businesses were buying the machine in their tens of thousands—at $1298 a time. Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of the machine, quickly accelerating out of Jobs' garage to become one of the world's biggest companies. Dozens of other microcomputers were launched around this time, including the TRS-80 from Radio Shack (Tandy in the UK) and the Commodore PET. [9]

Apple's success selling to businesses came as a great shock to IBM and the other big companies that dominated the computer industry. It didn't take a VisiCalc spreadsheet to figure out that, if the trend continued, upstarts like Apple would undermine IBM's immensely lucrative business market selling "Big Blue" computers. In 1980, IBM finally realized it had to do something and launched a highly streamlined project to save its business. One year later, it released the IBM Personal Computer (PC), based on an Intel 8080 microprocessor, which rapidly reversed the company's fortunes and stole the market back from Apple.

The PC was successful essentially for one reason. All the dozens of microcomputers that had been launched in the 1970s—including the Apple ][—were incompatible. All used different hardware and worked in different ways. Most were programmed using a simple, English-like language called BASIC, but each one used its own flavor of BASIC, which was tied closely to the machine's hardware design. As a result, programs written for one machine would generally not run on another one without a great deal of conversion. Companies who wrote software professionally typically wrote it just for one machine and, consequently, there was no software industry to speak of.

In 1976, Gary Kildall (1942–1994), a teacher and computer scientist, and one of the founders of the Homebrew Computer Club, had figured out a solution to this problem. Kildall wrote an operating system (a computer's fundamental control software) called CP/M that acted as an intermediary between the user's programs and the machine's hardware. With a stroke of genius, Kildall realized that all he had to do was rewrite CP/M so it worked on each different machine. Then all those machines could run identical user programs—without any modification at all—inside CP/M. That would make all the different microcomputers compatible at a stroke. By the early 1980s, Kildall had become a multimillionaire through the success of his invention: the first personal computer operating system. Naturally, when IBM was developing its personal computer, it approached him hoping to put CP/M on its own machine. Legend has it that Kildall was out flying his personal plane when IBM called, so missed out on one of the world's greatest deals. But the truth seems to have been that IBM wanted to buy CP/M outright for just $200,000, while Kildall recognized his product was worth millions more and refused to sell. Instead, IBM turned to a young programmer named Bill Gates (1955–). His then tiny company, Microsoft, rapidly put together an operating system called DOS, based on a product called QDOS (Quick and Dirty Operating System), which they acquired from Seattle Computer Products. Some believe Microsoft and IBM cheated Kildall out of his place in computer history; Kildall himself accused them of copying his ideas. Others think Gates was simply the shrewder businessman. Either way, the IBM PC, powered by Microsoft's operating system, was a runaway success.

Yet IBM's victory was short-lived. Cannily, Bill Gates had sold IBM the rights to one flavor of DOS (PC-DOS) and retained the rights to a very similar version (MS-DOS) for his own use. When other computer manufacturers, notably Compaq and Dell, starting making IBM-compatible (or "cloned") hardware, they too came to Gates for the software. IBM charged a premium for machines that carried its badge, but consumers soon realized that PCs were commodities: they contained almost identical components—an Intel microprocessor, for example—no matter whose name they had on the case. As IBM lost market share, the ultimate victors were Microsoft and Intel, who were soon supplying the software and hardware for almost every PC on the planet. Apple, IBM, and Kildall made a great deal of money—but all failed to capitalize decisively on their early success. [10]

Photo: Personal computers threatened companies making large "mainframes" like this one. Picture courtesy of NASA on the Commons (where you can download a larger version).

The user revolution

Fortunately for Apple, it had another great idea. One of the Apple II's strongest suits was its sheer "user-friendliness." For Steve Jobs, developing truly easy-to-use computers became a personal mission in the early 1980s. What truly inspired him was a visit to PARC (Palo Alto Research Center), a cutting-edge computer laboratory then run as a division of the Xerox Corporation. Xerox had started developing computers in the early 1970s, believing they would make paper (and the highly lucrative photocopiers Xerox made) obsolete. One of PARC's research projects was an advanced $40,000 computer called the Xerox Alto. Unlike most microcomputers launched in the 1970s, which were programmed by typing in text commands, the Alto had a desktop-like screen with little picture icons that could be moved around with a mouse: it was the very first graphical user interface (GUI, pronounced "gooey")—an idea conceived by Alan Kay (1940–) and now used in virtually every modern computer. The Alto borrowed some of its ideas, including the mouse , from 1960s computer pioneer Douglas Engelbart (1925–2013).

Photo: During the 1980s, computers started to converge on the same basic "look and feel," largely inspired by the work of pioneers like Alan Kay and Douglas Engelbart. Photographs in the Carol M. Highsmith Archive, courtesy of US Library of Congress , Prints and Photographs Division.

Back at Apple, Jobs launched his own version of the Alto project to develop an easy-to-use computer called PITS (Person In The Street). This machine became the Apple Lisa, launched in January 1983—the first widely available computer with a GUI desktop. With a retail price of $10,000, over three times the cost of an IBM PC, the Lisa was a commercial flop. But it paved the way for a better, cheaper machine called the Macintosh that Jobs unveiled a year later, in January 1984. With its memorable launch ad for the Macintosh inspired by George Orwell's novel 1984 , and directed by Ridley Scott (director of the dystopic movie Blade Runner ), Apple took a swipe at IBM's monopoly, criticizing what it portrayed as the firm's domineering—even totalitarian—approach: Big Blue was really Big Brother. Apple's ad promised a very different vision: "On January 24, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984'." The Macintosh was a critical success and helped to invent the new field of desktop publishing in the mid-1980s, yet it never came close to challenging IBM's position.

Ironically, Jobs' easy-to-use machine also helped Microsoft to dislodge IBM as the world's leading force in computing. When Bill Gates saw how the Macintosh worked, with its easy-to-use picture-icon desktop, he launched Windows, an upgraded version of his MS-DOS software. Apple saw this as blatant plagiarism and filed a $5.5 billion copyright lawsuit in 1988. Four years later, the case collapsed with Microsoft effectively securing the right to use the Macintosh "look and feel" in all present and future versions of Windows. Microsoft's Windows 95 system, launched three years later, had an easy-to-use, Macintosh-like desktop and MS-DOS running behind the scenes.

Photo: The IBM Blue Gene/P supercomputer at Argonne National Laboratory: one of the world's most powerful computers. Picture courtesy of Argonne National Laboratory published on Wikimedia Commons in 2009 under a Creative Commons Licence .

From nets to the Internet

Standardized PCs running standardized software brought a big benefit for businesses: computers could be linked together into networks to share information. At Xerox PARC in 1973, electrical engineer Bob Metcalfe (1946–) developed a new way of linking computers "through the ether" (empty space) that he called Ethernet. A few years later, Metcalfe left Xerox to form his own company, 3Com, to help companies realize "Metcalfe's Law": computers become useful the more closely connected they are to other people's computers. As more and more companies explored the power of local area networks (LANs), so, as the 1980s progressed, it became clear that there were great benefits to be gained by connecting computers over even greater distances—into so-called wide area networks (WANs).

Photo: Computers aren't what they used to be: they're much less noticeable because they're much more seamlessly integrated into everyday life. Some are "embedded" into household gadgets like coffee makers or televisions . Others travel round in our pockets in our smartphones—essentially pocket computers that we can program simply by downloading "apps" (applications).

Today, the best known WAN is the Internet —a global network of individual computers and LANs that links up hundreds of millions of people. The history of the Internet is another story, but it began in the 1960s when four American universities launched a project to connect their computer systems together to make the first WAN. Later, with funding for the Department of Defense, that network became a bigger project called ARPANET (Advanced Research Projects Agency Network). In the mid-1980s, the US National Science Foundation (NSF) launched its own WAN called NSFNET. The convergence of all these networks produced what we now call the Internet later in the 1980s. Shortly afterward, the power of networking gave British computer programmer Tim Berners-Lee (1955–) his big idea: to combine the power of computer networks with the information-sharing idea Vannevar Bush had proposed in 1945. Thus, was born the World Wide Web —an easy way of sharing information over a computer network, which made possible the modern age of cloud computing (where anyone can access vast computing power over the Internet without having to worry about where or how their data is processed). It's Tim Berners-Lee's invention that brings you this potted history of computing today!

And now where?

If you liked this article..., don't want to read our articles try listening instead, find out more, on this site.

  • Supercomputers : How do the world's most powerful computers work?

Other websites

There are lots of websites covering computer history. Here are a just a few favorites worth exploring!

  • The Computer History Museum : The website of the world's biggest computer museum in California.
  • The Computing Age : A BBC special report into computing past, present, and future.
  • Charles Babbage at the London Science Museum : Lots of information about Babbage and his extraordinary engines. [Archived via the Wayback Machine]
  • IBM History : Many fascinating online exhibits, as well as inside information about the part IBM inventors have played in wider computer history.
  • Wikipedia History of Computing Hardware : covers similar ground to this page.
  • Computer history images : A small but interesting selection of photos.
  • Transistorized! : The history of the invention of the transistor from PBS.
  • Intel Museum : The story of Intel's contributions to computing from the 1970s onward.

There are some superb computer history videos on YouTube and elsewhere; here are three good ones to start you off:

  • The Difference Engine : A great introduction to Babbage's Difference Engine from Doron Swade, one of the world's leading Babbage experts.
  • The ENIAC : A short Movietone news clip about the completion of the world's first programmable electronic computer.
  • A tour of the Computer History Museum : Dag Spicer gives us a tour of the world's most famous computer museum, in California.

For older readers

For younger readers.

Text copyright © Chris Woodford 2006, 2023. All rights reserved. Full copyright notice and terms of use .

Rate this page

Tell your friends, cite this page, more to explore on our website....

  • Get the book
  • Send feedback

History of Computers logo

History of Computers: A Brief Timeline

Discover the fascinating history with our history of computers timeline, featuring key hardware breakthroughs from the earliest developments to recent innovations. Explore milestones such as Charles Babbage’s Analytical Engine, ENIAC, the transistor’s invention, the IBM PC’s introduction, and the revolutionary impact of artificial intelligence. This timeline highlights significant advancements that have shaped the evolution of computer technology and provides insights into how these innovations continue to influence our world today.

Charles Babbage’s Analytical Engine

Conceived in the early 1830s, the Analytical Engine represented a monumental leap in computational design, laying the groundwork for modern computing. An accomplished mathematician and mechanical engineer, Charles Babbage envisaged a machine capable of performing any arithmetic operation through programmable instructions. This concept was revolutionary for its time. Babbage’s initial… Read More

Konrad Zuse’s Z1

Konrad Zuse’s Z1 is a monumental achievement in the history of computing, marking the advent of programmable, mechanical computing devices. Constructed between 1936 and 1938 in Zuse’s parents’ living room, the Z1 was the first in a series of computers that would ultimately revolutionize the field of computer science. The… Read More

Steve Jurvetson from Menlo Park, USA, CC BY 2.0, via Wikimedia Commons.

In 1941, a significant milestone in the history of computing was achieved with the creation of the Atanasoff-Berry Computer (ABC), recognized as one of the first electronic digital computers. Developed by physicist and mathematician John Atanasoff and his graduate student, Clifford Berry, at Iowa State College (now Iowa State University),… Read More

Colossus

The period from 1943 to 1944 marked a significant milestone in the history of computing with the development of Colossus, one of the first programmable digital computers. British codebreakers primarily used this groundbreaking machine at Bletchley Park, the United Kingdom’s central site for cryptographic efforts during World War II. Colossus… Read More

ENIAC in Philadelphia, Pennsylvania. Glen Beck (background) and Betty Snyder (foreground) program the ENIAC in building 328 at the Ballistic Research Laboratory

In 1946, the world witnessed a monumental leap in computational technology with the unveiling of ENIAC (Electronic Numerical Integrator and Computer), heralded as the first general-purpose electronic digital computer. Developed by John W. Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC addressed the complex calculations required for… Read More

In 1997 Lucent Technologies created this replica to commemorate the 50th anniversary of the invention of the point-contact transistor at Bell Labs in December 1947.

The year 1947 marked a pivotal moment in the history of technology with the invention of the transistor by John Bardeen, Walter Brattain, and William Shockley at Bell Laboratories. This groundbreaking development effectively replaced vacuum tubes, which had been the cornerstone of electronic circuits up until that time. Vacuum tubes,… Read More

U.S. Census Bureau employees tabulate data using one of the agency’s UNIVAC computers, ca. 1960.

The year 1951 marked a significant milestone in the history of computing with the introduction of the UNIVAC I (Universal Automatic Computer I), the first commercially produced computer. Developed by J. Presper Eckert and John Mauchly, the UNIVAC I revolutionized data processing and set the stage for the modern computing… Read More

IBM 305 RAMAC system: IBM 305 main system (Processing unit, magnetic process drum, magnetic core register, electronic logical and arithmetic circuits), IBM 370 printer, IBM 380 console.

In the annals of computing history, 1956 stands out as a landmark year with the introduction of the IBM 305 RAMAC, the first computer to feature a hard disk drive. This pivotal innovation revolutionized data storage and retrieval, laying the foundation for modern data management systems. The IBM 305 RAMAC,… Read More

In 1959, Robert Noyce co-invented the integrated circuit.

In 1958, the groundbreaking invention of the integrated circuit by Jack Kilby and Robert Noyce marked a pivotal moment in the history of technology, revolutionizing the field of electronics and paving the way for the development of more compact and powerful computers. Before this innovation, electronic circuits were constructed using… Read More

DEC PDP-1 Demo Lab at Mountain View's Computer History Museum

In 1960, the Digital Equipment Corporation (DEC) introduced the PDP-1 (Programmed Data Processor-1), an early minicomputer that would become a milestone in the history of computing. The PDP-1 was revolutionary for its time, representing a significant departure from the monolithic, room-filling mainframes that dominated the computing landscape of the era.… Read More

U.S. Department of Agriculture (USDA) Statistical Reporting Service (SRS) Administrator Harry Trelogan looks on as Agriculture Secretary Orville Freeman tests out some of the functions of the IBM 360 computer. The Washington Data Processing Center officially opened on April 1, 1966 and uses the new IBM 360 computer to process data for all U.S. Department of Agriculture (USDA) agency programs. Photo courtesy of the National Archives and Records Administration. Note: This is a Model 40.

In 1964, IBM introduced the System/360, a groundbreaking family of computers that revolutionized the landscape of mainframe computing and established a new standard in the industry. Before the System/360 launch, the computer market was fragmented, with each system often requiring unique software and peripherals. This created significant inefficiencies and increased… Read More

Intel 4004 processor (open), manufactured by the Intel Corporation, United States, 1971.

In 1971, introducing the Intel 4004 microprocessor marked a pivotal moment in the annals of technological innovation, setting the stage for the microcomputer revolution that would transform industries and societies worldwide. As the first commercially available microprocessor, the Intel 4004 integrated the core functions of a central processing unit (CPU)… Read More

Xerox Alto computer

The year 1973 marked a significant milestone in the history of computing with the advent of the Xerox Alto, recognized as the first computer designed with a graphical user interface (GUI). Developed at Xerox’s Palo Alto Research Center (PARC), the Alto revolutionized how humans interacted with computers. Before its inception,… Read More

Altair 8800

In 1975, the Altair 8800 emerged as the first commercially successful personal computer, marking a significant milestone in the evolution of computing technology. Developed by Micro Instrumentation and Telemetry Systems (MITS) and designed by Ed Roberts, the Altair 8800 was a groundbreaking product that democratized computing by making it accessible… Read More

Apple I computer formerly owned by Joey Copson (d. 2003) now on display at the Deutsches Museum.

In 1976, a momentous chapter began in the annals of technology with the introduction of the Apple I, the inaugural product from Apple, Inc. This product not only marked the genesis of a company that would become a global powerhouse in personal computing but also signaled a paradigm shift in… Read More

HomeComputerMuseum - Commodore PET 2001

The year 1977 marked a significant milestone in the history of personal computing with the introduction of the Commodore PET, one of the first all-in-one home computers. For the Personal Electronic Transactor, the Commodore PET revolutionized the market by offering an integrated design that combined the keyboard, monitor, and data… Read More

The Sinclair ZX80.

In 1980, the introduction of the Sinclair ZX80 marked a pivotal moment in the history of personal computing, particularly in the United Kingdom. Developed by Sinclair Research Ltd., this compact and affordable home computer was among the first to make computing accessible to the general public. Before the ZX80, computers… Read More

IBM Personal Computer

The year 1981 marked a significant milestone in the history of computing with the release of the IBM Personal Computer (IBM PC). This groundbreaking product set the standard for PC architecture and revolutionized the industry. Developed by International Business Machines Corporation (IBM), the IBM PC was officially launched on August… Read More

Commodore 64

In 1982, the world witnessed the launch of the Commodore 64, a personal computer that would become one of the best-selling computers of all time. This pioneering machine, developed by Commodore Business Machines Inc., revolutionized the home computing industry with its remarkable blend of affordability, functionality, and expansive software library.… Read More

Steve Jobs and Macintosh computer, January 1984, by Bernard Gotfryd

In 1984, Apple Inc. revolutionized the personal computing industry by introducing the Macintosh, the first mass-market personal computer featuring a graphical user interface (GUI) and a mouse. This groundbreaking product represented a significant departure from the text-based interfaces that dominated the era, making computing more accessible and intuitive for the… Read More

Microsoft Windows 1.0

In 1985, Microsoft Corporation launched the first version of its Windows operating system, a milestone that would eventually transform the landscape of personal computing. This inaugural release, known as Windows 1.0, marked Microsoft’s initial foray into providing a graphical user interface (GUI) for IBM-compatible PCs, which were predominantly reliant on… Read More

This NeXT workstation (a NeXTcube, monitor Cern 57503) was used by Tim Berners-Lee as the first Web server on the World Wide Web. It is shown here as displayed in 2005 at Microcosm, the public science museum at CERN where Berners-Lee was working in 1991 when he invented the Web.

In 1989, a groundbreaking invention by Tim Berners-Lee transformed the landscape of global communication and information sharing—the World Wide Web. As a visionary software engineer working at CERN, the European Organization for Nuclear Research, Berners-Lee proposed a system that would allow for the seamless exchange of information across a network… Read More

Messages from the Linux kernel 3.0.0 booting, from Debian sid i386. The numbers to the left are timestamps (relative times are used as the clock will be set later). Prefixes with a colon refer to modules/subsystems.

In 1991, a pivotal moment in the history of computing occurred when Linus Torvalds, a Finnish computer science student, released the Linux Kernel. This release marked the inception of what would become one of the world’s most influential open-source operating systems. Torvalds initially developed The Linux Kernel as a personal… Read More

Intel Pentium Microprocessor

In 1993, Intel Corporation introduced the Pentium processor, a groundbreaking advancement in microprocessor technology that quickly became the standard for personal computers. The Pentium processor marked a significant leap forward from its predecessor, the 80486, by incorporating several key innovations that enhanced performance, efficiency, and functionality. One of its most… Read More

CDROM drive Dell.

In 1994, the Compact Disc-Read Only Memory (CD-ROM) established itself as a ubiquitous data storage and software distribution standard. This optical disc format, introduced in the early 1980s, gained significant traction during the 1990s as a cost-effective and reliable medium for disseminating large amounts of data. The CD-ROM’s capacity to… Read More

Palm Pilot

In 1996, the introduction of the Palm Pilot marked a significant turning point in personal technology, as it popularized the concept of handheld computing. This pioneering device, developed by Palm Inc., was far more than just a digital organizer; it was a harbinger of the mobile computing revolution that would… Read More

Apple iPod family

The year 2001 marked a significant milestone in digital music with the introduction of the Apple iPod. This innovative device revolutionized how music was stored, accessed, and enjoyed, fundamentally altering the landscape of the music industry. Before the iPod, music enthusiasts were largely dependent on bulkier CD players or limited-capacity… Read More

Apple iPhone

In 2007, Apple Inc. unveiled the iPhone, a groundbreaking mobile device that seamlessly integrated computing and communication, effectively ushering in the smartphone era. This revolutionary product was introduced by Steve Jobs on January 9th at the Macworld Conference & Expo, and it quickly captured the world’s imagination. The iPhone combined… Read More

Steve jobs holding an iPad during an Apple Live Event on 27 January 2010

In 2010, Apple introduced the iPad, a revolutionary device that significantly influenced and popularized the tablet computer market. Before its release, the concept of a tablet computer had been explored by various companies, but none had managed to capture the public’s imagination or achieve substantial commercial success. The iPad’s launch… Read More

The Chromebook Pixel from Google

In 2011, Google introduced the Chromebook, a revolutionary device that brought cloud-based computing into the mainstream. This innovative laptop, powered by Chrome OS, marked a significant shift from traditional computing paradigms by relying heavily on internet connectivity and cloud storage. The Chromebook was designed with simplicity and efficiency, offering an… Read More

HoloLens, Geste "Blüte" (Bloom). Präsentator: Andreas Erben

In 2015, Microsoft made a significant leap in technology by introducing the HoloLens, unveiling the mixed reality concept to the world. This groundbreaking device heralded a new era where digital content could be seamlessly blended with the real world, offering an immersive experience beyond virtual reality’s capabilities. Unlike virtual reality,… Read More

A Bristlecone chip being installed by Research Scientist Marissa Giustina at the Quantum AI Lab in Santa Barbara

In 2018, Google marked a significant milestone in quantum computing with the announcement of Bristlecone, a 72-qubit quantum computer. This announcement represented a pivotal moment in the advancement of quantum technology, setting a new benchmark for computational power and potential. Bristlecone’s development underscored Google’s commitment to pushing the boundaries of… Read More

Amazon Alexa display booth

The year 2020 marked a significant turning point in integrating Artificial Intelligence (AI) and Machine Learning (ML) into consumer and enterprise technology. This era witnessed the acceleration of these technologies being embedded into everyday applications, enhancing efficiency, productivity, and user experience across various sectors. Consumer technology saw an unprecedented adoption… Read More

Let’s explore

Interested in more computer history.

How about the History of Computers song?

September 1, 2009

11 min read

The Origin of Computing

The information age began with the realization that machines could emulate the power of minds

By Martin Campbell-Kelly

In the standard story, the computer’s evolution has been brisk and short. It starts with the giant machines warehoused in World War II–era laboratories. Microchips shrink them onto desktops, Moore’s Law predicts how powerful they will become, and Microsoft capitalizes on the software. Eventually small, inexpensive devices appear that can trade stocks and beam video around the world. That is one way to approach the history of computing—the history of solid-state electronics in the past 60 years.

But computing existed long before the transistor. Ancient astronomers developed ways to predict the motion of the heavenly bodies. The Greeks deduced the shape and size of Earth. Taxes were summed; distances mapped. Always, though, computing was a human pursuit. It was arithmetic, a skill like reading or writing that helped a person make sense of the world.

The age of computing sprang from the abandonment of this limitation. Adding machines and cash registers came first, but equally critical was the quest to organize mathematical computations using what we now call “programs.” The idea of a program first arose in the 1830s, a century before what we traditionally think of as the birth of the computer. Later, the modern electronic computers that came out of World War II gave rise to the notion of the universal computer—a machine capable of any kind of information processing, even including the manipulation of its own programs. These are the computers that power our world today. Yet even as computer technology has matured to the point where it is omnipresent and seemingly limitless, researchers are attempting to use fresh insights from the mind, biological systems and quantum physics to build wholly new types of machines.

On supporting science journalism

If you're enjoying this article, consider supporting our award-winning journalism by subscribing . By purchasing a subscription you are helping to ensure the future of impactful stories about the discoveries and ideas shaping our world today.

The Difference Engine In 1790, shortly after the start of the French Revolution, the government decided that the republic required a new set of maps to establish a fair system of property taxation.* He also ordered a switch from the old imperial system of measurements to the new metric system. To facilitate all the conversions, the French ordinance survey office began to compute an exhaustive collection of mathematical tables.

In the 18th century, however, computations were done by hand. A factory floor of between 60 and 80 human computers added and subtracted sums to fill in line after line of the tables for the survey’s Tables du Cadastre project. It was grunt work, demanding no special skills above basic numeracy and literacy. In fact, most computers were hairdressers who had lost their jobs—aristocratic hairstyles being the sort of thing that could endanger one’s neck in revolutionary France.

The project took about 10 years to complete, but by then, the war-torn republic did not have the funds necessary to publish the work. The manuscript languished in the Académie des Sciences for decades. Then, in 1819, a promising young British scientist named Charles Babbage would view it on a visit to Paris. Babbage was 28 at the time; three years earlier he had been elected to the Royal Society, the most prominent scientific organization in Britain. He was also very knowledgeable about the world of human computers—at various times he personally supervised the construction of astronomical and actuarial tables.

On his return to England, Babbage decided he would replicate the French project not with human computers but with machinery. England at the time was in the throes of the Industrial Revolution. Jobs that had been done by human or animal labor were falling to the efficiency of the machine. Babbage saw the power of this world of steam and brawn, of interchangeable parts and mechanization, and realized that it could replace not just muscle but the work of minds.

He proposed the construction of his Calculating Engine in 1822 and secured government funding in 1824. For the next decade he immersed himself in the world of manufacturing, seeking the best technologies with which to construct his engine.

In 1833 Babbage celebrated his annus mirabilis. That year he not only produced a functioning model of his calculating machine (which he called the Difference Engine) but also published his classic Economy of Machinery and Manufactures, establishing his reputation as the world’s leading industrial economist. He held Saturday evening soirees at his home in Devonshire Street in London, which were attended by the front rank of society. At these gatherings the model Difference Engine was placed on display as a conversation piece.

A year later Babbage abandoned the Difference Engine for a much grander vision that he called the Analytical Engine. Whereas the Difference Engine had been limited to the single task of table making, the Analytical Engine would be capable of any mathematical calculation. Like a modern computer, it would have a processor that performed arithmetic (the “mill”), memory to hold numbers (the “store”), and the ability to alter its function via user input, in this case by punched cards. In short, it was a computer conceived in Victorian technology .

Babbage’s decision to abandon the Difference Engine for the Analytical Engine was not well received, however, and the government demurred to supply him with additional funds. Undeterred, he produced thousands of pages of detailed notes and machine drawings in the hope that the government would one day fund construction. It was not until the 1970s, well into the computer age, that modern scholars studied these papers for the first time. The Analytical Engine was, as one of those scholars remarked, almost like looking at a modern computer designed on another planet.

The Dark Ages

Babbage’s vision, in essence, was digital computing. Like today’s devices, such machines manipulate numbers (or digits) according to a set of instructions and produce a precise numerical result.

Yet after Babbage’s failure, computation entered what English mathematician L. J. Comrie called the Dark Age of digital computing—a period that lasted into World War II. During this time, computation was done primarily with so-called analog computers, machines that model a system using a mechanical analog. Suppose, for example, an astronomer would like to predict the time of an event such as a solar eclipse. To do this digitally, she would numerically solve Kepler’s laws of motion. She could also create an analog computer, a model solar system made of gears and levers (or a simple electronic circuit) that would allow her to “run” time into the future.

Before World War II, the most sophisticated practical analog computing instrument was the Differential Analyzer, developed by Vannevar Bush at the Massachusetts Institute of Technology in 1929. At that time, the U.S. was investing heavily in rural electrification, and Bush was investigating electrical transmission. Such problems could be encoded in ordinary differential equations, but these were very time-consuming to solve. The Differential Analyzer allowed for an approximate solution without any numerical processing. The machine was physically quite large—it filled a good-size laboratory—and was something of a Rube Goldberg construction of gears and rotating shafts. To “program” the machine, technicians connected the various subunits of the device using screwdrivers, spanners and lead hammers. Though laborious to set up, once done the apparatus could solve in minutes equations that would take several days by hand. A dozen copies of the machine were built in the U.S. and England.

One of these copies made its way to the U.S. Army’s Aberdeen Proving Ground in Maryland, the facility responsible for readying field weapons for deployment. To aim artillery at a target of known range, soldiers had to set the vertical and horizontal angles (the elevation and azimuth) of the barrel so that the fired shell would follow the desired parabolic trajectory—soaring skyward before dropping onto the target. They selected the angles out of a firing table that contained numerous entries for various target distances and geographic conditions.

Every entry in the firing table required the integration of an ordinary differential equation. An on-site team of 200 computers would take two to three days to do each calculation by hand. The Differential Analyzer, in contrast, would need only about 20 minutes.

Everything is Change

On December 7, 1941, Japanese forces attacked the U.S. Naval base at Pearl Harbor. The U.S. was at war. Mobilization meant the army needed ever more firing tables, each of which contained about 3,000 entries. Even with the Differential Analyzer, the backlog of calculations at Aberdeen was mounting.

Eighty miles up the road from Aberdeen, the Moore School of Electrical Engineering at the University of Pennsylvania had its own differential analyzer. In the spring of 1942 a 35-year-old instructor at the school named John W. Mauchly had an idea for how to speed up calculations: construct an “electronic computor” [ sic ] that would use vacuum tubes in place of the mechanical components. Mauchly, a bespectacled, theoretically-minded individual, probably would not have been able to build the machine on his own. But he found his complement in an energetic young researcher at the school named J. Presper (“Pres”) Eckert, who had already showed sparks of engineering genius.

A year after Mauchly made his original proposal, following various accidental and bureaucratic delays, it found its way to Lieutenant Herman Goldstine, a 30-year-old Ph.D. in mathematics from the University of Chicago who was the technical liaison officer between Aberdeen and the Moore School. Within days Goldstine got the go-ahead for the project. Construction of the ENIAC—for Electronic Numerical Integrator and Computer—began on April 9, 1943. It was Eckert’s 23rd birthday.

Many engineers had serious doubts about whether the ENIAC would ever be successful. Conventional wisdom held that the life of a vacuum tube was about 3,000 hours, and the ENIAC’s initial design called for 5,000 tubes. At that failure rate, the machine would not function for more than a few minutes before a broken tube put it out of action. Eckert, however, understood that the tubes tended to fail under the stress of being turned on or off. He knew it was for that reason radio stations never turned off their transmission tubes. If tubes were operated significantly below their rated voltage, they would last longer still. (The total number of tubes would grow to 18,000 by the time the machine was complete.)

Eckert and his team completed the ENIAC in two and a half years. The finished machine was an engineering tour de force, a 30-ton behemoth that consumed 150 kilowatts of power. The machine could perform 5,000 additions per second and compute a trajectory in less time than a shell took to reach its real-life target. It was also a prime example of the role that serendipity often plays in invention: although the Moore School was not then a leading computing research facility, it happened to be in the right location at the right time with the right people.

Yet the ENIAC was finished in 1945, too late to help in the war effort. It was also limited in its capabilities. It could store only up to 20 numbers at a time. Programming the machine took days and required manipulating a patchwork of cables that resembled the inside of a busy telephone exchange. Moreover, the ENIAC was designed to solve ordinary differential equations. Some challenges—notably, the calculations required for the Manhattan Project—required the solution of partial differential equations.

John von Neumann was a consultant to the Manhattan Project when he learned of the ENIAC on a visit to Aberdeen in the summer of 1944. Born in 1903 into a wealthy Hungarian banking family, von Neumann was a mathematical prodigy who tore through his education. By 23 he had become the youngest ever privatdocent (the approximate equivalent of an associate professor) at the University of Berlin. In 1930 he emigrated to the U.S., where he joined Albert Einstein and Kurt Gödel as one of first faculty members of the Institute for Advanced Study in Princeton, N.J. He became a naturalized U.S. citizen in 1937.

Von Neumann quickly recognized the power of the machine’s computation, and in the several months after his visit to Aberdeen, he joined in meetings with Eckert, Mauchly, Goldstine and Arthur Burks—another Moore School instructor—to hammer out the design of a successor machine, the Electronic Discrete Variable Automatic Computer, or EDVAC.

The EDVAC was a huge improvement over the ENIAC. Von Neumann introduced the ideas and nomenclature of Warren McCulloch and Walter Pitts, neuroscientists who had developed a theory of the logical operations of the brain (this is where we get the term computer “memory”). He thought of the machine as being made of five core parts: Memory held not just numerical data but also the instructions for operation. An arithmetic unit performed arithmetic options. An input “organ” enabled the transfer of programs and data into memory, and an output organ recorded the results of computation. Finally, a control unit coordinated the entire system.

This layout, or architecture, makes it possible to change the computer’s program without altering the physical structure of the machine. Programs were held in memory and could be modified in a trice. Moreover, a program could manipulate its own instructions. This feature would not only enable von Neumann to solve his partial differential equations, it would confer a powerful flexibility that forms the very heart of modern computer science.

In June 1945 von Neumann wrote his classic First Draft of a Report on the EDVAC on behalf of the group. In spite of its unfinished status, it was rapidly circulated among the computing cognoscenti with two consequences. First, there never was a second draft. Second, von Neumann ended up with most of the credit for the invention.

Machine Evolution

The subsequent 60-year diffusion of the computer within society is a long story that has to be told in another place. Perhaps the single most remarkable development was that the computer—originally designed for mathematical calculations—turned out, with the right software, to be infinitely adaptable to different uses, from business data processing to personal computing to the construction of a global information network.

We can think of computer development as having taken place along three vectors—hardware, software and architecture. The improvements in hardware over the past 50 years are legendary. Bulky electronic tubes gave way in the late 1950s to “discrete” transistors—that is, single transistors individually soldered into place. In the mid-1960s microcircuits connected several transistors—then hundreds of transistors, then thousands of transistors—on a silicon “chip.” The microprocessor, developed in the early 1970s, held a complete computer processing unit on a chip. The microprocessor gave rise to the PC and now controls devices ranging from sprinkler systems to ballistic missiles.

The challenges of software were more subtle. In 1947 and 1948 von Neumann and Goldstine produced a series of reports called Planning and Coding of Problems for an Electronic Computing Instrument . In these reports they set down dozens of routines for mathematical computation with the expectation that some lowly “coder” would be able to effortlessly convert them into working programs. It was not to be. The process of writing programs and getting them to work was excruciatingly difficult. The first to make this discovery was Maurice Wilkes, the University of Cambridge computer scientist who had created the first practical stored-program computer. In his Memoirs, Wilkes ruefully recalled the very moment in 1949 when “the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding the errors in my own programs.”

He and others at Cambridge developed a method of writing computer instructions in a symbolic form that made the whole job easier and less error prone. The computer would take this symbolic language and then convert it into binary. IBM introduced the programming language Fortran in 1957, which greatly simplified the writing of scientific and mathematical programs. At Dartmouth College in 1964, educator John G. Kemeny and computer scientist Thomas E. Kurtz invented Basic, a simple but mighty programming language intended to democratize computing and bring it to the entire undergraduate population. With Basic even schoolkids—the young Bill Gates among them—could begin to write their own programs.

In contrast, computer architecture—that is, the logical arrangement of subsystems that make up a computer—has barely evolved. Nearly every machine in use today shares its basic architecture with the stored program computer of 1945. The situation mirrors that of the gasoline-powered automobile—the years have seen many technical refinements and efficiency improvements in both, but the basic design is largely the same. And although it is certainly possible to design a radically better device, both have achieved what historians of technology call “closure.” Investments over the decades have produced such excellent gains that no one has had a compelling reason to invest in an alternative.

Yet there are multiple possibilities for radical evolution. For example, in the 1980s interest ran high in so-called massively parallel machines, which contained thousands of computing elements operating simultaneously, designed for computationally intensive tasks such as weather forecasting and atomic weapons research. Computer scientists have also looked to the human brain for inspiration. We now know that the brain is not a general-purpose computer made from gray matter. Rather it contains specialized processing centers for different tasks, such as face recognition or speech understanding. Scientists are harnessing these ideas in “neural networks” for applications such as automobile license plate identification and iris recognition. They could be the next step in a centuries-old process: embedding the powers of the mind in the guts of a machine.

*Erratum (10/15/09): This sentence has been edited since posting to correct a factual error.

History Of Computers With Timeline [2023 Update]

History Of Computers And Computer Science

It’s important to know the history of computers in order to have a good understanding of the field. Computers are one of the most important inventions in human history. Given how fast technology is evolving, you might not expect the history of computers to go back thousands of years. However, that’s exactly the case. But before we go back that far, let’s first understand what a computer actually is.

The First Computers In History

What is a computer.

A computer is simply a machine that follows a set of instructions in order to execute sequences of logical or arithmetic functions. However, when we think of modern computers , we don’t see them as just calculators performing functions. Yet, that’s exactly what they are at their core.

Every time you make a purchase on Amazon or post a picture on Instagram, your computer is executing instructions and processes a massive amount of binary. However, when we consider the definition of a computer, we realize that the history of computers goes far back.

When Was The First Computer Invented?

The history of computers goes back thousands of years with the first one being the abacus . In fact, the earliest abacus, referred to as the Sumerian abacus, dates back to roughly 2700 B.C. from the Mesopotamia region. However, Charles Babbage, the English mathematician and inventor is known as the “Father of Computers.” He created a steam-powered computer known as the Analytical Engine in 1837 which kickstarted computer history.

Digital Vs. Analog Computers

The very first computer, the abacus, is a digital computer because it deals in digits. Today’s computers are also digital because they compute everything using binary: 0’s and 1’s. However, most of the computers between the time of the abacus and modern transistor-based computers were in fact analog computers.

Analog computers, rather than calculating single digits, deal with more complex mathematics and functions. Rather than 1’s and 0’s, analog computers are more often represented by continuously varying quantities. The earliest analog computer, the Antikythera mechanism , is over 2000 years old. These ancient computers paved the way for modern transistor-based computers.

Brief History of Computers

The history of computers goes back as far as 2500 B.C. with the abacus. However, the modern history of computers begins with the Analytical Engine, a steam-powered computer designed in 1837 by English mathematician and “Father of Computers,” Charles Babbage. Yet, the invention of the transistor in 1947, the integrated circuit in 1958, and the microprocessor in 1971 are what made computers much smaller and faster.

In fact, the first personal computer was invented in 1971, the same year as the microprocessor. Then, the first laptop, the Osborne-1 was created a decade later in 1981. Apple and IBM joined the personal computer industry shortly thereafter, popularizing the home PC. Then, when the world wide web came online in 1989, which would eventually serve to connect nearly the whole world.

The 1990s was a booming decade for computer history. IBM produced the first smartphone in 1992 and the first smartwatch was released in 1998. Also, the first-ever quantum computer in history was up and functioning in 1998, if only for a few nanoseconds.

Turn of the Century Computers

The 2000s are the years of social media: the rise and fall of MySpace at the forefront. Facebook took off shortly after and would become one of the popular apps on the iPhone, which was first presented by the legend, Steve Jobs in 2007. It was a pocket-sized computer that was capable of greater computation than the computer which brought mankind to the Moon. The iPad would be released three years later in 2010.

The 2010s seem to have been the decade of Artificial Intelligence and Quantum Computing. Tesla AI-powered self-driving vehicles have made incredible progress toward full autonomy. An AI robot named Sophia was created in 2016 and even gained citizenship in Saudi Arabia in 2017. The world’s first reprogrammable quantum computer was created in 2016, bringing us closer to quantum supremacy.

Timeline Of Computer History

The first digital computer.

2700 B.C: The first digital computer, the Abacus is invented and used around the area of Mesopotamia. Yet, later iterations of the abacus appear in Egypt, Greece, and China, where they’re continually used for hundreds of years. The first abaci were likely used for addition and subtraction which must have been revolutionary for the time. However, the following iterations allowed for more complex calculations.

The First Analog Computer

200 B.C: The first analog computer, the Antikythera mechanism, is created. The Antikythera mechanism was found off the coast of the Greek island of Kythira from which the computer received its name. This find actually baffled most scientists because a computer this advanced wasn’t supposed to exist this long ago. This mechanical analog computer was used by ancient sailors to determine their position in the sea, based on their astrological position.

Binary Number System

1703: Gottfried Wilhelm Leibniz developed the binary number system which is at the heart of modern computing. The binary number system is a way to convert a series of 0’s and 1’s into other numbers, letters, and characters. Everything we see on screen and interact with on our computers is converted into binary before the computer can process it. The magic of present-day computers is that they process binary extremely quickly.

First Programmable Loom

1801: Joseph Jacquard creates a punch-card programmable loom which greatly simplified the weaving process. This allowed those with fewer skills to weave more complicated patterns. However, many didn’t like the idea of simplifying and automating the process as it would displace weaving jobs at the time. Yet, technology persisted and the textile industry would eventually change for the better because of it.

First Steam-Driven Computer

1837: Charles Babbage designed the groundbreaking Analytical Engine . The analytical engine was the first major step toward modern computers. Although it was never actually built, its design embodied the major characteristics of modern computers. This included memory, a central processing unit, and the ability for input and output. Charles Babbage is commonly referred to as the “Father of Computers” for his work.

First Computer Algorithm

1843: Ada Lovelace , the daughter of Lord Byron, worked alongside Charles Babbage to design the analytical engine. However, shortly afterward, she developed the first-ever computer algorithm. She carefully considered what computers were capable of when developing her algorithm. The result was a solution to Bernoulli numbers , a significant mathematical advancement.

First U.S. Census Calculator

1890: Herman Hollerith created a tabulating machine to help calculate the U.S. census. The previous decade’s census took eight years to calculate but with the help of Hollerith’s tabulating machine, it took only six years. With the success of his tabulator, Hollerith then began his own company, the Hollerith Electrical Tabulating System. He applied this same technology to the areas of accounting and inventory.

The Turing Machine

1936: Alan Turing invented the Turing Machine and pushed the limits of what a computer could do at the time. A Turing Machine consists of a tape divided by squares that can contain a single digit, often binary digits, or nothing at all. It also consisted of a machine that could read each digit on the tape and change it. This might not sound like much, but computers to this day emulate this functionality of reading simple binary input and computing a logical output. This relatively simple machine enables the computation of any algorithm.

Turing set the standard for computers regarding them as “ Turing complete ” if they met the standards for simulating a Turing machine. Today’s computers are Turing complete because they simulate the same functionality of Turing machines, however with a much greater processing ability.

The Complex Number Calculator

1940: George Stibitz created the Complex Number Calculator for Bell Labs. It consisted of relays that could recognize the difference between ‘0’ and ‘1’ and therefore, could use binary as the base number system. The final version of the Complex Number Calculator used more than 400 relays and took about two years to create.

First Automatic Computer

1941: Konrad Zuse, a German Computer Scientist, invented the Z3 computer . Zuse’s Z3 was the first programmable fully automatic computer in history. It was much larger than the Complex number calculator and contained more than 2,500 relays. Since the Z3 computer didn’t demonstrate any advantage to the Germans during world war II, the government didn’t provide any funding for it and it was eventually destroyed in the war.

First Electric Digital Computer

1942: Professor John Vincent Atanasoff invented the Atanasoff-Berry Computer (ABC). The ABC was the first automatic electric digital computer in history. It contained over 300 vacuum tubes and solved linear equations but it was not programmable or Turing complete. However, the Atanasoff-Berry Computer will forever hold a place in Computer history.

First Programmable Electronic Digital Computer

1944: British engineer Tommy Flowers and assistants completed the code-breaking Colossus which assisted in decrypting German messages during world war II. It’s held as the first programmable electronic digital computer in history. The Colossus contained more than 1,600 vacuum tubes and thermionic valves in the prototype and 2,400 in the second version, the Mark 2 Colossus.

First General-Purpose Digital Computer

1945: ENIAC (Electronic Numerical Integrator and Computer) is completed by professors John Mauchly and J. Presper Eckert. ENIAC was absolutely massive, consisting of more than 17,000 vacuum tubes, 70,000 resistors, and 10,000 capacitors, filling a 30′ x 50′ room and weighing around 60,000 pounds. It was the first general-purpose digital computer in history and was extremely capable of a computer at the time. It’s said that for the first decade that ENIAC was in operation, it completed more calculations than in all of history previously.

First Computer Transistor

1947: William Shockley of Bell Labs invented the first transistor and drastically changed the course of computing history. The transistor replaced the common vacuum tube which allowed computers to be much more efficient while still greatly reducing their size and energy requirements.

First General-Purpose Commercial Computer

1951: Professors John Mauchly and J. Presper Eckert built UNIVAC (Universal Automatic Computer), the first general-purpose commercial computer in history. The early UNIVAC models utilized 5,000 vacuum tubes but later models in the series adopted transistors. It was a massive computer weighing around 16,000 pounds. However, the massive size allowed for more than 1,000 computations per second.

First Computer Programming Language

1954: A team at IBM led by John Backus created the first commercially available general-purpose computer programming language, FORTRAN. FORTRAN stands for Formula Translation and is still used today. When the language first appeared, however, there were bugs and inefficiencies which led people to speculate on the commercial usability of FORTRAN. Yet, the bugs were worked out many of the programming languages that came after were inspired by FORTRAN.

First Computer Operating System

1956: The first computer operating system in history was released in 1956 and produced by General Motors, called the GM-NAA I/O . It was created by Robert L. Patrick and allowed for direct input and output, hence the name. It also allowed for batch processing: the ability to execute a new program automatically after the current one finishes.

First Integrated Circuit

1958: Jack Kilby and Robert Noyce create the first integrated circuit , commonly known as a microchip. An integrated circuit consists of electronic circuits mounted onto a semiconductor. The most common semiconductor medium is silicon, which is where the name ‘ Silicon Valley ‘ comes from. If not for the integrated circuit, computers would still be the size of a refrigerator, rather than the size of a credit card.

First Supercomputer

1964: History’s first supercomputer, known as the CDC 6600 , was developed by Control Data Corp. It consisted of 400,000 transistors, 100 miles of wiring, and used Freon for internal cooling. Thus, the CDC 6600 was able to reach a processing speed of up to 3 million floating-point operations per second (3 megaFLOPS). Amazingly, this supercomputer was ten times faster than the fastest computer at the time and cost a whopping $8 million.

First Computer Mouse

1964: Douglas Engelbart invented the first computer mouse in history but it wouldn’t accompany the first Apple Macintosh until 1984. The computer mouse allowed for additional control of the computer in conjunction with the keyboard. These two input devices have been the primary source of user input ever since. However, voice commands from present-day smart devices are increasingly becoming the norm.

First Wide Area Computer Network

1969: DARPA created the first Wide Area Network in the history of computers called ARPAnet which was a precursor to the internet . It allowed computers to connect to a central hub and interact in nearly real time. The term “internet” wouldn’t come around until 1973 when computers in Norway and England connect to ARPAnet. Although the internet has continued to advance through the decades, many of the same protocols from ARPAnet are still standards today.

First Personal Computer

1971: The first personal computer in history, the Kenbak-1 , is created by John Blankenbaker, and sold for only $750. However, only around 40 of these computers were ever sold. As small as it was, it was able to execute hundreds of calculations in a single second. Blankenbaker had the idea for the personal computer for more than two decades before completing his first one.

First Computer Microprocessor

1971: Intel releases the first microprocessor in the history of computers, the Intel 4004 . This tiny microprocessor had the same computing power as the ENIAC computer and was the size of an entire room. Even by today’s standards, the Intel 4004 is a small microprocessor, housed on a 2-inch wafer as opposed to today’s 12-inch wafers. That said, the initial model had only 2,300 transistors while it’s not uncommon for today’s microprocessors to have several hundred million transistors.

First Apple Computer

1976: Apple takes the stage and releases its first computer: the Apple-1 . The Apple-1 was different from other computers at the time. It came fully assembled and on a single motherboard. It sold for nearly $700 and had only 4 KB of memory, which is almost laughable compared to today’s standards. However, that was plenty of memory for the applications at the time.

First IBM Personal Computer

1981: IBM launches its first personal computer, the IBM Model-5150 . It only took a year to develop and cost $1,600. However, that was a steep drop from other IBM computers before this that sold for several million dollars. The IBM Model-5150 had only 16 KB of RAM when it was first released, but eventually increased to up to 640 KB maximum RAM.

First Laptop Computer

1981: The first laptop in the history of computers, the Osborne 1 , was released by the Osborne Computer Corporation. It had an incredibly small 5-inch display screen, a bulky fold-out keyboard, 64 KB of main memory, and weighed 24 pounds. Not surprisingly, the Osborne 1 was actually very popular, selling more than 125,000 units in 1982 alone. The going rate for an Osborne 1 was $1,795.

First Windows Operating System

1985: Microsoft released its first version of the Windows operating system, Windows 1.0 . What made Windows 1.0 remarkable was its reliance on the computer mouse which wasn’t standard yet. It even included a game, Reversi, to help users become accustomed to the new input device. Love it or hate it, the Windows 1.0 operating system and its subsequent versions have become commonplace among computers ever since its creation. The development of the original Windows OS was led by none other than Bill Gates himself.

World Wide Web Is Created

1989: The World Wide Web is created by Sir Tim Berners-Lee of CERN . When it was first created, it wasn’t intended to grow into a massive platform that would connect the average person. Rather, it was originally just intended to easily share information between scientists and universities. The first website in the history of computers was actually just a guide to using the world wide web.

First Flash-Based Solid State Drive

1991: The first flash-based solid-state drive was created by SanDisk (at the time it was called SunDisk). These drives presented an alternative option to hard drives and would prove to be very useful in computers, cell phones, and similar devices. This first flash-based SSD had 20 MB of memory and sold for approximately $1,000.

First Smartphone Is Created

1992: IBM created the first-ever smartphone in history, the IBM Simon , which was released two years later in 1994. It was a far cry from the smartphones we’re used to today. However, at the time, IBM Simon was a game-changer. It sold for $1,100 when it was first released and even had a touchscreen and several applications including mail, a calendar, a to-do list, and a few more.

First Platform Independent Language

1995: Sun Microsystems releases the first iteration of the Java programming language . Java was the first computer programming language in history to be platform-independent, popularizing the phrase: “Write once, run anywhere.” Unlike other computer programming languages at the time, a program written with Java could run on any device with the Java Development Kit (JDK).

First Smartwatch Is Released

1998: The first-ever smartwatch , the Ruputer, was released by the watch company Seiko. If you look at the original Ruputer, you’ll see that it really doesn’t look much different than present-day smartwatches with the exception of a better display and minor styling changes. As it wasn’t a touchscreen, a small joystick assisted with navigating the various feature of the watch.

First Quantum Computer

1998: After decades of theory, the first quantum computer is created by three computer scientists. It was only 2 qubits, as opposed to the 16 qubit reprogrammable quantum computers of recent. This first quantum computer didn’t solve any significant problem, as it wasn’t incredibly efficient. In fact, it ran for only a few nanoseconds. However, it was a proof of concept that paved the way for today’s quantum computers.

First USB Flash Drive

2000: The first USB Flash drive in computer history, the ThumbDrive , is released by Trek, a company out of Singapore. However, there were other flash drives that hit the market almost immediately after, such as I.B.M.’s DiskOnKey, a 1.44 MB flash drive. This led to some speculation as to who was actually first. However, as evidenced by the patent application back in 1999, and the fact that Trek’s ThumbDrive made it to market first, the debate was shortly settled.

DARPA Centibots Project

2002: DARPA launched the Centibots project in which they developed 100 identical robots that could work together and communicate with each other. The Centibots could survey an area and build a map of it in real time. Additionally, these robots could identify objects including their companion robots and people, and distinguish between the two. The maps that they make are incredibly accurate. In total, the Centibots project cost around $2.2 million to complete.

MySpace Comes And Goes

2004: MySpace gained over 1 million users within the first month of its official launch and 22 million users just a year later. Soon after in 2005, it was purchased by News Corp for $580 million. However, shortly after the sale of MySpace, it was fraught with scandal after scandal. MySpace helped to popularize social media, with Facebook trailing right behind it, passing MySpace in users in 2008. It eventually laid off about 50% of its workforce in 2011.

Arduino Is Released

2005: Italian designer Massimo Banzi released the Arduino , a credit card-sized development board. The Arduino was intended to help design students who didn’t have any previous exposure to programming and electronics but eventually became a beloved tool for tech hobbyists worldwide. To this day, Arduino boards are increasingly a part of electronics education including self-education.

iPhone Generation-1 Released

2007: Steve Jobs of Apple released the first-ever iPhone , revolutionizing the smartphone industry. The screen was 50% bigger than the popular smartphones of the time, such as the beloved Blackberry and Treo. It also had a much longer-lasting battery. Additionally, the iPhone normalized web browsing and video playback on phones, setting a new standard across the industry. The cost was what you could expect from an iPhone, selling at around $600, more than twice as much as its competitors.

Apple’s iPad Is Released

2010: Only three years after the iPhone is released, Steve Jobs announces the first-ever iPad , Apple’s first tablet computer. It came with a 9.7-inch touchscreen and options for either 16GB, 32GB, or 64GB. The beauty of the iPad is that it was basically a large iPhone, as it ran on the same iOS and offered the same functionality. The original iPad started at $499 with the 64GB Wi-Fi + 3G version selling for $829.

Nest Thermostat Is Released

2011: The Nest Thermostat , a smart thermostat created by Nest Labs, is released as a growing number of household devices make up the “internet of things.” When the Nest Thermostat was first released, not only did it make thermostats smart, it made them beautiful. For only $250 you could buy a thermostat that decreased your energy bill, improved the aesthetic of your home, and is controlled by your phone.

First Raspberry Pi Computer

2012: The first Raspberry Pi computer is released, opening up a world of possibilities for creative coders. These small yet capable computers cost around 25$-$35 when first released and were as small as a credit card. Raspberry Pi’s were similar to the Arduino in size but differed greatly in their capability. The Raspberry Pi is several times faster than the Arduino and has over 100,000 times more memory.

Tesla Introduces Autopilot

2014: Tesla’s Elon Musk introduces the first self-driving features in its fleet of automobiles dubbed: Autopilot. The future of automobiles chauffeuring their passengers with no input from a driver is finally within sight. The first feature of autopilot included not only camera systems but also radar and sonar in order to detect everything within the car’s surroundings. It also included a self-park feature and even a summoning feature that calls the vehicle to you. The computers and technology within Tesla vehicles have essentially turned them into the first advanced personal transportation robots in history.

Sophia The Robot Is Created

2016: Sophia, the artificially intelligent humanoid robot, was created by former Disney Imagineer David Hanson. A year after her creation, Sophia gained citizenship in Saudi Arabia, becoming the first robot in history to gain citizenship. Since she was created, Sophia was taken part in many interviews and even debates. She’s quite a wonder to watch!

First Reprogrammable Quantum Computer

2016: Quantum Computers have made considerable progress and the first reprogrammable quantum computer is finally complete. It’s made up of 5 singular atoms that act as switches. These switches are activated by a laser beam that controls the state of the qubit. This leap has brought us very close to quantum supremacy.

First Brain-Computer Interface

2019: Elon Musk announces Neuralink’s progress of their brain-machine interface that would lend humans the same information processing abilities that computers have while linking to Artificial Intelligence. In this announcement, Neuralink revealed that they had already successfully tested their technology on mice and apes.

Tesla Nears Fully Autonomous Vehicles

2020: In July, Elon Musk declared that a Tesla autopilot update is coming later this year that will bring their vehicles one step closer to complete “level-5” autonomy . Level-5 autonomy would finally allow passengers to reach their destination without any human intervention. The long-awaited software update would likely increase the company’s value massively and Musk’s net worth along with it.

Elon Musk announces Tesla Bot, becomes Time’s Person of the Year

2021: Musk continues to innovate, announcing in August that Tesla is developing a near-life-size humanoid robot. Many are skeptical of the viability of the robot while others claimed this is another of Musk’s inventions that science fiction warned against similar to the Brain-Computer Interface.

Regardless of any opinions, Musk still had a stellar year. Starship has made progress, Tesla sales are on the rise, and Musk managed to earn the title of Time’s Person of the Year. All the while becoming the most wealthy person on the planet, with a net worth exceeding $250 million.

Facebook changes name to Meta, Zuck announces Metaverse

2021: In October, Mark Zuckerberg made a bold and controversial, yet possibly visionary announcement that Facebook would change its name to Meta. Additionally, he explained the new immersive Virtual Reality world they’re creating that would be built on top of the existing social network, dubbed the Metaverse. Zuckerberg elaborated that the technology for the immersive experience he envisions is mostly here but mainstream adoption is still 5 to 10 years out . However, when that time comes, your imagination will be your only limitation within the confines of the Metaverse.

IBM’s “Eagle” Quantum Computer Chip (127 Qubits)

2021: IBM continues to lead the charge in quantum computer development and in November, they showcased their new “Eagle” chip . This is currently the most cutting-edge quantum chip in existence, packing 127 qubits, making it the first to reach over 100 qubits. IBM plans to create a new chip more than three times more powerful than the “Eagle” by next year, 2022.

OpenAI Releases DALL-E 2

2022: DALL-E, developed by OpenAI, is capable of generating high-quality images from textual descriptions. It uses a combination of deep learning techniques and a large database of images to generate new and unique images based on textual input. DALL-E 2, launched in April 2022, is the second generation of this language model that is trained on hundreds of millions of images and is almost magical in its production of high-quality images in a matter of seconds.

IBM’s “Osprey” Quantum Computer Chip (433 Qubits)

2022: In only a year, IBM has nearly tripled the quantum capacity of its previous “Eagle” chip. The new “Osprey” quantum chip, announced in November, greatly surpasses its predecessor. The IBM Osprey quantum computer chip represents a major advancement in quantum computing technology and is expected to pave the way for even more powerful quantum computers in the future.

ChatGPT Released Upon The World

2022: ChatGPT is launched on November 30 and took the world by storm, amassing over 1 million users in only 5 days! Currently, ChatGPT is powered by GPT3.x, which is the latest version of the AI software. The development of ChatGPT was a significant milestone in the field of natural language processing, as it represented a significant improvement in the ability of machines to understand and generate human language.

The Much Hyped GPT4 Is Finally Released

2023: ChatGPT and many other AI apps run on GPT. As powerful as the GPT3.x was, GPT4 is trained on a much more massive data set and is far more accurate and better at understanding the intentions of the user’s prompts. It’s a giant leap forward from the previous version, causing both excitement and concern from the general public as well as tech powerhouses such as Elon Musk who wants to slow the advancement of AI.

Tim Statler

Tim Statler is a Computer Science student at Governors State University and the creator of Comp Sci Central. He lives in Crete, IL with his wife, Stefanie, and their cats, Beyoncé and Monte. When he's not studying or writing for Comp Sci Central, he's probably just hanging out or making some delicious food.

Recent Posts

Programming Language Levels (Lowest to Highest)

When learning to code, one of the first things I was curious about was the difference in programming language levels. I recently did a deep dive into these different levels and put together this...

Is Python a High-Level Language?

Python is my favorite programming language so I wanted to know, "Is Python a High-Level Language?" I did a little bit of research to find out for myself and here is what I learned. Is Python a...

SEP home page

  • Table of Contents
  • Random Entry
  • Chronological
  • Editorial Information
  • About the SEP
  • Editorial Board
  • How to Cite the SEP
  • Special Characters
  • Advanced Tools
  • Support the SEP
  • PDFs for SEP Friends
  • Make a Donation
  • SEPIA for Libraries
  • Entry Contents

Bibliography

Academic tools.

  • Friends PDF Preview
  • Author and Citation Info
  • Back to Top

The Modern History of Computing

Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term computing machine , used increasingly from the 1920s, refers to any machine that does the work of a human computer, i.e., any machine that calculates in accordance with effective methods. During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’, initially usually with the prefix ‘electronic’ or ‘digital’. This entry surveys the history of these machines.

  • Analog Computers

The Universal Turing Machine

Electromechanical versus electronic computation, turing's automatic computing engine, the manchester machine, eniac and edvac, other notable early computers, high-speed memory, other internet resources, related entries.

Charles Babbage was Lucasian Professor of Mathematics at Cambridge University from 1828 to 1839 (a post formerly held by Isaac Newton). Babbage's proposed Difference Engine was a special-purpose digital computing machine for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). The Difference Engine consisted entirely of mechanical components — brass gear wheels, rods, ratchets, pinions, etc. Numbers were represented in the decimal system by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model in 1822. He never completed the full-scale machine that he had designed but did complete several fragments. The largest — one ninth of the complete calculator — is on display in the London Science Museum. Babbage used it to perform serious computational work, calculating various mathematical tables. In 1990, Babbage's Difference Engine No. 2 was finally built from Babbage's designs and is also on display at the London Science Museum.

The Swedes Georg and Edvard Scheutz (father and son) constructed a modified version of Babbage's Difference Engine. Three were made, a prototype and two commercial models, one of these being sold to an observatory in Albany, New York, and the other to the Registrar-General's office in London, where it calculated and printed actuarial tables.

Babbage's proposed Analytical Engine, considerably more ambitious than the Difference Engine, was to have been a general-purpose mechanical digital computer. The Analytical Engine was to have had a memory store and a central processing unit (or ‘mill’) and would have been able to select from among alternative actions consequent upon the outcome of its previous actions (a facility nowadays known as conditional branching). The behaviour of the Analytical Engine would have been controlled by a program of instructions contained on punched cards connected together with ribbons (an idea that Babbage had adopted from the Jacquard weaving loom). Babbage emphasised the generality of the Analytical Engine, saying ‘the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine’ (Babbage [1994], p. 97).

Babbage worked closely with Ada Lovelace, daughter of the poet Byron, after whom the modern programming language ADA is named. Lovelace foresaw the possibility of using the Analytical Engine for non-numeric computation, suggesting that the Engine might even be capable of composing elaborate pieces of music.

A large model of the Analytical Engine was under construction at the time of Babbage's death in 1871 but a full-scale version was never built. Babbage's idea of a general-purpose calculating engine was never forgotten, especially at Cambridge, and was on occasion a lively topic of mealtime discussion at the war-time headquarters of the Government Code and Cypher School, Bletchley Park, Buckinghamshire, birthplace of the electronic digital computer.

Analog computers

The earliest computing machines in wide use were not digital but analog. In analog representation, properties of the representational medium ape (or reflect or model) properties of the represented state-of-affairs. (In obvious contrast, the strings of binary digits employed in digital representation do not represent by means of possessing some physical property — such as length — whose magnitude varies in proportion to the magnitude of the property that is being represented.) Analog representations form a diverse class. Some examples: the longer a line on a road map, the longer the road that the line represents; the greater the number of clear plastic squares in an architect's model, the greater the number of windows in the building represented; the higher the pitch of an acoustic depth meter, the shallower the water. In analog computers, numerical quantities are represented by, for example, the angle of rotation of a shaft or a difference in electrical potential. Thus the output voltage of the machine at a time might represent the momentary speed of the object being modelled.

As the case of the architect's model makes plain, analog representation may be discrete in nature (there is no such thing as a fractional number of windows). Among computer scientists, the term ‘analog’ is sometimes used narrowly, to indicate representation of one continuously-valued quantity by another (e.g., speed by voltage). As Brian Cantwell Smith has remarked:

‘Analog’ should … be a predicate on a representation whose structure corresponds to that of which it represents … That continuous representations should historically have come to be called analog presumably betrays the recognition that, at the levels at which it matters to us, the world is more foundationally continuous than it is discrete. (Smith [1991], p. 271)

James Thomson, brother of Lord Kelvin, invented the mechanical wheel-and-disc integrator that became the foundation of analog computation (Thomson [1876]). The two brothers constructed a device for computing the integral of the product of two given functions, and Kelvin described (although did not construct) general-purpose analog machines for integrating linear differential equations of any order and for solving simultaneous linear equations. Kelvin's most successful analog computer was his tide predicting machine, which remained in use at the port of Liverpool until the 1960s. Mechanical analog devices based on the wheel-and-disc integrator were in use during World War I for gunnery calculations. Following the war, the design of the integrator was considerably improved by Hannibal Ford (Ford [1919]).

Stanley Fifer reports that the first semi-automatic mechanical analog computer was built in England by the Manchester firm of Metropolitan Vickers prior to 1930 (Fifer [1961], p. 29); however, I have so far been unable to verify this claim. In 1931, Vannevar Bush, working at MIT, built the differential analyser, the first large-scale automatic general-purpose mechanical analog computer. Bush's design was based on the wheel and disc integrator. Soon copies of his machine were in use around the world (including, at Cambridge and Manchester Universities in England, differential analysers built out of kit-set Meccano, the once popular engineering toy).

It required a skilled mechanic equipped with a lead hammer to set up Bush's mechanical differential analyser for each new job. Subsequently, Bush and his colleagues replaced the wheel-and-disc integrators and other mechanical components by electromechanical, and finally by electronic, devices.

A differential analyser may be conceptualised as a collection of ‘black boxes’ connected together in such a way as to allow considerable feedback. Each box performs a fundamental process, for example addition, multiplication of a variable by a constant, and integration. In setting up the machine for a given task, boxes are connected together so that the desired set of fundamental processes is executed. In the case of electrical machines, this was done typically by plugging wires into sockets on a patch panel (computing machines whose function is determined in this way are referred to as ‘program-controlled’).

Since all the boxes work in parallel, an electronic differential analyser solves sets of equations very quickly. Against this has to be set the cost of massaging the problem to be solved into the form demanded by the analog machine, and of setting up the hardware to perform the desired computation. A major drawback of analog computation is the higher cost, relative to digital machines, of an increase in precision. During the 1960s and 1970s, there was considerable interest in ‘hybrid’ machines, where an analog section is controlled by and programmed via a digital section. However, such machines are now a rarity.

In 1936, at Cambridge University, Turing invented the principle of the modern computer. He described an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols (Turing [1936]). The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols. This is Turing's stored-program concept, and implicit in it is the possibility of the machine operating on and modifying its own program. (In London in 1947, in the course of what was, so far as is known, the earliest public lecture to mention computer intelligence, Turing said, ‘What we want is a machine that can learn from experience’, adding that the ‘possibility of letting the machine alter its own instructions provides the mechanism for this’ (Turing [1947] p. 393). Turing's computing machine of 1936 is now known simply as the universal Turing machine. Cambridge mathematician Max Newman remarked that right from the start Turing was interested in the possibility of actually building a computing machine of the sort that he had described (Newman in interview with Christopher Evans in Evans [197?].

From the start of the Second World War Turing was a leading cryptanalyst at the Government Code and Cypher School, Bletchley Park. Here he became familiar with Thomas Flowers' work involving large-scale high-speed electronic switching (described below). However, Turing could not turn to the project of building an electronic stored-program computing machine until the cessation of hostilities in Europe in 1945.

During the wartime years Turing did give considerable thought to the question of machine intelligence. Colleagues at Bletchley Park recall numerous off-duty discussions with him on the topic, and at one point Turing circulated a typewritten report (now lost) setting out some of his ideas. One of these colleagues, Donald Michie (who later founded the Department of Machine Intelligence and Perception at the University of Edinburgh), remembers Turing talking often about the possibility of computing machines (1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles (Michie in interview with Copeland, 1995). The modern term for the latter idea is ‘heuristic search’, a heuristic being any rule-of-thumb principle that cuts down the amount of searching required in order to find a solution to a problem. At Bletchley Park Turing illustrated his ideas on machine intelligence by reference to chess. Michie recalls Turing experimenting with heuristics that later became common in chess programming (in particular minimax and best-first).

Further information about Turing and the computer, including his wartime work on codebreaking and his thinking about artificial intelligence and artificial life, can be found in Copeland 2004.

With some exceptions — including Babbage's purely mechanical engines, and the finger-powered National Accounting Machine - early digital computing machines were electromechanical. That is to say, their basic components were small, electrically-driven, mechanical switches called ‘relays’. These operate relatively slowly, whereas the basic components of an electronic computer — originally vacuum tubes (valves) — have no moving parts save electrons and so operate extremely fast. Electromechanical digital computing machines were built before and during the second world war by (among others) Howard Aiken at Harvard University, George Stibitz at Bell Telephone Laboratories, Turing at Princeton University and Bletchley Park, and Konrad Zuse in Berlin. To Zuse belongs the honour of having built the first working general-purpose program-controlled digital computer. This machine, later called the Z3, was functioning in 1941. (A program-controlled computer, as opposed to a stored-program computer, is set up for a new task by re-routing wires, by means of plugs etc.)

Relays were too slow and unreliable a medium for large-scale general-purpose digital computation (although Aiken made a valiant effort). It was the development of high-speed digital techniques using vacuum tubes that made the modern computer possible.

The earliest extensive use of vacuum tubes for digital data-processing appears to have been by the engineer Thomas Flowers, working in London at the British Post Office Research Station at Dollis Hill. Electronic equipment designed by Flowers in 1934, for controlling the connections between telephone exchanges, went into operation in 1939, and involved between three and four thousand vacuum tubes running continuously. In 1938–1939 Flowers worked on an experimental electronic digital data-processing system, involving a high-speed data store. Flowers' aim, achieved after the war, was that electronic equipment should replace existing, less reliable, systems built from relays and used in telephone exchanges. Flowers did not investigate the idea of using electronic equipment for numerical calculation, but has remarked that at the outbreak of war with Germany in 1939 he was possibly the only person in Britain who realized that vacuum tubes could be used on a large scale for high-speed digital computation. (See Copeland 2006 for m more information on Flowers' work.)

The earliest comparable use of vacuum tubes in the U.S. seems to have been by John Atanasoff at what was then Iowa State College (now University). During the period 1937–1942 Atanasoff developed techniques for using vacuum tubes to perform numerical calculations digitally. In 1939, with the assistance of his student Clifford Berry, Atanasoff began building what is sometimes called the Atanasoff-Berry Computer, or ABC, a small-scale special-purpose electronic digital machine for the solution of systems of linear algebraic equations. The machine contained approximately 300 vacuum tubes. Although the electronic part of the machine functioned successfully, the computer as a whole never worked reliably, errors being introduced by the unsatisfactory binary card-reader. Work was discontinued in 1942 when Atanasoff left Iowa State.

The first fully functioning electronic digital computer was Colossus, used by the Bletchley Park cryptanalysts from February 1944.

From very early in the war the Government Code and Cypher School (GC&CS) was successfully deciphering German radio communications encoded by means of the Enigma system, and by early 1942 about 39,000 intercepted messages were being decoded each month, thanks to electromechanical machines known as ‘bombes’. These were designed by Turing and Gordon Welchman (building on earlier work by Polish cryptanalysts).

During the second half of 1941, messages encoded by means of a totally different method began to be intercepted. This new cipher machine, code-named ‘Tunny’ by Bletchley Park, was broken in April 1942 and current traffic was read for the first time in July of that year. Based on binary teleprinter code, Tunny was used in preference to Morse-based Enigma for the encryption of high-level signals, for example messages from Hitler and members of the German High Command.

The need to decipher this vital intelligence as rapidly as possible led Max Newman to propose in November 1942 (shortly after his recruitment to GC&CS from Cambridge University) that key parts of the decryption process be automated, by means of high-speed electronic counting devices. The first machine designed and built to Newman's specification, known as the Heath Robinson, was relay-based with electronic circuits for counting. (The electronic counters were designed by C.E. Wynn-Williams, who had been using thyratron tubes in counting circuits at the Cavendish Laboratory, Cambridge, since 1932 [Wynn-Williams 1932].) Installed in June 1943, Heath Robinson was unreliable and slow, and its high-speed paper tapes were continually breaking, but it proved the worth of Newman's idea. Flowers recommended that an all-electronic machine be built instead, but he received no official encouragement from GC&CS. Working independently at the Post Office Research Station at Dollis Hill, Flowers quietly got on with constructing the world's first large-scale programmable electronic digital computer. Colossus I was delivered to Bletchley Park in January 1943.

By the end of the war there were ten Colossi working round the clock at Bletchley Park. From a cryptanalytic viewpoint, a major difference between the prototype Colossus I and the later machines was the addition of the so-called Special Attachment, following a key discovery by cryptanalysts Donald Michie and Jack Good. This broadened the function of Colossus from ‘wheel setting’ — i.e., determining the settings of the encoding wheels of the Tunny machine for a particular message, given the ‘patterns’ of the wheels — to ‘wheel breaking’, i.e., determining the wheel patterns themselves. The wheel patterns were eventually changed daily by the Germans on each of the numerous links between the German Army High Command and Army Group commanders in the field. By 1945 there were as many 30 links in total. About ten of these were broken and read regularly.

Colossus I contained approximately 1600 vacuum tubes and each of the subsequent machines approximately 2400 vacuum tubes. Like the smaller ABC, Colossus lacked two important features of modern computers. First, it had no internally stored programs. To set it up for a new task, the operator had to alter the machine's physical wiring, using plugs and switches. Second, Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations.

F.H. Hinsley, official historian of GC&CS, has estimated that the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park, in which Colossus played a major role. Most of the Colossi were destroyed once hostilities ceased. Some of the electronic panels ended up at Newman's Computing Machine Laboratory in Manchester (see below), all trace of their original use having been removed. Two Colossi were retained by GC&CS (renamed GCHQ following the end of the war). The last Colossus is believed to have stopped running in 1960.

Those who knew of Colossus were prohibited by the Official Secrets Act from sharing their knowledge. Until the 1970s, few had any idea that electronic computation had been used successfully during the second world war. In 1970 and 1975, respectively, Good and Michie published notes giving the barest outlines of Colossus. By 1983, Flowers had received clearance from the British Government to publish a partial account of the hardware of Colossus I. Details of the later machines and of the Special Attachment, the uses to which the Colossi were put, and the cryptanalytic algorithms that they ran, have only recently been declassified. (For the full account of Colossus and the attack on Tunny see Copeland 2006.)

To those acquainted with the universal Turing machine of 1936, and the associated stored-program concept, Flowers' racks of digital electronic equipment were proof of the feasibility of using large numbers of vacuum tubes to implement a high-speed general-purpose stored-program computer. The war over, Newman lost no time in establishing the Royal Society Computing Machine Laboratory at Manchester University for precisely that purpose. A few months after his arrival at Manchester, Newman wrote as follows to the Princeton mathematician John von Neumann (February 1946):

I am … hoping to embark on a computing machine section here, having got very interested in electronic devices of this kind during the last two or three years. By about eighteen months ago I had decided to try my hand at starting up a machine unit when I got out. … I am of course in close touch with Turing.

Turing and Newman were thinking along similar lines. In 1945 Turing joined the National Physical Laboratory (NPL) in London, his brief to design and develop an electronic stored-program digital computer for scientific work. (Artificial Intelligence was not far from Turing's thoughts: he described himself as ‘building a brain’ and remarked in a letter that he was ‘more interested in the possibility of producing models of the action of the brain than in the practical applications to computing’.) John Womersley, Turing's immediate superior at NPL, christened Turing's proposed machine the Automatic Computing Engine, or ACE, in homage to Babbage's Difference Engine and Analytical Engine.

Turing's 1945 report ‘Proposed Electronic Calculator’ gave the first relatively complete specification of an electronic stored-program general-purpose digital computer. The report is reprinted in full in Copeland 2005.

The first electronic stored-program digital computer to be proposed in the U.S. was the EDVAC (see below). The ‘First Draft of a Report on the EDVAC’ (May 1945), composed by von Neumann, contained little engineering detail, in particular concerning electronic hardware (owing to restrictions in the U.S.). Turing's ‘Proposed Electronic Calculator’, on the other hand, supplied detailed circuit designs and specifications of hardware units, specimen programs in machine code, and even an estimate of the cost of building the machine (£11,200). ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralised structure.

Turing saw that speed and memory were the keys to computing. Turing's colleague at NPL, Jim Wilkinson, observed that Turing ‘was obsessed with the idea of speed on the machine’ [Copeland 2005, p. 2]. Turing's design had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer (enormous by the standards of his day). Had Turing's ACE been built as planned it would have been in a different league from the other early computers. However, progress on Turing's Automatic Computing Engine ran slowly, due to organisational difficulties at NPL, and in 1948 a ‘very fed up’ Turing (Robin Gandy's description, in interview with Copeland, 1995) left NPL for Newman's Computing Machine Laboratory at Manchester University. It was not until May 1950 that a small pilot model of the Automatic Computing Engine, built by Wilkinson, Edward Newman, Mike Woodger, and others, first executed a program. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.

Sales of DEUCE, the production version of the Pilot Model ACE, were buoyant — confounding the suggestion, made in 1946 by the Director of the NPL, Sir Charles Darwin, that ‘it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country’ [Copeland 2005, p. 4]. The fundamentals of Turing's ACE design were employed by Harry Huskey (at Wayne State University, Detroit) in the Bendix G15 computer (Huskey in interview with Copeland, 1998). The G15 was arguably the first personal computer; over 400 were sold worldwide. DEUCE and the G15 remained in use until about 1970. Another computer deriving from Turing's ACE design, the MOSAIC, played a role in Britain's air defences during the Cold War period; other derivatives include the Packard-Bell PB250 (1961). (More information about these early computers is given in [Copeland 2005].)

The earliest general-purpose stored-program electronic digital computer to work was built in Newman's Computing Machine Laboratory at Manchester University. The Manchester ‘Baby’, as it became known, was constructed by the engineers F.C. Williams and Tom Kilburn, and performed its first calculation on 21 June 1948. The tiny program, stored on the face of a cathode ray tube, was just seventeen instructions long. A much enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at Manchester University in February 1951; in all about ten were sold, in Britain, Canada, Holland and Italy.

The fundamental logico-mathematical contributions by Turing and Newman to the triumph at Manchester have been neglected, and the Manchester machine is nowadays remembered as the work of Williams and Kilburn. Indeed, Newman's role in the development of computers has never been sufficiently emphasised (due perhaps to his thoroughly self-effacing way of relating the relevant events).

It was Newman who, in a lecture in Cambridge in 1935, introduced Turing to the concept that led directly to the Turing machine: Newman defined a constructive process as one that a machine can carry out (Newman in interview with Evans, op. cit.). As a result of his knowledge of Turing's work, Newman became interested in the possibilities of computing machinery in, as he put it, ‘a rather theoretical way’. It was not until Newman joined GC&CS in 1942 that his interest in computing machinery suddenly became practical, with his realisation that the attack on Tunny could be mechanised. During the building of Colossus, Newman tried to interest Flowers in Turing's 1936 paper — birthplace of the stored-program concept - but Flowers did not make much of Turing's arcane notation. There is no doubt that by 1943, Newman had firmly in mind the idea of using electronic technology in order to construct a stored-program general-purpose digital computing machine.

In July of 1946 (the month in which the Royal Society approved Newman's application for funds to found the Computing Machine Laboratory), Freddie Williams, working at the Telecommunications Research Establishment, Malvern, began the series of experiments on cathode ray tube storage that was to lead to the Williams tube memory. Williams, until then a radar engineer, explains how it was that he came to be working on the problem of computer memory:

[O]nce [the German Armies] collapsed … nobody was going to care a toss about radar, and people like me … were going to be in the soup unless we found something else to do. And computers were in the air. Knowing absolutely nothing about them I latched onto the problem of storage and tackled that. (Quoted in Bennett 1976.)

Newman learned of Williams' work, and with the able help of Patrick Blackett, Langworthy Professor of Physics at Manchester and one of the most powerful figures in the University, was instrumental in the appointment of the 35 year old Williams to the recently vacated Chair of Electro-Technics at Manchester. (Both were members of the appointing committee (Kilburn in interview with Copeland, 1997).) Williams immediately had Kilburn, his assistant at Malvern, seconded to Manchester. To take up the story in Williams' own words:

[N]either Tom Kilburn nor I knew the first thing about computers when we arrived in Manchester University. We'd had enough explained to us to understand what the problem of storage was and what we wanted to store, and that we'd achieved, so the point now had been reached when we'd got to find out about computers … Newman explained the whole business of how a computer works to us. (F.C. Williams in interview with Evans [1976])

Elsewhere Williams is explicit concerning Turing's role and gives something of the flavour of the explanation that he and Kilburn received:

Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing … knew a lot about computers and substantially nothing about electronics. They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation. (Williams [1975], p. 328)

It seems that Newman must have used much the same words with Williams and Kilburn as he did in an address to the Royal Society on 4th March 1948:

Professor Hartree … has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing … [T]he machines now being made in America and in this country … [are] in certain general respects … all similar. There is provision for storing numbers, say in the scale of 2, so that each number appears as a row of, say, forty 0's and 1's in certain places or "houses" in the machine. … Certain of these numbers, or "words" are read, one after another, as orders. In one possible type of machine an order consists of four numbers, for example 11, 13, 27, 4. The number 4 signifies "add", and when control shifts to this word the "houses" H11 and H13 will be connected to the adder as inputs, and H27 as output. The numbers stored in H11 and H13 pass through the adder, are added, and the sum is passed on to H27. The control then shifts to the next order. In most real machines the process just described would be done by three separate orders, the first bringing [H11] (=content of H11) to a central accumulator, the second adding [H13] into the accumulator, and the third sending the result to H27; thus only one address would be required in each order. … A machine with storage, with this automatic-telephone-exchange arrangement and with the necessary adders, subtractors and so on, is, in a sense, already a universal machine. (Newman [1948], pp. 271–272)

Following this explanation of Turing's three-address concept (source 1, source 2, destination, function) Newman went on to describe program storage (‘the orders shall be in a series of houses X1, X2, …’) and conditional branching. He then summed up:

From this highly simplified account it emerges that the essential internal parts of the machine are, first, a storage for numbers (which may also be orders). … Secondly, adders, multipliers, etc. Thirdly, an "automatic telephone exchange" for selecting "houses", connecting them to the arithmetic organ, and writing the answers in other prescribed houses. Finally, means of moving control at any stage to any chosen order, if a certain condition is satisfied, otherwise passing to the next order in the normal sequence. Besides these there must be ways of setting up the machine at the outset, and extracting the final answer in useable form. (Newman [1948], pp. 273–4)

In a letter written in 1972 Williams described in some detail what he and Kilburn were told by Newman:

About the middle of the year [1946] the possibility of an appointment at Manchester University arose and I had a talk with Professor Newman who was already interested in the possibility of developing computers and had acquired a grant from the Royal Society of £30,000 for this purpose. Since he understood computers and I understood electronics the possibilities of fruitful collaboration were obvious. I remember Newman giving us a few lectures in which he outlined the organisation of a computer in terms of numbers being identified by the address of the house in which they were placed and in terms of numbers being transferred from this address, one at a time, to an accumulator where each entering number was added to what was already there. At any time the number in the accumulator could be transferred back to an assigned address in the store and the accumulator cleared for further use. The transfers were to be effected by a stored program in which a list of instructions was obeyed sequentially. Ordered progress through the list could be interrupted by a test instruction which examined the sign of the number in the accumulator. Thereafter operation started from a new point in the list of instructions. This was the first information I received about the organisation of computers. … Our first computer was the simplest embodiment of these principles, with the sole difference that it used a subtracting rather than an adding accumulator. (Letter from Williams to Randell, 1972; in Randell [1972], p. 9)

Turing's early input to the developments at Manchester, hinted at by Williams in his above-quoted reference to Turing, may have been via the lectures on computer design that Turing and Wilkinson gave in London during the period December 1946 to February 1947 (Turing and Wilkinson [1946–7]). The lectures were attended by representatives of various organisations planning to use or build an electronic computer. Kilburn was in the audience (Bowker and Giordano [1993]). (Kilburn usually said, when asked from where he obtained his basic knowledge of the computer, that he could not remember (letter from Brian Napper to Copeland, 2002); for example, in a 1992 interview he said: ‘Between early 1945 and early 1947, in that period, somehow or other I knew what a digital computer was … Where I got this knowledge from I've no idea’ (Bowker and Giordano [1993], p. 19).)

Whatever role Turing's lectures may have played in informing Kilburn, there is little doubt that credit for the Manchester computer — called the ‘Newman-Williams machine’ in a contemporary document (Huskey 1947) — belongs not only to Williams and Kilburn but also to Newman, and that the influence on Newman of Turing's 1936 paper was crucial, as was the influence of Flowers' Colossus.

The first working AI program, a draughts (checkers) player written by Christopher Strachey, ran on the Ferranti Mark I in the Manchester Computing Machine Laboratory. Strachey (at the time a teacher at Harrow School and an amateur programmer) wrote the program with Turing's encouragement and utilising the latter's recently completed Programmers' Handbook for the Ferranti. (Strachey later became Director of the Programming Research Group at Oxford University.) By the summer of 1952, the program could, Strachey reported, ‘play a complete game of draughts at a reasonable speed’. (Strachey's program formed the basis for Arthur Samuel's well-known checkers program.) The first chess-playing program, also, was written for the Manchester Ferranti, by Dietrich Prinz; the program first ran in November 1951. Designed for solving simple problems of the mate-in-two variety, the program would examine every possible move until a solution was found. Turing started to program his ‘Turochamp’ chess-player on the Ferranti Mark I, but never completed the task. Unlike Prinz's program, the Turochamp could play a complete game (when hand-simulated) and operated not by exhaustive search but under the guidance of heuristics.

The first fully functioning electronic digital computer to be built in the U.S. was ENIAC, constructed at the Moore School of Electrical Engineering, University of Pennsylvania, for the Army Ordnance Department, by J. Presper Eckert and John Mauchly. Completed in 1945, ENIAC was somewhat similar to the earlier Colossus, but considerably larger and more flexible (although far from general-purpose). The primary function for which ENIAC was designed was the calculation of tables used in aiming artillery. ENIAC was not a stored-program computer, and setting it up for a new job involved reconfiguring the machine by means of plugs and switches. For many years, ENIAC was believed to have been the first functioning electronic digital computer, Colossus being unknown to all but a few.

In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neuman [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.

Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.

The Los Alamos physicist Stanley Frankel, responsible with von Neumann and others for mechanising the large-scale calculations involved in the design of the atomic bomb, has described von Neumann's view of the importance of Turing's 1936 paper, in a letter:

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 … Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage … Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities. (Quoted in Randell [1972], p. 10)

Other notable early stored-program electronic digital computers were:

  • EDSAC, 1949, built at Cambridge University by Maurice Wilkes
  • BINAC, 1949, built by Eckert's and Mauchly's Electronic Control Co., Philadelphia (opinions differ over whether BINAC ever actually worked)
  • Whirlwind I, 1949, Digital Computer Laboratory, Massachusetts Institute of Technology, Jay Forrester
  • SEAC, 1950, US Bureau of Standards Eastern Division, Washington D.C., Samuel Alexander, Ralph Slutz
  • SWAC, 1950, US Bureau of Standards Western Division, Institute for Numerical Analysis, University of California at Los Angeles, Harry Huskey
  • UNIVAC, 1951, Eckert-Mauchly Computer Corporation, Philadelphia (the first computer to be available commercially in the U.S.)
  • the IAS computer, 1952, Institute for Advanced Study, Princeton University, Julian Bigelow, Arthur Burks, Herman Goldstine, von Neumann, and others (thanks to von Neumann's publishing the specifications of the IAS machine, it became the model for a group of computers known as the Princeton Class machines; the IAS computer was also a strong influence on the IBM 701)
  • IBM 701, 1952, International Business Machine's first mass-produced electronic stored-program computer.

The EDVAC and ACE proposals both advocated the use of mercury-filled tubes, called ‘delay lines’, for high-speed internal memory. This form of memory is known as acoustic memory. Delay lines had initially been developed for echo cancellation in radar; the idea of using them as memory devices originated with Eckert at the Moore School. Here is Turing's description:

It is proposed to build "delay line" units consisting of mercury … tubes about 5′ long and 1″ in diameter in contact with a quartz crystal at each end. The velocity of sound in … mercury … is such that the delay will be 1.024 ms. The information to be stored may be considered to be a sequence of 1024 ‘digits’ (0 or 1) … These digits will be represented by a corresponding sequence of pulses. The digit 0 … will be represented by the absence of a pulse at the appropriate time, the digit 1 … by its presence. This series of pulses is impressed on the end of the line by one piezo-crystal, it is transmitted down the line in the form of supersonic waves, and is reconverted into a varying voltage by the crystal at the far end. This voltage is amplified sufficiently to give an output of the order of 10 volts peak to peak and is used to gate a standard pulse generated by the clock. This pulse may be again fed into the line by means of the transmitting crystal, or we may feed in some altogether different signal. We also have the possibility of leading the gated pulse to some other part of the calculator, if we have need of that information at the time. Making use of the information does not of course preclude keeping it also. (Turing [1945], p. 375)

Mercury delay line memory was used in EDSAC, BINAC, SEAC, Pilot Model ACE, EDVAC, DEUCE, and full-scale ACE (1958). The chief advantage of the delay line as a memory medium was, as Turing put it, that delay lines were "already a going concern" (Turing [1947], p. 380). The fundamental disadvantages of the delay line were that random access is impossible and, moreover, the time taken for an instruction, or number, to emerge from a delay line depends on where in the line it happens to be.

In order to minimize waiting-time, Turing arranged for instructions to be stored not in consecutive positions in the delay line, but in relative positions selected by the programmer in such a way that each instruction would emerge at exactly the time it was required, in so far as this was possible. Each instruction contained a specification of the location of the next. This system subsequently became known as ‘optimum coding’. It was an integral feature of every version of the ACE design. Optimum coding made for difficult and untidy programming, but the advantage in terms of speed was considerable. Thanks to optimum coding, the Pilot Model ACE was able to do a floating point multiplication in 3 milliseconds (Wilkes's EDSAC required 4.5 milliseconds to perform a single fixed point multiplication).

In the Williams tube or electrostatic memory, previously mentioned, a two-dimensional rectangular array of binary digits was stored on the face of a commercially-available cathode ray tube. Access to data was immediate. Williams tube memories were employed in the Manchester series of machines, SWAC, the IAS computer, and the IBM 701, and a modified form of Williams tube in Whirlwind I (until replacement by magnetic core in 1953).

Drum memories, in which data was stored magnetically on the surface of a metal cylinder, were developed on both sides of the Atlantic. The initial idea appears to have been Eckert's. The drum provided reasonably large quantities of medium-speed memory and was used to supplement a high-speed acoustic or electrostatic memory. In 1949, the Manchester computer was successfully equipped with a drum memory; this was constructed by the Manchester engineers on the model of a drum developed by Andrew Booth at Birkbeck College, London.

The final major event in the early history of electronic computation was the development of magnetic core memory. Jay Forrester realised that the hysteresis properties of magnetic core (normally used in transformers) lent themselves to the implementation of a three-dimensional solid array of randomly accessible storage points. In 1949, at Massachusetts Institute of Technology, he began to investigate this idea empirically. Forrester's early experiments with metallic core soon led him to develop the superior ferrite core memory. Digital Equipment Corporation undertook to build a computer similar to the Whirlwind I as a test vehicle for a ferrite core memory. The Memory Test Computer was completed in 1953. (This computer was used in 1954 for the first simulations of neural networks, by Belmont Farley and Wesley Clark of MIT's Lincoln Laboratory (see Copeland and Proudfoot [1996]).

Once the absolute reliability, relative cheapness, high capacity and permanent life of ferrite core memory became apparent, core soon replaced other forms of high-speed memory. The IBM 704 and 705 computers (announced in May and October 1954, respectively) brought core memory into wide use.

Works Cited

  • Babbage, C. (ed. by Campbell-Kelly, M.), 1994, Passages from the Life of a Philosopher , New Brunswick: Rutgers University Press
  • Bennett, S., 1976, ‘F.C. Williams: his contribution to the development of automatic control’, National Archive for the History of Computing, University of Manchester, England. (This is a typescript based on interviews with Williams in 1976.)
  • Bowker, G., and Giordano, R., 1993, ‘Interview with Tom Kilburn’, Annals of the History of Computing , 15 : 17–32.
  • Copeland, B.J. (ed.), 2004, The Essential Turing Oxford University Press
  • Copeland, B.J. (ed.), 2005, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer Oxford University Press
  • Copeland, B.J. and others, 2006, Colossus: The Secrets of Bletchley Park's Codebreaking Computers Oxford University Press
  • Copeland, B.J., and Proudfoot, D., 1996, ‘On Alan Turing's Anticipation of Connectionism’ Synthese , 108 : 361–377
  • Evans, C., 197?, interview with M.H.A. Newman in ‘The Pioneers of Computing: an Oral History of Computing’, London: Science Museum
  • Fifer, S., 1961, Analog Computation: Theory, Techniques, Applications New York: McGraw-Hill
  • Ford, H., 1919, ‘Mechanical Movement’, Official Gazette of the United States Patent Office , October 7, 1919: 48
  • Goldstine, H., 1972, The Computer from Pascal to von Neumann Princeton University Press
  • Huskey, H.D., 1947, ‘The State of the Art in Electronic Digital Computing in Britain and the United States’, in [Copeland 2005]
  • Newman, M.H.A., 1948, ‘General Principles of the Design of All-Purpose Computing Machines’ Proceedings of the Royal Society of London , series A, 195 (1948): 271–274
  • Randell, B., 1972, ‘On Alan Turing and the Origins of Digital Computers’, in Meltzer, B., Michie, D. (eds), Machine Intelligence 7 , Edinburgh: Edinburgh University Press, 1972
  • Smith, B.C., 1991, ‘The Owl and the Electric Encyclopaedia’, Artificial Intelligence , 47 : 251–288
  • Thomson, J., 1876, ‘On an Integrating Machine Having a New Kinematic Principle’ Proceedings of the Royal Society of London , 24 : 262–5
  • Turing, A.M., 1936, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’ Proceedings of the London Mathematical Society , Series 2, 42 (1936–37): 230–265. Reprinted in The Essential Turing (Copeland [2004]).
  • Turing, A.M, 1945, ‘Proposed Electronic Calculator’, in Alan Turing's Automatic Computing Engine (Copeland [2005])
  • Turing, A.M., 1947, ‘Lecture on the Automatic Computing Engine’, in The Essential Turing (Copeland [2004])
  • Turing, A.M., and Wilkinson, J.H., 1946–7, ‘The Turing-Wilkinson Lecture Series (1946-7)’, in Alan Turing's Automatic Computing Engine (Copeland [2005])
  • von Neumann, J., 1945, ‘First Draft of a Report on the EDVAC’, in Stern, N. From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers Bedford, Mass.: Digital Press (1981), pp. 181–246
  • Williams, F.C., 1975, ‘Early Computers at Manchester University’ The Radio and Electronic Engineer , 45 (1975): 237–331
  • Wynn-Williams, C.E., 1932, ‘A Thyratron "Scale of Two" Automatic Counter’ Proceedings of the Royal Society of London , series A, 136 : 312–324

Further Reading

  • Copeland, B.J., 2004, ‘Colossus — Its Origins and Originators’ Annals of the History of Computing , 26 : 38–45
  • Metropolis, N., Howlett, J., Rota, G.C. (eds), 1980, A History of Computing in the Twentieth Century New York: Academic Press
  • Randell, B. (ed.), 1982, The Origins of Digital Computers: Selected Papers Berlin: Springer-Verlag
  • Williams, M.R., 1997, A History of Computing Technology Los Alamitos: IEEE Computer Society Press
How to cite this entry . Preview the PDF version of this entry at the Friends of the SEP Society . Look up topics and thinkers related to this entry at the Internet Philosophy Ontology Project (InPhO). Enhanced bibliography for this entry at PhilPapers , with links to its database.
  • The Turing Archive for the History of Computing
  • The Alan Turing Home Page
  • Australian Computer Museum Society
  • The Bletchley Park Home Page
  • Charles Babbage Institute
  • Computational Logic Group at St. Andrews
  • The Computer Conservation Society (UK)
  • CSIRAC (a.k.a. CSIR MARK I) Home Page
  • Frode Weierud's CryptoCellar
  • Logic and Computation Group at Penn
  • National Archive for the History of Computing
  • National Cryptologic Museum

computability and complexity | recursive functions | Turing, Alan | Turing machines

Copyright © 2006 by B. Jack Copeland < jack . copeland @ canterbury . ac . nz >

  • Accessibility

Support SEP

Mirror sites.

View this site from another server:

  • Info about mirror sites

The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

  • All Categories
  • Artificial Intelligence Software

History Of Computers: Timeline, I/O Devices and Networking

research the history of computer

In this post

Computers in the 1600s

Computers in the 1800s, computers from the 1900-1950s, computers from the 1960-1970s, computers from the 1980-1990s, computers from 2000-2010, computers from 2011 - present day.

Can you imagine your life without a computer?

Think about all of the things you wouldn’t be able to do. Send an email or online shop, and find an answer to a question instantly. And that’s just the tip of the iceberg.

We’ve come a long way from the very first computer and even the first smartphone. But how much do you really know about their history and evolution? From floppy discs to artificial intelligence (AI) software , the Acorn to the Macintosh, let’s explore how far we’ve come.

While today we use computers for both work and play, they were actually created for an entirely different purpose.

In 1880, the population of the United States had grown so large that it took seven years to formulate the results of the U.S. Census. So, the government looked for a faster way to get the job done, which is why punch-card computers were invented, which took up an entire room. The first digital computer, known as the Pascal Machine, has its roots in the 16th century. The Pascal Machine was the first revelation in the world of computers before Charles Babbage invented the Difference engine. From assembly to binary, we have come a long way.

While that’s how the story starts, it’s certainly not where it ends. Let’s explore the history of computers.

History of Computers 18000-1970s

In the 1600s, the idea of an analog computer became a little known. A computer was referred to a device that is used for mathematical and statistical calculations. It used the techniques of decimal, statistics and logarithmic expressions to compute values. Women were tasked with such calculations and computations. 

1617:  John Napier invented Napier Bones, a manual calculating apparatus. Napier bones was a handheld device that used to support multiplication and division. It was also known as Napier Rods.

1642: Biase Pascal, a french mathematician invented Pascaline, which is also known as the world's first automated calculator. 

1673: Gottfried Willhelm Liebniz invented Stepped Reckoner or Liebniz wheel, which was an improved version of Pascaline. This mechanical calculator could do basic calculations and was made of flat drums. 

Want to learn more about Artificial Intelligence Software? Explore Artificial Intelligence products.

1801 : In France, weaver and merchant Joseph Marie Jacquard create a loom that uses wooden punch cards to automate the design of woven fabrics. Early computers would use similar punch cards.

1822 : Thanks to funding from the English government, mathematician Charles Babbage invented a steam-driven calculating machine that was able to compute tables of numbers.

1822:  Charles Babbage invented the difference engine as a machine that could perform any kind of calculation, from basic mathematical to astronomical calculations. 

1840:  Lady Ada Lovelace wrote her first computer programme which was a sequence of user-generated instructions for the computer.

1890 : Inventor Herman Hollerith designs the punch card system to calculate the 1880 U.S. census. It took him three years to create, and it saved the government $5 million. He would eventually go on to establish a company that would become IBM .

Hollerith Machine

A census clerk tabulates data using the Hollerith machine Source: Census.gov

1936 : Alan Turing developed an idea for a universal machine, which he would call the Turing machine, that would be able to compute anything that is computable. The concept of modern computers was based on his idea.

1937 : A professor of physics and mathematics at Iowa State University, J.V. Atanasoff, attempts to build the first computer without cams, belts, gears, or shafts.

1939 : Bill Hewlett and David Packard found Hewlett-Packard in a garage in Palo Alto, California. Their first project, the HP 200A Audio Oscillator, would rapidly become a popular piece of test equipment for engineers.

In fact, Walt Disney Pictures would order eight to test recording equipment and speaker systems for 12 specially equipped theaters that showed Fantasia in 1940.

David Packard and Bill Hewlett

David Packard and Bill Hewlett in 1964 Source : PA Daily Post

Also in 1939, Bell Telephone Laboratories completed The Complex Number Calculator, designed by George Stibitz.

1941 : Professor of physics and mathematics at Iowa State University J.V. Atanasoff and graduate student Clifford Berry designed a computer that can solve 29 equations simultaneously. This is the first time a computer is able to house data within its own memory.

That same year, German engineer Konrad Zuse created the Z3 computer, which used 2,300 relays, performed floating-point binary arithmetic, and had a 22-bit word length. This computer was eventually destroyed in a bombing raid in Berlin in 1943.

Additionally, in 1941, Alan Turing and Harold Keen built the British Bombe, which decrypted Nazi ENIGMA-based military communications during World War II.

1943 : John Mauchly and J. Presper Eckert, professors at the University of Pennsylvania, built an Electronic Numerical Integrator and Calculator (ENIAC) . This is considered to be the grandfather of digital computers, as it is made up of 18,000 vacuum tubes and fills up a 20-foot by 40-foot room.

1944:  J. Presper Eckert and John Mauchly built the successor of ENIAC, known as Electronic Discrete Variable Automatic Computer which was one of the earliest electronic computer that could process and store discrete data

1951:   The first commercially produced digital computer, known as UNIVAC, was launched as a general purpose commercial computer for the US business economy. 

Tip: A vacuum tube was a device that controlled electronic current flow.

ENIAC Technician

Source: Science Photo Library

Also, in 1943, the U.S. Army asked that Bell Laboratories design a machine to assist in the testing of their M-9 director, which was a type of computer that aims large guns at their targets. George Stibitz recommended a delay-based calculator for the project. This resulted in the Relay Interpolator, which was later known as the Bell Labs Model II.

1944 : British engineer Tommy Flowers designed the Colossus, which was created to break the complex code used by the Nazis in World War II. A total of ten were delivered, each using roughly 2,500 vacuum tubes. These machines would reduce the time it took to break their code from weeks to hours, leading historians to believe they greatly shortened the war by being able to understand the intentions and beliefs of the Nazis.

That same year, Harvard physics professor Howard Aiken built and designed The Harvard Mark 1, a room-sized, relay-based calculator.

1945 : Mathematician John von Neumann writes The First Draft of a Report on the EDVAC . This paper broke down the architecture of a stored-program computer.

1946 : Mauchly and Eckert left the University of Pennsylvania and obtained funding from the Census Bureau to build the Universal Automatic Computer (UNIVAC) . This would become the first commercial computer for business and government use.

That same year, Will F. Jenkins published the science fiction short story A Logic Named Joe , which detailed a world where computers, called Logics, interconnect into a worldwide network. When a Logic malfunctions, it gives out secret information about forbidden topics.

1947 : Walter Brattain, William Shockley, and John Bradeen of Bell Laboratories invented the transistor, which allowed them to discover a way to make an electric switch using solid materials, not vacuums.

1947:  Kathleen Booth invented assembly language that converted high level language (HLL) into machine language. It bridged software functioning with the hardware.  

1948 : Frederick Williams, Geoff Toothill, and Tom Kilburn, researchers at the University of Manchester, developed the Small-Scale Experimental Machine. This was built to test new memory technology, which became the first high-speed electronic random access memory for computers. The became the first program to run on a digital, electronic, stored-program computer.

1950 : Built in Washington, DC, the Standards Eastern Automatic Computer (SEAC) w as created, becoming the first stored-program computer completed in the United States. It was a test bed for evaluating components and systems, in addition to setting computer standards.

1953 : Computer scientist Grace Hopper develops the first computer language, which is eventually known a s Common Business Oriented Language (COBOL), that allowed a computer user to use English-like words instead of numbers to give the computer instructions. In 1997, a study showed that over 200 billion lines of COBOL code were still in existence.

research the history of computer

Source: bricsys

That same year, businessman Thomas Johnson Watson Jr. created the IBM 701 EDPM, which is used to help the United Nations keep tabs on Korea during the war.

1954 : Th e Formula Translation (FORTRAN) program ming language is developed by John Backus and a team of programmers at IBM.

Additionally, IBM created the 650, which was the first mass-produced computer, selling 450 in just one year.

1958 : Jack Kirby and Robert Noyce invented the integrated circuit, which is what we now call the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1962 : IBM announces the 1311 Disk Storage Drive, the first disk drive made with a removable disk pack. Each pack weighed 10 pounds, held six disks, and had a capacity of 2 million characters.

Also in 1962, the Atlas computer made its debut, thanks to Manchester University, Ferranti Computers, and Plessy. At the time, it was the fastest computer in the world and introduced the idea of “virtual memory”.

1963:  The American Standard Code for Information Exchange (ASCII) system was launched as the official character encoding standard for computers. In ASCII, one letter or numeric is represented as an 8-bit code.

1964 : Dou glas Engelbart introduces a prototype for the modern computer that includes a mouse and a graphical user interface (GUI). This begins the evolution from computers being exclusively for scientists and mathematicians to b eing accessible to the general public.

Additionally, IBM introduced SABRE, their reservation system with American Airlines. It program officially launched four years later, and now the company owns Travelocity. It used telephone lines to link 2,000 terminals in 65 cities, delivering data on any flight in under three seconds.

1968 : Stanley Kubrick’s 2001: A Space Odyssey hits theaters. This cult classic tells the story of the HAL 9000 computer as it malfunctions during a spaceship’s trip to Jupiter to investigate a mysterious signal. The HAL 9000, which controlled the ship, went rogue, killed the crew, and had to be shut down by the only surviving crew member. The film depicted computer-demonstrated voice and visual recognition, human-computer interaction, speed synthesis, and other advanced technologies.

1968:  Japanese company OKI, launched dot matrix printers which supported a general character of 128 characters with a print matrix of 7x5

1969 : Developers at Bell Labs unveil UNIX, an operating system written in C programming language that addressed compatibility issues within programs.

UNIX at Bell Labs

Source: Nokia Bell Labs

1970 : Intel introduces the world to the Intel 1103, the fi rst Dynamic Access Memory (DRAM) chip.

1970:  PASCAL, a low-level procedure oriented language was invented by Niklaus Wirth of Switzerland to teach structured programming, that involves loops and conditions. 

1971 : Alan Shugart and a team of IBM engineers invented the floppy disk, allowing data to be shared among computers.

That same year, Xerox introduced the world to the first laser printer, which not only generated billions of dollars but also launched a new era in computer printing.

Also, email begins to grow in popularity as it expands to computer networks.

1973 : Robert Metcalfe, research employee at Xerox, develops Ethernet, connecting multiple computers and hardware.

1974 : Personal computers are officially on the market! The first of the bunch were Scelbi & Mark-8 Altair, IBM 5100, and Radio Shack's TRS-80.

1975 : In January,  Popular Electronics magazine featured the Altair 8800 as the world’s first minicomputer kit. Paul Allen and Bill Gates offer to write software for Altair using the BASIC language. You could say writing software was successful because in the same year, they created their own software company, Microsoft .

1976 : Steve Jobs and Steve Wozniak start Apple Computers and introduced the world to the Apple I, the first computer with a single-circuit board.

first apple computer

Source: MacRumors

Also, in 1976, Queen Elizabeth II sent out her first email from the Royal Signals and Radar Establishment to demonstrate networking technology.

Queen Elizabeth II Sends Emails

Source: Wired

1977 : Jobs and Wozniak unveiled the Apple II at the first West Coast Computer Faire. It boasts color graphics and an audio cassette drive for storage. Millions were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers.

1978 : The first computers were installed in the White House during the Carter administration. The White House staff was given terminals to access the shared Hewlett-Packard HP3000.

Also, the first computerized spreadsheet program, VisiCalc, is introduced.

Additionally, LaserDisc is introduced by MCA and Phillips. The first to be sold in North America was the movie Jaws .

1979 : MicroPro International unveils WordStar, a word processing program.

WordStar certainly wasn’t the last of its kind, as there is a plethora of other document creation software on the market today. Check out some of the highest-rated and leave a review on your favorite!

1981 : Not to be outdone by Apple, IBM releases their first personal computer, the Acorn, with an Intel chip, two floppy disks, and an available color monitor.

IBM Acorn

Source: Florida History Network

1982 : Instead of going with its annual tradition of naming a “Man of the Year”, Time Magazine does something a little different and names the computer its “Machine of the Year”. A senior writer noted in the article, “Computers were once regarded as distant, ominous abstractions, like Big Brother. In 1982, they truly became personalized, brought down to scale, so that people could hold, prod and play with them."

Time Magazine Machine of the Year

Source: Time

1983 : The CD-ROM hit the market, able to hold 550 megabytes of pre-recorded data. That same year, many computer companies worked to set a standard for these disks, making them able to be used freely to access a wide variety of information.

Later that year, Microsoft introduced Word, which was originally called Multi-Tool Word.

1984 : Apple launches Macintosh, which was introduced during a Super Bowl XVIII commercial. The Macintosh was the first successful mouse-driven computer with a graphical user interface. It sold for $2,500.

1985 : Microsoft announces Windows, which allowed for multi-tasking with a graphical user interface.

That same year, a small Massachusetts computer manufacturer registered the first dot com domain name, Symbolics.com.

Also, the programming language C++ is published and is said to make programming “more enjoyable” for the serious programmer.

1986 : Originally called the Special Effects Computer Group, Pixar is created at Lucasfilm. It worked to create computer-animated portions of popular films, like Star Trek II: The Wrath of Khan . Steve Jobs purchased Pixar in 1986 for $10 million, renaming it Pixar. It was bought by Disney in 2006.

1990 : English programmer and physicist Tim Berners-Lee develops HyperText Markup Language, also known as HTML. He also prototyped the term WorldWideWeb. It features a server, HTML, URLs, and the first browser.

1991 : Apple released the Powerbook series of laptops, which included a built-in trackball, internal floppy disk, and palm rests. The line was discontinued in 2006.

1993 : In an attempt to enter the handheld computer market, Apple releases Newton. Called the “Personal Data Assistant”, it never performed the way Apple President John Scully had hoped, and it was discontinued in 1998.

Also that year, Steven Spielberg’s Jurassic Park hits theaters, showcasing cutting-edge computer animation in addition to animatronics and puppetry.

History of Computers 1980 to present

1995 : IBM released the ThinkPad 701C, which was officially known as the Track Write, with an expanding full-sized keyboard that was comprised of three interlocking pieces.

Additionally, the format for a Digital Video Disc (DVD) is introduced, featuring a huge increase in storage space that the compact disc (CD).

Also that year, Microsoft’s Windows 95 operating system was launched. To spread the word, a $300 million promotional campaign was rolled out, featuring TV commercials that used “Start Me Up” by the Rolling Stones and a 30-minute video starring Matthew Perry and Jennifer Aniston. It was installed on more computers than any other operating system.

And in the world of code, Java 1.0 is introduced by Sun Microsystems, followed by JavaScript at Netscape Communications.

1996 : Sergey Brin and Larry Page develop Google at Stanford University.

Sergey Brin and Larry Page

Source: CNBC

That same year, Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, created the personal data assistance called the Palm Pilot.

Also in 1996 was the introduction of the Sony Vaio series. This desktop computer featured an additional 3D interface in addition to the Windows 95 operating system, as a way to attract new users. The line was discontinued in 2014.

1997 : Microsoft invests $150 million into Apple, which ended Apple’s court case against Microsoft, saying they copied the “look and feel” of their operating system.

1998 : Apple releases the iMac, a range of all-in-one Macintosh desktop computers. Selling for $1,300, these computers included a 4GB hard drive, 32MB Ram, a CD-ROM, and a 15-inch monitor.

Apple's iMac Computers

Source: Start Ups Venture Capital

1998:  Isaac Chuang of Los Alamos National Laboratory, Niel Gershenfeld of Massachusetts Institute of Technology (MIT) and Mark Kubinec of University of California launched the first quantum computer (2-qubit) that can be loaded with quantum energy to produce outputs.  

1999 : The term Wi-Fi becomes part of the computing language as users begin connecting without wires. Without missing a beat, Apple creates its “Airport” Wi-Fi router and builds connectivity into Macs.

2000 : In Japan, SoftBank introduced the first camera phone, the J-Phone J-SH04. The camera had a maximum resolution of 0.11 megapixels, a 256-color display, and photos could be shared wirelessly. It was such a hit that a flip-phone version was released just a month later.

Also, in 2000, the USB flash drive is introduced. Used for data storage, they were faster and had a greater amount of storage space than other storage media options. Plus, they couldn’t be scratched like CDs.

2001 : Apple introduces the Mac OS X operating system. Not to be outdone, Microsoft unveiled Windows XP soon after.

Also, the first Apple stores are opened in Tysons Corner, Virginia, and Glendale, California. Apple also released iTunes, which allowed users to record music from CDs, burn it onto the program, and then mix it with other songs to create a custom CD.

2003 : Apple releases the iTunes music store, giving users the ability to purchase songs within the program. In less than a week after its debut, over 1 million songs were downloaded.

Also, in 2003, the Blu-ray optical disc is released as the successor of the DVD.

And, who can forget the popular social networking site Myspace, which was founded in 2003. By 2005, it had more than 100 million users.

2004 : The first challenger of Microsoft’s Internet Explorer came in the form of Mozilla’s Firefox 1.0. That same year, Facebook launched as a social networking site.

Original Facebook Homepage

Source: Business Insider

2005 : YouTube, the popular video-sharing service, is founded by Jawed Karim, Steve Chen, and Chad Hurley. Later that year, Google acquired the mobile phone operating system Android.

2006 : Apple unveiled the MacBook Pro, making it their first Intel-based, dual-core mobile computer.

That same year at the World Economic Forum in Davos, Switzerland, the United Nations Development Program announced they were creating a program to deliver technology and resources to schools in under-developed countries. The project became the One Laptop per Child Consortium, which was founded by Nicholas Negroponte, the founder of MIT’s Media Lab. By 2011, over 2.4 million laptops had been shipped.

And we can’t forget to mention the launch of Amazon Web Services, including Amazon Elastic Cloud 2 (EC2) and Amazon Simple Storage Service (S3). EC2 made it possible for users to use the cloud to scale server capacity quickly and efficiently. S3 was a cloud-based file hosting service that charged users monthly for the amount of data they stored.

2007 : Apple released the first iPhone, bringing many computer functions to the palm of our hands. It featured a combination of a web browser, a music player, and a cell phone -- all in one. Users could also download additional functionality in the form of “apps”. The full-touchscreen smartphone allowed for GPS navigation, texting, a built-in calendar, a high-definition camera, and weather reports.

Steve Jobs with Original iPhone

Also, in 2007, Amazon released the Kindle, one of the first electronic reading systems to gain a large following among consumers.

And Dropbox was founded by Arash Ferdowsi and Drew Houston as a way for users to have convenient storage and access to their files on a cloud-based service.

2008 : Apple releases the MacBook Air, the first ultra notebook that was a thin and lightweight laptop with a high-capacity battery. To get it to be a smaller size, Apple replaced the traditional hard drive with a solid-state disk, making it the first mass-marketed computer to do so.

2009 : Microsoft launched Windows 7.

2010 : Apple released the iPad, officially breaking into the dormant tablet computer category. This new gadget came with many features the iPhone had, plus a 9-inch screen and minus the phone.

2011 : Google releases the Chromebook, a laptop that runs on Google Chrome OS.

Also in 2011, the Nest Learning Thermostat emerges as one of the first Internet of Things, allowing for remote access to a user’s home thermostat by use of their smartphone or tablet. It also sent monthly power consumption reports to help customers save on energy bills.

In Apple news, co-founder Steve Jobs passed away on October 11. The brand also announced that the iPhone 4S will feature Siri, a voice-activated personal assistant.

2012 : On October 4, Facebook hits 1 billion users, as well as acquires the image-sharing social networking application Instagram.

Also in 2012, the Raspberry Pi, a credit-card-sized single-board computer, is released, weighing only 45 grams.

2014 : The University of Michigan Micro Mote (M3), the smallest computer in the world, is created. Three types were made available, two of which measured either temperature or pressure and one that could take images.

Additionally, the Apple Pay mobile payment system is introduced.

2015 : Apple releases the Apple Watch, which incorporated Apple’s iOS operating system and sensors for environmental and health monitoring. Almost a million units were sold on the day of its release.

This release was followed closely by Microsoft announcing Windows 10.

2016 : The first reprogrammable quantum computer is created.

2019 : Apple announces iPadOS, the iPad's very own operating system, to support the device better as it becomes more like a computer and less like a mobile device. 

2022: Frontier became the first exascale supercomputer, surpassing one exaFLOP. Developed by HPE and using AMD EPYC CPUs and Radeon Instinct GPUs, it cost $600 million and is housed at OLCF, Tennessee, advancing scientific research.

2023:  Microsoft releases ChatGPT-powered Bing to offer a search generative experience and answer maximum search queries of the users. 

2023:  Open AI launched GPT-4 or ChatGPT Plus on March 15, 2023

2023:  The AI PC was introduced in December 2023 with the inclusion of Intel Core Ultra

So, what’s next?

I don’t have the answer to what awaits us in regard to computers. One thing is for sure -- in order to keep up with the world of tech, the growing need for cyber security and data security , and our constant need for the next big thing, computers aren’t going anywhere. If anything, they’re only going to become a bigger part of our daily lives.

Now that you've learned about the history of computers, it's time to protect your online presence and stay one step ahead in the ever-evolving digital landscape with digital security .

Mara Calvello

Mara Calvello is a Content Marketing Manager at G2. She received her Bachelor of Arts degree from Elmhurst College (now Elmhurst University). Mara works on our G2 Tea newsletter, while also writing content to support categories on artificial intelligence, natural language understanding (NLU), AI code generation, synthetic data, and more. In her spare time, she's out exploring with her rescue dog Zeke or enjoying a good book.

Explore More G2 Articles

Artificial intelligence (AI) software

research the history of computer

The history of computing is both evolution and revolution

research the history of computer

Head, Department of Computing & Information Systems, The University of Melbourne

Disclosure statement

Justin Zobel does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

University of Melbourne provides funding as a founding partner of The Conversation AU.

View all partners

This month marks the 60th anniversary of the first computer in an Australian university. The University of Melbourne took possession of the machine from CSIRO and on June 14, 1956, the recommissioned CSIRAC was formally switched on. Six decades on, our series Computing turns 60 looks at how things have changed.

It is a truism that computing continues to change our world. It shapes how objects are designed, what information we receive, how and where we work, and who we meet and do business with. And computing changes our understanding of the world around us and the universe beyond.

For example, while computers were initially used in weather forecasting as no more than an efficient way to assemble observations and do calculations, today our understanding of weather is almost entirely mediated by computational models.

Another example is biology. Where once research was done entirely in the lab (or in the wild) and then captured in a model, it often now begins in a predictive model, which then determines what might be explored in the real world.

The transformation that is due to computation is often described as digital disruption . But an aspect of this transformation that can easily be overlooked is that computing has been disrupting itself.

Evolution and revolution

Each wave of new computational technology has tended to lead to new kinds of systems, new ways of creating tools, new forms of data, and so on, which have often overturned their predecessors. What has seemed to be evolution is, in some ways, a series of revolutions.

But the development of computing technologies is more than a chain of innovation – a process that’s been a hallmark of the physical technologies that shape our world.

For example, there is a chain of inspiration from waterwheel, to steam engine, to internal combustion engine. Underlying this is a process of enablement. The industry of steam engine construction yielded the skills, materials and tools used in construction of the first internal combustion engines.

In computing, something richer is happening where new technologies emerge, not only by replacing predecessors, but also by enveloping them. Computing is creating platforms on which it reinvents itself, reaching up to the next platform.

Getting connected

Arguably, the most dramatic of these innovations is the web. During the 1970s and 1980s, there were independent advances in the availability of cheap, fast computing, of affordable disk storage and of networking.

research the history of computer

Compute and storage were taken up in personal computers, which at that stage were standalone, used almost entirely for gaming and word processing. At the same time, networking technologies became pervasive in university computer science departments, where they enabled, for the first time, the collaborative development of software.

This was the emergence of a culture of open-source development, in which widely spread communities not only used common operating systems, programming languages and tools, but collaboratively contributed to them.

As networks spread, tools developed in one place could be rapidly promoted, shared and deployed elsewhere. This dramatically changed the notion of software ownership, of how software was designed and created, and of who controlled the environments we use.

The networks themselves became more uniform and interlinked, creating the global internet, a digital traffic infrastructure. Increases in computing power meant there was spare capacity for providing services remotely.

The falling cost of disk meant that system administrators could set aside storage to host repositories that could be accessed globally. The internet was thus used not just for email and chat forums (known then as news groups) but, increasingly, as an exchange mechanism for data and code.

This was in strong contrast to the systems used in business at that time, which were customised, isolated, and rigid.

With hindsight, the confluence of networking, compute and storage at the start of the 1990s, coupled with the open-source culture of sharing, seems almost miraculous. An environment ready for something remarkable, but without even a hint of what that thing might be.

The ‘superhighway’

It was to enhance this environment that then US Vice President Al Gore proposed in 1992 the “ information superhighway ”, before any major commercial or social uses of the internet had appeared.

research the history of computer

Meanwhile, in 1990, researchers at CERN, including Tim Berners-Lee , created a system for storing documents and publishing them to the internet, which they called the world wide web .

As knowledge of this system spread on the internet (transmitted by the new model of open-source software systems), people began using it via increasingly sophisticated browsers. They also began to write documents specifically for online publication – that is, web pages.

As web pages became interactive and resources moved online, the web became a platform that has transformed society. But it also transformed computing.

With the emergence of the web came the decline of the importance of the standalone computer, dependent on local storage.

We all connect

The value of these systems is due to another confluence: the arrival on the web of vast numbers of users. For example, without behaviours to learn from, search engines would not work well, so human actions have become part of the system.

There are (contentious) narratives of ever-improving technology, but also an entirely unarguable narrative of computing itself being transformed by becoming so deeply embedded in our daily lives.

This is, in many ways, the essence of big data. Computing is being fed by human data streams: traffic data, airline trips, banking transactions, social media and so on.

The challenges of the discipline have been dramatically changed by this data, and also by the fact that the products of the data (such as traffic control and targeted marketing) have immediate impacts on people.

Software that runs robustly on a single computer is very different from that with a high degree of rapid interaction with the human world, giving rise to needs for new kinds of technologies and experts, in ways not evenly remotely anticipated by the researchers who created the technologies that led to this transformation.

Decisions that were once made by hand-coded algorithms are now made entirely by learning from data. Whole fields of study may become obsolete.

The discipline does indeed disrupt itself. And as the next wave of technology arrives (immersive environments? digital implants? aware homes?), it will happen again.

  • Computer science
  • Computing turns 60

research the history of computer

Director of STEM

research the history of computer

Community member - Training Delivery and Development Committee (Volunteer part-time)

research the history of computer

Chief Executive Officer

research the history of computer

Finance Business Partner

research the history of computer

Head of Evidence to Action

The History of Computers

These Breakthroughs in Mathematics and Science Led to the Computing Age

Clemens Pfeiffer/Wikimedia Commons/CC BY 2.5

  • Famous Inventions
  • Famous Inventors
  • Patents & Trademarks
  • Invention Timelines
  • Computers & The Internet
  • American History
  • African American History
  • African History
  • Ancient History and Culture
  • Asian History
  • European History
  • Latin American History
  • Medieval & Renaissance History
  • Military History
  • The 20th Century
  • Women's History

Before the age of electronics, the closest thing to a computer was the abacus, although, strictly speaking, the abacus is actually a calculator since it requires a human operator. Computers, on the other hand, perform calculations automatically by following a series of built-in commands called software.

In the 20 th century, breakthroughs in technology allowed for the ever-evolving computing machines that we now depend upon so totally, we practically never give them a second thought. But even prior to the advent of microprocessors and supercomputers , there were certain notable scientists and inventors who helped lay the groundwork for the technology that's since drastically reshaped every facet of modern life.

The Language Before the Hardware

The universal language in which computers carry out processor instructions originated in the 17th century in the form of the binary numerical system. Developed by German philosopher and mathematician Gottfried Wilhelm Leibniz , the system came about as a way to represent decimal numbers using only two digits: the number zero and the number one. Leibniz's system was partly inspired by philosophical explanations in the classical Chinese text the “I Ching,” which explained the universe in terms of dualities such as light and darkness and male and female. While there was no practical use for his newly codified system at the time, Leibniz believed that it was possible for a machine to someday make use of these long strings of binary numbers.​

In 1847, English mathematician George Boole introduced a newly devised algebraic language built on Leibniz's work. His “Boolean Algebra” was actually a system of logic, with mathematical equations used to represent statements in logic. Equally important was that it employed a binary approach in which the relationship between different mathematical quantities would be either true or false, 0 or 1. 

As with Leibniz, there were no obvious applications for Boole’s algebra at the time, however, mathematician Charles Sanders Pierce spent decades expanding the system, and in 1886, determined that the calculations could be carried out with electrical switching circuits. As a result, Boolean logic would eventually become instrumental in the design of electronic computers.

The Earliest Processors

English mathematician Charles Babbage is credited with having assembled the first mechanical computers—at least technically speaking. His early 19th-century machines featured a way to input numbers, memory, and a processor, along with a way to output the results. Babbage called his initial attempt to build the world’s first computing machine the “difference engine.” The design called for a machine that calculated values and printed the results automatically onto a table. It was to be hand-cranked and would have weighed four tons. But Babbage's baby was a costly endeavor. More than £17,000 pounds sterling was spent on the difference engine's early development. The project was eventually scrapped after the British government cut off Babbage’s funding in 1842.

This forced Babbage to move on to another idea, an "analytical engine," which was more ambitious in scope than its predecessor and was to be used for general-purpose computing rather than just arithmetic. While he was never able to follow through and build a working device, Babbage’s design featured essentially the same logical structure as electronic computers that would come into use in the 20 th century. The analytical engine had integrated memory—a form of information storage found in all computers—that allows for branching, or the ability for a computer to execute a set of instructions that deviate from the default sequence order, as well as loops, which are sequences of instructions carried out repeatedly in succession. 

Despite his failures to produce a fully functional computing machine, Babbage remained steadfastly undeterred in pursuing his ideas. Between 1847 and 1849, he drew up designs for a new and improved second version of his difference engine. This time, it calculated decimal numbers up to 30 digits long, performed calculations more quickly, and was simplified to require fewer parts. Still, the British government did not feel it was worth their investment. In the end, the most progress Babbage ever made on a prototype was completing one-seventh of his first design.

During this early era of computing, there were a few notable achievements: The tide-predicting machine , invented by Scotch-Irish mathematician, physicist, and engineer Sir William Thomson in 1872, was considered the first modern analog computer. Four years later, his older brother, James Thomson, came up with a concept for a computer that solved mathematical problems known as differential equations. He called his device an “integrating machine” and in later years, it would serve as the foundation for systems known as differential analyzers. In 1927, American scientist Vannevar Bush started development on the first machine to be named as such and published a description of his new invention in a scientific journal in 1931.

Dawn of Modern Computers

Up until the early 20 th century, the evolution of computing was little more than scientists dabbling in the design of machines capable of efficiently performing various kinds of calculations for various purposes. It wasn’t until 1936 that a unified theory on what constitutes a "general-purpose computer" and how it should function was finally put forth. That year, English mathematician Alan Turing published a paper titled, "On Computable Numbers, with an Application to the Entscheidungsproblem," which outlined how a theoretical device called a “Turing machine” could be used to carry out any conceivable mathematical computation by executing instructions. In theory, the machine would have limitless memory, read data, write results, and store a program of instructions.

While Turing’s computer was an abstract concept, it was a German engineer named Konrad Zuse who would go on to build the world’s first programmable computer. His first attempt at developing an electronic computer, the Z1, was a binary-driven calculator that read instructions from punched 35-millimeter film. The technology was unreliable, however, so he followed it up with the Z2, a similar device that used electromechanical relay circuits. While an improvement, it was in assembling his third model that everything came together for Zuse. Unveiled in 1941, the Z3 was faster, more reliable, and better able to perform complicated calculations. The biggest difference in this third incarnation was that the instructions were stored on an external tape, thus allowing it to function as a fully operational program-controlled system. 

What’s perhaps most remarkable is that Zuse did much of his work in isolation. He'd been unaware that the Z3 was "Turing complete," or in other words, capable of solving any computable mathematical problem—at least in theory. Nor did he have any knowledge of similar projects underway around the same time in other parts of the world.

Among the most notable of these was the IBM-funded Harvard Mark I, which debuted in 1944. Even more promising, though, was the development of electronic systems such as Great Britain’s 1943 computing prototype Colossus and the ENIAC , the first fully-operational electronic general-purpose computer that was put into service at the University of Pennsylvania in 1946.

Out of the ENIAC project came the next big leap in computing technology. John Von Neumann, a Hungarian mathematician who'd consulted on ENIAC project, would lay the groundwork for a stored program computer. Up to this point, computers operated on fixed programs and altering their function—for example, from performing calculations to word processing. This required the time-consuming process of having to manually rewire and restructure them. (It took several days to reprogram ENIAC.) Turing had proposed that ideally, having a program stored in the memory would allow the computer to modify itself at a much faster pace. Von Neumann was intrigued by the concept and in 1945 drafted a report that provided in detail a feasible architecture for stored program computing.   

His published paper would be widely circulated among competing teams of researchers working on various computer designs. In 1948, a group in England introduced the Manchester Small-Scale Experimental Machine, the first computer to run a stored program based on the Von Neumann architecture. Nicknamed “Baby,” the Manchester Machine was an experimental computer that served as the predecessor to the Manchester Mark I . The EDVAC, the computer design for which Von Neumann’s report was originally intended, wasn’t completed until 1949.

Transitioning Toward Transistors

The first modern computers were nothing like the commercial products used by consumers today. They were elaborate hulking contraptions that often took up the space of an entire room. They also sucked enormous amounts of energy and were notoriously buggy. And since these early computers ran on bulky vacuum tubes, scientists hoping to improve processing speeds would either have to find bigger rooms—or come up with an alternative.

Fortunately, that much-needed breakthrough was already in the works. In 1947, a group of scientists at Bell Telephone Laboratories developed a new technology called point-contact transistors. Like vacuum tubes, transistors amplify electrical current and can be used as switches. More importantly, they were much smaller (about the size of an aspirin capsule), more reliable, and they used much less power overall. The co-inventors John Bardeen, Walter Brattain, and William Shockley would eventually be awarded the Nobel Prize in physics in 1956.

While Bardeen and Brattain continued doing research work, Shockley moved to further develop and commercialize transistor technology. One of the first hires at his newly founded company was an electrical engineer named Robert Noyce, who eventually split off and formed his own firm, Fairchild Semiconductor, a division of Fairchild Camera and Instrument. At the time, Noyce was looking into ways to seamlessly combine the transistor and other components into one integrated circuit to eliminate the process in which they had to be pieced together by hand. Thinking along similar lines, Jack Kilby , an engineer at Texas Instruments, ended up filing a patent first. It was Noyce’s design, however, that would be widely adopted.

Where integrated circuits had the most significant impact was in paving the way for the new era of personal computing. Over time, it opened up the possibility of running processes powered by millions of circuits—all on a microchip the size of a postage stamp. In essence, it’s what has enabled the ubiquitous handheld gadgets we use every day, that are ironically, much more powerful than the earliest computers that took up entire rooms. 

  • The History of Laptop Computers
  • The History of the Transistor
  • The History of the UNIVAC Computer
  • The Short but Interesting History of the iPod
  • History of Computer Printers
  • Who Invented the Computer Mouse?
  • ARPAnet: The World's First Internet
  • Who Actually Invented the Macintosh Computer?
  • The First Typewriters
  • The History of Spacewar: The First Computer Game
  • The History of the BASIC Programming Language
  • Who Created Wi-Fi, the Wireless Internet Connection?
  • History of the Atari Video System
  • The Most Impactful Inventions of the Last 300 Years
  • Important Innovations and Inventions, Past and Present
  • History of the Wheelchair

research the history of computer

  • History Classics
  • Your Profile
  • Find History on Facebook (Opens in a new window)
  • Find History on Twitter (Opens in a new window)
  • Find History on YouTube (Opens in a new window)
  • Find History on Instagram (Opens in a new window)
  • Find History on TikTok (Opens in a new window)
  • This Day In History
  • History Podcasts
  • History Vault

Invention of the PC

By: History.com Editors

Updated: March 28, 2023 | Original: May 11, 2011

black pc keyboard, keyboard is very useful tool for personal computer, it is necessary to write words

Today’s personal computers are drastically different from the massive, hulking machines that emerged out of World War II—and the difference isn’t only in their size. By the 1970s, technology had evolved to the point that individuals—mostly hobbyists and electronics buffs—could purchase unassembled PCs or “microcomputers” and program them for fun, but these early PCs could not perform many of the useful tasks that today’s computers can. Users could do mathematical calculations and play simple games, but most of the machines’ appeal lay in their novelty. Today, hundreds of companies sell personal computers, accessories and sophisticated software and games, and PCs are used for a wide range of functions from basic word processing to editing photos to managing budgets. At home and at work, we use our PCs to do almost everything. It is nearly impossible to imagine modern life without them.

Invention of the PC: The Computer Age

The earliest electronic computers were not “personal” in any way: They were enormous and hugely expensive, and they required a team of engineers and other specialists to keep them running. One of the first and most famous of these, the Electronic Numerical Integrator Analyzer and Computer (ENIAC), was built at the University of Pennsylvania to do ballistics calculations for the U.S. military during World War II . ENIAC cost $500,000, weighed 30 tons and took up nearly 2,000 square feet of floor space. On the outside, ENIAC was covered in a tangle of cables, hundreds of blinking lights and nearly 6,000 mechanical switches that its operators used to tell it what to do. On the inside, almost 18,000 vacuum tubes carried electrical signals from one part of the machine to another.

Did you know? Time magazine named the personal computer its 1982 "Man of the Year."

Invention of the PC: Postwar Innovations

ENIAC and other early computers proved to many universities and corporations that the machines were worth the tremendous investment of money, space and manpower they demanded. (For example, ENIAC could solve in 30 seconds a missile-trajectory problem that could take a team of human “computers” 12 hours to complete.) At the same time, new technologies were making it possible to build computers that were smaller and more streamlined. In 1948, Bell Labs introduced the transistor, an electronic device that carried and amplified electrical current but was much smaller than the cumbersome vacuum tube. Ten years later, scientists at Texas Instruments and Fairchild Semiconductor came up with the integrated circuit, an invention that incorporated all of the computer’s electrical parts–transistors, capacitors, resistors and diodes–into a single silicon chip.

But one of the most significant inventions that paved the way for the PC revolution was the microprocessor. Before microprocessors were invented, computers needed a separate integrated-circuit chip for each one of their functions. (This was one reason the machines were still so large.) Microprocessors were the size of a thumbnail, and they could do things the integrated-circuit chips could not: They could run the computer’s programs, remember information and manage data all by themselves.

The first microprocessor on the market was developed in 1971 by an engineer at Intel named Ted Hoff. (Intel was located in California’s Santa Clara Valley, a place nicknamed “Silicon Valley” because of all the high-tech companies clustered around the Stanford Industrial Park there.) Intel’s first microprocessor, a 1/16-by-1/8-inch chip called the 4004, had the same computing power as the massive ENIAC.

The Invention of the PC

These innovations made it cheaper and easier to manufacture computers than ever before. As a result, the small, relatively inexpensive “microcomputer”–soon known as the “personal computer”–was born. In 1974, for instance, a company called Micro Instrumentation and Telemetry Systems (MITS) introduced a mail-order build-it-yourself computer kit called the Altair. Compared to earlier microcomputers, the Altair was a huge success: Thousands of people bought the $400 kit. However, it really did not do much. It had no keyboard and no screen, and its output was just a bank of flashing lights. Users input data by flipping toggle switches.

In 1975, MITS hired a pair of Harvard students named Paul G. Allen and Bill Gates to adapt the BASIC programming language for the Altair. The software made the computer easier to use, and it was a hit. In April 1975 the two young programmers took the money they made from “Altair BASIC” and formed a company of their own—Microsoft—that soon became an empire.

The year after Gates and Allen started Microsoft, two engineers in the Homebrew Computer Club in Silicon Valley named Steve Jobs and Stephen Wozniak built a homemade computer that would likewise change the world. This computer, called the Apple I, was more sophisticated than the Altair: It had more memory, a cheaper microprocessor and a monitor with a screen. In April 1977, Jobs and Wozniak introduced the Apple II, which had a keyboard and a color screen. Also, users could store their data on an external cassette tape. (Apple soon swapped those tapes for floppy disks.) To make the Apple II as useful as possible, the company encouraged programmers to create “applications” for it. For example, a spreadsheet program called VisiCalc made Apple a practical tool for all kinds of people (and businesses)–not just hobbyists.

The PC Revolution

The PC revolution had begun. Soon companies like Xerox, Tandy, Commodore and IBM entered the market, and computers became ubiquitous in offices and eventually homes. Innovations like the “Graphical User Interface,” which allows users to select icons on the computer screen instead of writing complicated commands, and the computer mouse made PCs even more convenient and user-friendly. Today, laptops, smartphones and tablet computers allow us to have a PC with us wherever we go.

research the history of computer

Sign up for Inside History

Get HISTORY’s most fascinating stories delivered to your inbox three times a week.

By submitting your information, you agree to receive emails from HISTORY and A+E Networks. You can opt out at any time. You must be 16 years or older and a resident of the United States.

More details : Privacy Notice | Terms of Use | Contact Us

History of Computing

  • First Online: 18 April 2023

Cite this chapter

research the history of computer

  • Joseph Migga Kizza 4  

Part of the book series: Texts in Computer Science ((TCS))

627 Accesses

This chapter gives an overview of the history of computing science in hardware, software, and networking, covering prehistoric (prior to 1946) computing devices and computing pioneers since the Abacus . The emergency of social and ethical problems in computing is discussed via the history of computer crimes which started with the invention of the computer virus. We also discuss the growth of computing technologies like the Internet, the Web, and advanced mobile computing technologies and the rise of computer crimes. Finally we introduce the need for computer ethics education as one of the solutions to the growing threat of a cyberspace attack.

This is a preview of subscription content, log in via an institution to check access.

Access this chapter

Subscribe and save.

  • Get 10 units per month
  • Download Article/Chapter or eBook
  • 1 Unit = 1 Article or 1 Chapter
  • Cancel anytime
  • Available as PDF
  • Read on any device
  • Instant download
  • Own it forever
  • Available as EPUB and PDF
  • Compact, lightweight edition
  • Dispatched in 3 to 5 business days
  • Free shipping worldwide - see info
  • Durable hardcover edition

Tax calculation will be finalised at checkout

Purchases are for personal use only

Institutional subscriptions

Calvello M. A complete history of computers: from 1800s to now. https://www.g2.com/articles/history-of-computers#computers-in-the-1800s

Baron RJ, Heigbie L (1992) Computer architecture. Addison-Wesley, Reading

Google Scholar  

Mackenzie I. The man who invented the microprocessor. http://www.bbc.com/news/technology-13260039

Miller MJ (2011) The rise of DOS: how microsoft got the IBM PC OS contract. PC. http://forwardthinking.pcmag.com/software/286148-the-rise-of-dos-how-microsoft-got-the-ibm-pc-os-contract

Sterling B. Short history of the internet. Internet Society. http://www.internetsociety.org/internet/what-internet/history-internet/short-history-internet

Gribble C (2001) History of the web beginning at CERN. Hitmill

Kizza JM (1998) Civilizing the internet: global concerns and efforts towards regulation. McFarland, Jefferson

Forch K (1994) Computer security management. Boyd & Fraser, Danvers

Carnegie Mellon University, Software Engineering Institute. http://www.cert.org/stats/cert_stats.html#incidents

Kizza JM (2001) Computer network security and cyber ethics. McFarland, Jefferson

Bynum TW (ed) (1985) Computers & ethics. Basil Blackwell, New York

Maner W (1996) Is computer ethics unique? Sci Eng Ethics 2(2):137–154

Article   Google Scholar  

Download references

Author information

Authors and affiliations.

Department of Computer Science and Engineering, University of Tennessee at Chattanooga, Chattanooga, TN, USA

Joseph Migga Kizza

You can also search for this author in PubMed   Google Scholar

Rights and permissions

Reprints and permissions

Copyright information

© 2023 Springer Nature Switzerland AG

About this chapter

Kizza, J.M. (2023). History of Computing. In: Ethical and Social Issues in the Information Age. Texts in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-031-24863-4_1

Download citation

DOI : https://doi.org/10.1007/978-3-031-24863-4_1

Published : 18 April 2023

Publisher Name : Springer, Cham

Print ISBN : 978-3-031-24862-7

Online ISBN : 978-3-031-24863-4

eBook Packages : Computer Science Computer Science (R0)

Share this chapter

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Publish with us

Policies and ethics

  • Find a journal
  • Track your research
  • Search Menu

Sign in through your institution

  • Browse content in Arts and Humanities
  • Browse content in Archaeology
  • Anglo-Saxon and Medieval Archaeology
  • Archaeological Methodology and Techniques
  • Archaeology by Region
  • Archaeology of Religion
  • Archaeology of Trade and Exchange
  • Biblical Archaeology
  • Contemporary and Public Archaeology
  • Environmental Archaeology
  • Historical Archaeology
  • History and Theory of Archaeology
  • Industrial Archaeology
  • Landscape Archaeology
  • Mortuary Archaeology
  • Prehistoric Archaeology
  • Underwater Archaeology
  • Zooarchaeology
  • Browse content in Architecture
  • Architectural Structure and Design
  • History of Architecture
  • Residential and Domestic Buildings
  • Theory of Architecture
  • Browse content in Art
  • Art Subjects and Themes
  • History of Art
  • Industrial and Commercial Art
  • Theory of Art
  • Biographical Studies
  • Byzantine Studies
  • Browse content in Classical Studies
  • Classical Numismatics
  • Classical Literature
  • Classical Reception
  • Classical History
  • Classical Philosophy
  • Classical Mythology
  • Classical Art and Architecture
  • Classical Oratory and Rhetoric
  • Greek and Roman Archaeology
  • Greek and Roman Epigraphy
  • Greek and Roman Law
  • Greek and Roman Papyrology
  • Late Antiquity
  • Religion in the Ancient World
  • Social History
  • Digital Humanities
  • Browse content in History
  • Colonialism and Imperialism
  • Diplomatic History
  • Environmental History
  • Genealogy, Heraldry, Names, and Honours
  • Genocide and Ethnic Cleansing
  • Historical Geography
  • History by Period
  • History of Agriculture
  • History of Education
  • History of Emotions
  • History of Gender and Sexuality
  • Industrial History
  • Intellectual History
  • International History
  • Labour History
  • Legal and Constitutional History
  • Local and Family History
  • Maritime History
  • Military History
  • National Liberation and Post-Colonialism
  • Oral History
  • Political History
  • Public History
  • Regional and National History
  • Revolutions and Rebellions
  • Slavery and Abolition of Slavery
  • Social and Cultural History
  • Theory, Methods, and Historiography
  • Urban History
  • World History
  • Browse content in Language Teaching and Learning
  • Language Learning (Specific Skills)
  • Language Teaching Theory and Methods
  • Browse content in Linguistics
  • Applied Linguistics
  • Cognitive Linguistics
  • Computational Linguistics
  • Forensic Linguistics
  • Grammar, Syntax and Morphology
  • Historical and Diachronic Linguistics
  • History of English
  • Language Variation
  • Language Families
  • Language Acquisition
  • Language Evolution
  • Language Reference
  • Lexicography
  • Linguistic Theories
  • Linguistic Typology
  • Linguistic Anthropology
  • Phonetics and Phonology
  • Psycholinguistics
  • Sociolinguistics
  • Translation and Interpretation
  • Writing Systems
  • Browse content in Literature
  • Bibliography
  • Children's Literature Studies
  • Literary Studies (Modernism)
  • Literary Studies (Asian)
  • Literary Studies (European)
  • Literary Studies (Eco-criticism)
  • Literary Studies (Romanticism)
  • Literary Studies (American)
  • Literary Studies - World
  • Literary Studies (1500 to 1800)
  • Literary Studies (19th Century)
  • Literary Studies (20th Century onwards)
  • Literary Studies (African American Literature)
  • Literary Studies (British and Irish)
  • Literary Studies (Early and Medieval)
  • Literary Studies (Fiction, Novelists, and Prose Writers)
  • Literary Studies (Gender Studies)
  • Literary Studies (Graphic Novels)
  • Literary Studies (History of the Book)
  • Literary Studies (Plays and Playwrights)
  • Literary Studies (Poetry and Poets)
  • Literary Studies (Postcolonial Literature)
  • Literary Studies (Queer Studies)
  • Literary Studies (Science Fiction)
  • Literary Studies (Travel Literature)
  • Literary Studies (War Literature)
  • Literary Studies (Women's Writing)
  • Literary Theory and Cultural Studies
  • Mythology and Folklore
  • Shakespeare Studies and Criticism
  • Browse content in Media Studies
  • Browse content in Music
  • Applied Music
  • Dance and Music
  • Ethics in Music
  • Ethnomusicology
  • Gender and Sexuality in Music
  • Medicine and Music
  • Music Cultures
  • Music and Culture
  • Music and Religion
  • Music and Media
  • Music Education and Pedagogy
  • Music Theory and Analysis
  • Musical Scores, Lyrics, and Libretti
  • Musical Structures, Styles, and Techniques
  • Musicology and Music History
  • Performance Practice and Studies
  • Race and Ethnicity in Music
  • Sound Studies
  • Browse content in Performing Arts
  • Browse content in Philosophy
  • Aesthetics and Philosophy of Art
  • Epistemology
  • Feminist Philosophy
  • History of Western Philosophy
  • Metaphysics
  • Moral Philosophy
  • Non-Western Philosophy
  • Philosophy of Action
  • Philosophy of Law
  • Philosophy of Religion
  • Philosophy of Science
  • Philosophy of Language
  • Philosophy of Mind
  • Philosophy of Perception
  • Philosophy of Mathematics and Logic
  • Practical Ethics
  • Social and Political Philosophy
  • Browse content in Religion
  • Biblical Studies
  • Christianity
  • East Asian Religions
  • History of Religion
  • Judaism and Jewish Studies
  • Qumran Studies
  • Religion and Education
  • Religion and Health
  • Religion and Politics
  • Religion and Science
  • Religion and Law
  • Religion and Art, Literature, and Music
  • Religious Studies
  • Browse content in Society and Culture
  • Cookery, Food, and Drink
  • Cultural Studies
  • Customs and Traditions
  • Ethical Issues and Debates
  • Hobbies, Games, Arts and Crafts
  • Natural world, Country Life, and Pets
  • Popular Beliefs and Controversial Knowledge
  • Sports and Outdoor Recreation
  • Technology and Society
  • Travel and Holiday
  • Visual Culture
  • Browse content in Law
  • Arbitration
  • Browse content in Company and Commercial Law
  • Commercial Law
  • Company Law
  • Browse content in Comparative Law
  • Systems of Law
  • Competition Law
  • Browse content in Constitutional and Administrative Law
  • Government Powers
  • Judicial Review
  • Local Government Law
  • Military and Defence Law
  • Parliamentary and Legislative Practice
  • Construction Law
  • Contract Law
  • Browse content in Criminal Law
  • Criminal Procedure
  • Criminal Evidence Law
  • Sentencing and Punishment
  • Employment and Labour Law
  • Environment and Energy Law
  • Browse content in Financial Law
  • Banking Law
  • Insolvency Law
  • History of Law
  • Human Rights and Immigration
  • Intellectual Property Law
  • Browse content in International Law
  • Private International Law and Conflict of Laws
  • Public International Law
  • IT and Communications Law
  • Jurisprudence and Philosophy of Law
  • Law and Society
  • Law and Politics
  • Browse content in Legal System and Practice
  • Courts and Procedure
  • Legal Skills and Practice
  • Legal System - Costs and Funding
  • Primary Sources of Law
  • Regulation of Legal Profession
  • Medical and Healthcare Law
  • Browse content in Policing
  • Criminal Investigation and Detection
  • Police and Security Services
  • Police Procedure and Law
  • Police Regional Planning
  • Browse content in Property Law
  • Personal Property Law
  • Restitution
  • Study and Revision
  • Terrorism and National Security Law
  • Browse content in Trusts Law
  • Wills and Probate or Succession
  • Browse content in Medicine and Health
  • Browse content in Allied Health Professions
  • Arts Therapies
  • Clinical Science
  • Dietetics and Nutrition
  • Occupational Therapy
  • Operating Department Practice
  • Physiotherapy
  • Radiography
  • Speech and Language Therapy
  • Browse content in Anaesthetics
  • General Anaesthesia
  • Browse content in Clinical Medicine
  • Acute Medicine
  • Cardiovascular Medicine
  • Clinical Genetics
  • Clinical Pharmacology and Therapeutics
  • Dermatology
  • Endocrinology and Diabetes
  • Gastroenterology
  • Genito-urinary Medicine
  • Geriatric Medicine
  • Infectious Diseases
  • Medical Oncology
  • Medical Toxicology
  • Pain Medicine
  • Palliative Medicine
  • Rehabilitation Medicine
  • Respiratory Medicine and Pulmonology
  • Rheumatology
  • Sleep Medicine
  • Sports and Exercise Medicine
  • Clinical Neuroscience
  • Community Medical Services
  • Critical Care
  • Emergency Medicine
  • Forensic Medicine
  • Haematology
  • History of Medicine
  • Medical Ethics
  • Browse content in Medical Dentistry
  • Oral and Maxillofacial Surgery
  • Paediatric Dentistry
  • Restorative Dentistry and Orthodontics
  • Surgical Dentistry
  • Browse content in Medical Skills
  • Clinical Skills
  • Communication Skills
  • Nursing Skills
  • Surgical Skills
  • Medical Statistics and Methodology
  • Browse content in Neurology
  • Clinical Neurophysiology
  • Neuropathology
  • Nursing Studies
  • Browse content in Obstetrics and Gynaecology
  • Gynaecology
  • Occupational Medicine
  • Ophthalmology
  • Otolaryngology (ENT)
  • Browse content in Paediatrics
  • Neonatology
  • Browse content in Pathology
  • Chemical Pathology
  • Clinical Cytogenetics and Molecular Genetics
  • Histopathology
  • Medical Microbiology and Virology
  • Patient Education and Information
  • Browse content in Pharmacology
  • Psychopharmacology
  • Browse content in Popular Health
  • Caring for Others
  • Complementary and Alternative Medicine
  • Self-help and Personal Development
  • Browse content in Preclinical Medicine
  • Cell Biology
  • Molecular Biology and Genetics
  • Reproduction, Growth and Development
  • Primary Care
  • Professional Development in Medicine
  • Browse content in Psychiatry
  • Addiction Medicine
  • Child and Adolescent Psychiatry
  • Forensic Psychiatry
  • Learning Disabilities
  • Old Age Psychiatry
  • Psychotherapy
  • Browse content in Public Health and Epidemiology
  • Epidemiology
  • Public Health
  • Browse content in Radiology
  • Clinical Radiology
  • Interventional Radiology
  • Nuclear Medicine
  • Radiation Oncology
  • Reproductive Medicine
  • Browse content in Surgery
  • Cardiothoracic Surgery
  • Gastro-intestinal and Colorectal Surgery
  • General Surgery
  • Neurosurgery
  • Paediatric Surgery
  • Peri-operative Care
  • Plastic and Reconstructive Surgery
  • Surgical Oncology
  • Transplant Surgery
  • Trauma and Orthopaedic Surgery
  • Vascular Surgery
  • Browse content in Science and Mathematics
  • Browse content in Biological Sciences
  • Aquatic Biology
  • Biochemistry
  • Bioinformatics and Computational Biology
  • Developmental Biology
  • Ecology and Conservation
  • Evolutionary Biology
  • Genetics and Genomics
  • Microbiology
  • Molecular and Cell Biology
  • Natural History
  • Plant Sciences and Forestry
  • Research Methods in Life Sciences
  • Structural Biology
  • Systems Biology
  • Zoology and Animal Sciences
  • Browse content in Chemistry
  • Analytical Chemistry
  • Computational Chemistry
  • Crystallography
  • Environmental Chemistry
  • Industrial Chemistry
  • Inorganic Chemistry
  • Materials Chemistry
  • Medicinal Chemistry
  • Mineralogy and Gems
  • Organic Chemistry
  • Physical Chemistry
  • Polymer Chemistry
  • Study and Communication Skills in Chemistry
  • Theoretical Chemistry
  • Browse content in Computer Science
  • Artificial Intelligence
  • Computer Architecture and Logic Design
  • Game Studies
  • Human-Computer Interaction
  • Mathematical Theory of Computation
  • Programming Languages
  • Software Engineering
  • Systems Analysis and Design
  • Virtual Reality
  • Browse content in Computing
  • Business Applications
  • Computer Games
  • Computer Security
  • Computer Networking and Communications
  • Digital Lifestyle
  • Graphical and Digital Media Applications
  • Operating Systems
  • Browse content in Earth Sciences and Geography
  • Atmospheric Sciences
  • Environmental Geography
  • Geology and the Lithosphere
  • Maps and Map-making
  • Meteorology and Climatology
  • Oceanography and Hydrology
  • Palaeontology
  • Physical Geography and Topography
  • Regional Geography
  • Soil Science
  • Urban Geography
  • Browse content in Engineering and Technology
  • Agriculture and Farming
  • Biological Engineering
  • Civil Engineering, Surveying, and Building
  • Electronics and Communications Engineering
  • Energy Technology
  • Engineering (General)
  • Environmental Science, Engineering, and Technology
  • History of Engineering and Technology
  • Mechanical Engineering and Materials
  • Technology of Industrial Chemistry
  • Transport Technology and Trades
  • Browse content in Environmental Science
  • Applied Ecology (Environmental Science)
  • Conservation of the Environment (Environmental Science)
  • Environmental Sustainability
  • Environmentalist Thought and Ideology (Environmental Science)
  • Management of Land and Natural Resources (Environmental Science)
  • Natural Disasters (Environmental Science)
  • Nuclear Issues (Environmental Science)
  • Pollution and Threats to the Environment (Environmental Science)
  • Social Impact of Environmental Issues (Environmental Science)
  • History of Science and Technology
  • Browse content in Materials Science
  • Ceramics and Glasses
  • Composite Materials
  • Metals, Alloying, and Corrosion
  • Nanotechnology
  • Browse content in Mathematics
  • Applied Mathematics
  • Biomathematics and Statistics
  • History of Mathematics
  • Mathematical Education
  • Mathematical Finance
  • Mathematical Analysis
  • Numerical and Computational Mathematics
  • Probability and Statistics
  • Pure Mathematics
  • Browse content in Neuroscience
  • Cognition and Behavioural Neuroscience
  • Development of the Nervous System
  • Disorders of the Nervous System
  • History of Neuroscience
  • Invertebrate Neurobiology
  • Molecular and Cellular Systems
  • Neuroendocrinology and Autonomic Nervous System
  • Neuroscientific Techniques
  • Sensory and Motor Systems
  • Browse content in Physics
  • Astronomy and Astrophysics
  • Atomic, Molecular, and Optical Physics
  • Biological and Medical Physics
  • Classical Mechanics
  • Computational Physics
  • Condensed Matter Physics
  • Electromagnetism, Optics, and Acoustics
  • History of Physics
  • Mathematical and Statistical Physics
  • Measurement Science
  • Nuclear Physics
  • Particles and Fields
  • Plasma Physics
  • Quantum Physics
  • Relativity and Gravitation
  • Semiconductor and Mesoscopic Physics
  • Browse content in Psychology
  • Affective Sciences
  • Clinical Psychology
  • Cognitive Neuroscience
  • Cognitive Psychology
  • Criminal and Forensic Psychology
  • Developmental Psychology
  • Educational Psychology
  • Evolutionary Psychology
  • Health Psychology
  • History and Systems in Psychology
  • Music Psychology
  • Neuropsychology
  • Organizational Psychology
  • Psychological Assessment and Testing
  • Psychology of Human-Technology Interaction
  • Psychology Professional Development and Training
  • Research Methods in Psychology
  • Social Psychology
  • Browse content in Social Sciences
  • Browse content in Anthropology
  • Anthropology of Religion
  • Human Evolution
  • Medical Anthropology
  • Physical Anthropology
  • Regional Anthropology
  • Social and Cultural Anthropology
  • Theory and Practice of Anthropology
  • Browse content in Business and Management
  • Business History
  • Business Strategy
  • Business Ethics
  • Business and Government
  • Business and Technology
  • Business and the Environment
  • Comparative Management
  • Corporate Governance
  • Corporate Social Responsibility
  • Entrepreneurship
  • Health Management
  • Human Resource Management
  • Industrial and Employment Relations
  • Industry Studies
  • Information and Communication Technologies
  • International Business
  • Knowledge Management
  • Management and Management Techniques
  • Operations Management
  • Organizational Theory and Behaviour
  • Pensions and Pension Management
  • Public and Nonprofit Management
  • Social Issues in Business and Management
  • Strategic Management
  • Supply Chain Management
  • Browse content in Criminology and Criminal Justice
  • Criminal Justice
  • Criminology
  • Forms of Crime
  • International and Comparative Criminology
  • Youth Violence and Juvenile Justice
  • Development Studies
  • Browse content in Economics
  • Agricultural, Environmental, and Natural Resource Economics
  • Asian Economics
  • Behavioural Finance
  • Behavioural Economics and Neuroeconomics
  • Econometrics and Mathematical Economics
  • Economic Methodology
  • Economic Systems
  • Economic History
  • Economic Development and Growth
  • Financial Markets
  • Financial Institutions and Services
  • General Economics and Teaching
  • Health, Education, and Welfare
  • History of Economic Thought
  • International Economics
  • Labour and Demographic Economics
  • Law and Economics
  • Macroeconomics and Monetary Economics
  • Microeconomics
  • Public Economics
  • Urban, Rural, and Regional Economics
  • Welfare Economics
  • Browse content in Education
  • Adult Education and Continuous Learning
  • Care and Counselling of Students
  • Early Childhood and Elementary Education
  • Educational Equipment and Technology
  • Educational Strategies and Policy
  • Higher and Further Education
  • Organization and Management of Education
  • Philosophy and Theory of Education
  • Schools Studies
  • Secondary Education
  • Teaching of a Specific Subject
  • Teaching of Specific Groups and Special Educational Needs
  • Teaching Skills and Techniques
  • Browse content in Environment
  • Applied Ecology (Social Science)
  • Climate Change
  • Conservation of the Environment (Social Science)
  • Environmentalist Thought and Ideology (Social Science)
  • Management of Land and Natural Resources (Social Science)
  • Natural Disasters (Environment)
  • Pollution and Threats to the Environment (Social Science)
  • Social Impact of Environmental Issues (Social Science)
  • Sustainability
  • Browse content in Human Geography
  • Cultural Geography
  • Economic Geography
  • Political Geography
  • Browse content in Interdisciplinary Studies
  • Communication Studies
  • Museums, Libraries, and Information Sciences
  • Browse content in Politics
  • African Politics
  • Asian Politics
  • Chinese Politics
  • Comparative Politics
  • Conflict Politics
  • Elections and Electoral Studies
  • Environmental Politics
  • Ethnic Politics
  • European Union
  • Foreign Policy
  • Gender and Politics
  • Human Rights and Politics
  • Indian Politics
  • International Relations
  • International Organization (Politics)
  • Irish Politics
  • Latin American Politics
  • Middle Eastern Politics
  • Political Theory
  • Political Methodology
  • Political Communication
  • Political Philosophy
  • Political Sociology
  • Political Behaviour
  • Political Economy
  • Political Institutions
  • Politics and Law
  • Politics of Development
  • Public Administration
  • Public Policy
  • Qualitative Political Methodology
  • Quantitative Political Methodology
  • Regional Political Studies
  • Russian Politics
  • Security Studies
  • State and Local Government
  • UK Politics
  • US Politics
  • Browse content in Regional and Area Studies
  • African Studies
  • Asian Studies
  • East Asian Studies
  • Japanese Studies
  • Latin American Studies
  • Middle Eastern Studies
  • Native American Studies
  • Scottish Studies
  • Browse content in Research and Information
  • Research Methods
  • Browse content in Social Work
  • Addictions and Substance Misuse
  • Adoption and Fostering
  • Care of the Elderly
  • Child and Adolescent Social Work
  • Couple and Family Social Work
  • Direct Practice and Clinical Social Work
  • Emergency Services
  • Human Behaviour and the Social Environment
  • International and Global Issues in Social Work
  • Mental and Behavioural Health
  • Social Justice and Human Rights
  • Social Policy and Advocacy
  • Social Work and Crime and Justice
  • Social Work Macro Practice
  • Social Work Practice Settings
  • Social Work Research and Evidence-based Practice
  • Welfare and Benefit Systems
  • Browse content in Sociology
  • Childhood Studies
  • Community Development
  • Comparative and Historical Sociology
  • Disability Studies
  • Economic Sociology
  • Gender and Sexuality
  • Gerontology and Ageing
  • Health, Illness, and Medicine
  • Marriage and the Family
  • Migration Studies
  • Occupations, Professions, and Work
  • Organizations
  • Population and Demography
  • Race and Ethnicity
  • Social Theory
  • Social Movements and Social Change
  • Social Research and Statistics
  • Social Stratification, Inequality, and Mobility
  • Sociology of Religion
  • Sociology of Education
  • Sport and Leisure
  • Urban and Rural Studies
  • Browse content in Warfare and Defence
  • Defence Strategy, Planning, and Research
  • Land Forces and Warfare
  • Military Administration
  • Military Life and Institutions
  • Naval Forces and Warfare
  • Other Warfare and Defence Issues
  • Peace Studies and Conflict Resolution
  • Weapons and Equipment

The History of Computing: A Very Short Introduction

The History of Computing: A Very Short Introduction

Author webpage

  • Cite Icon Cite
  • Permissions Icon Permissions

This book describes the central events, machines, and people in the history of computing, and traces how innovation has brought us from pebbles used for counting, to the modern age of the computer. It has a strong historiographical theme that offers a new perspective on how to understand the historical narratives we have constructed, and examines the unspoken assumptions that underpin them. It describes inventions, pioneers, milestone systems, and the context of their use. It starts with counting, and traces change through calculating aids, mechanical calculation, and automatic electronic computation, both digital and analogue. It shows how four threads—calculation, automatic computing, information management, and communications—converged to create the ‘information age’. It examines three master narratives in established histories that are used as aids to marshal otherwise unmanageable levels detail. The treatment is rooted in the principal episodes that make up canonical histories of computing.

Personal account

  • Sign in with email/username & password
  • Get email alerts
  • Save searches
  • Purchase content
  • Activate your purchase/trial code
  • Add your ORCID iD

Institutional access

Sign in with a library card.

  • Sign in with username/password
  • Recommend to your librarian
  • Institutional account management
  • Get help with access

Access to content on Oxford Academic is often provided through institutional subscriptions and purchases. If you are a member of an institution with an active account, you may be able to access content in one of the following ways:

IP based access

Typically, access is provided across an institutional network to a range of IP addresses. This authentication occurs automatically, and it is not possible to sign out of an IP authenticated account.

Choose this option to get remote access when outside your institution. Shibboleth/Open Athens technology is used to provide single sign-on between your institution’s website and Oxford Academic.

  • Click Sign in through your institution.
  • Select your institution from the list provided, which will take you to your institution's website to sign in.
  • When on the institution site, please use the credentials provided by your institution. Do not use an Oxford Academic personal account.
  • Following successful sign in, you will be returned to Oxford Academic.

If your institution is not listed or you cannot sign in to your institution’s website, please contact your librarian or administrator.

Enter your library card number to sign in. If you cannot sign in, please contact your librarian.

Society Members

Society member access to a journal is achieved in one of the following ways:

Sign in through society site

Many societies offer single sign-on between the society website and Oxford Academic. If you see ‘Sign in through society site’ in the sign in pane within a journal:

  • Click Sign in through society site.
  • When on the society site, please use the credentials provided by that society. Do not use an Oxford Academic personal account.

If you do not have a society account or have forgotten your username or password, please contact your society.

Sign in using a personal account

Some societies use Oxford Academic personal accounts to provide access to their members. See below.

A personal account can be used to get email alerts, save searches, purchase content, and activate subscriptions.

Some societies use Oxford Academic personal accounts to provide access to their members.

Viewing your signed in accounts

Click the account icon in the top right to:

  • View your signed in personal account and access account management features.
  • View the institutional accounts that are providing access.

Signed in but can't access content

Oxford Academic is home to a wide variety of products. The institutional subscription may not cover the content that you are trying to access. If you believe you should have access to that content, please contact your librarian.

For librarians and administrators, your personal account also provides access to institutional account management. Here you will find options to view and activate subscriptions, manage institutional settings and access options, access usage statistics, and more.

Our books are available by subscription or purchase to libraries and institutions.

Month: Total Views:
October 2022 1
October 2022 1
October 2022 6
October 2022 1
October 2022 1
November 2022 7
November 2022 1
November 2022 2
November 2022 3
November 2022 2
November 2022 2
November 2022 2
December 2022 1
December 2022 6
December 2022 1
December 2022 2
December 2022 4
December 2022 1
December 2022 2
December 2022 1
January 2023 1
January 2023 1
January 2023 2
January 2023 1
January 2023 4
January 2023 4
February 2023 4
February 2023 2
February 2023 5
February 2023 12
February 2023 1
February 2023 6
February 2023 1
March 2023 15
March 2023 5
March 2023 6
March 2023 6
March 2023 10
March 2023 6
March 2023 5
March 2023 1
April 2023 1
April 2023 1
April 2023 5
May 2023 4
May 2023 1
May 2023 1
May 2023 2
May 2023 3
May 2023 1
May 2023 1
June 2023 1
June 2023 4
June 2023 1
June 2023 1
July 2023 1
July 2023 6
July 2023 1
July 2023 4
July 2023 3
July 2023 2
July 2023 1
July 2023 2
August 2023 8
August 2023 2
August 2023 3
August 2023 5
August 2023 1
August 2023 11
August 2023 4
August 2023 2
August 2023 8
August 2023 4
September 2023 3
September 2023 1
September 2023 1
October 2023 1
October 2023 2
October 2023 4
November 2023 2
November 2023 3
November 2023 1
November 2023 4
December 2023 3
December 2023 1
December 2023 2
December 2023 1
December 2023 6
December 2023 1
December 2023 1
December 2023 1
December 2023 2
December 2023 2
December 2023 2
December 2023 1
January 2024 1
January 2024 1
January 2024 1
January 2024 3
January 2024 4
January 2024 2
January 2024 2
January 2024 3
January 2024 2
February 2024 5
February 2024 1
February 2024 4
February 2024 3
February 2024 3
February 2024 4
February 2024 6
February 2024 4
February 2024 5
February 2024 1
February 2024 1
March 2024 2
March 2024 1
March 2024 2
March 2024 1
March 2024 4
March 2024 2
March 2024 6
March 2024 2
April 2024 1
April 2024 4
April 2024 6
April 2024 6
April 2024 4
April 2024 2
April 2024 3
April 2024 4
April 2024 1
April 2024 2
May 2024 1
May 2024 3
May 2024 2
May 2024 1
May 2024 8
May 2024 4
May 2024 1
June 2024 5
June 2024 2
June 2024 4
June 2024 5
June 2024 4
June 2024 4
June 2024 8
June 2024 10
June 2024 5
June 2024 4
June 2024 7
June 2024 3
July 2024 2
July 2024 3
July 2024 5
July 2024 4
July 2024 2
July 2024 5
July 2024 2
July 2024 4
August 2024 2
August 2024 1
August 2024 2
August 2024 1
August 2024 3
August 2024 2
August 2024 3
August 2024 3
August 2024 1
August 2024 3
August 2024 1
August 2024 2

External resource

  • In the OUP print catalogue
  • About Oxford Academic
  • Publish journals with us
  • University press partners
  • What we publish
  • New features  
  • Open access
  • Rights and permissions
  • Accessibility
  • Advertising
  • Media enquiries
  • Oxford University Press
  • Oxford Languages
  • University of Oxford

Oxford University Press is a department of the University of Oxford. It furthers the University's objective of excellence in research, scholarship, and education by publishing worldwide

  • Copyright © 2024 Oxford University Press
  • Cookie settings
  • Cookie policy
  • Privacy policy
  • Legal notice

This Feature Is Available To Subscribers Only

Sign In or Create an Account

This PDF is available to Subscribers Only

For full access to this pdf, sign in to an existing account, or purchase an annual subscription.

  • Trending Now
  • Foundational Courses
  • Data Science
  • Practice Problem
  • Machine Learning
  • System Design
  • DevOps Tutorial

History of Computers

Before computers were developed people used sticks, stones, and bones as counting tools. As technology advanced and the human mind improved with time more computing devices were developed like Abacus, Napier’s Bones, etc. These devices were used as computers for performing mathematical computations but not very complex ones. 

Some of the popular computing devices are described below, starting from the oldest to the latest or most advanced technology developed:

Around 4000 years ago, the Chinese invented the Abacus, and it is believed to be the first computer. The history of computers begins with the birth of the abacus.

Structure: Abacus is basically a wooden rack that has metal rods with beads mounted on them.

Working of abacus: In the abacus, the beads were moved by the abacus operator according to some rules to perform arithmetic calculations. In some countries like China, Russia, and Japan, the abacus is still used by their people.

Napier’s Bones

Napier’s Bones was a manually operated calculating device and as the name indicates, it was invented by John Napier. In this device, he used 9 different ivory strips (bones) marked with numbers to multiply and divide for calculation. It was also the first machine to use the decimal point system for calculation.

It is also called an Arithmetic Machine or Adding Machine. A French mathematician-philosopher Blaise Pascal invented this between 1642 and 1644. It was the first mechanical and automatic calculator. It is invented by Pascal to help his father, a tax accountant in his work or calculation. It could perform addition and subtraction in quick time. It was basically a wooden box with a series of gears and wheels. It is worked by rotating wheel like when a wheel is rotated one revolution, it rotates the neighbouring wheel and a series of windows is given on the top of the wheels to read the totals.

Stepped Reckoner or Leibniz wheel

A German mathematician-philosopher Gottfried Wilhelm Leibniz in 1673 developed this device by improving Pascal’s invention to develop this machine. It was basically a digital mechanical calculator, and it was called the stepped reckoner as it was made of fluted drums instead of gears (used in the previous model of Pascaline).

Difference Engine

Charles Babbage who is also known as the “Father of Modern Computer” designed the Difference Engine in the early 1820s. Difference Engine was a mechanical computer which is capable of performing simple calculations. It works with help of steam as it was a steam-driven calculating machine, and it was designed to solve tables of numbers like logarithm tables.

Analytical Engine

Again in 1830 Charles Babbage developed another calculating machine which was Analytical Engine. Analytical Engine was a mechanical computer that used punch cards as input. It was capable of performing or solving any mathematical problem and storing information as a permanent memory (storage).

Tabulating Machine

Herman Hollerith, an American statistician invented this machine in the year 1890. Tabulating Machine was a mechanical tabulator that was based on punch cards. It was capable of tabulating statistics and record or sort data or information. This machine was used by U.S. Census in the year 1890. Hollerith’s Tabulating Machine Company was started by Hollerith and this company later became International Business Machine (IBM) in the year 1924.

Differential Analyzer

Differential Analyzer was the first electronic computer introduced in the year 1930 in the United States. It was basically an analog device that was invented by Vannevar Bush. This machine consists of vacuum tubes to switch electrical signals to perform calculations. It was capable of doing 25 calculations in a few minutes.

In the year 1937, major changes began in the history of computers when Howard Aiken planned to develop a machine that could perform large calculations or calculations involving large numbers. In the year 1944, Mark I computer was built as a partnership between IBM and Harvard. It was also the first programmable digital computer marking a new era in the computer world.

Generations of Computers

First Generation Computers

In the period of the year 1940-1956, it was referred to as the period of the first generation of computers. These machines are slow, huge, and expensive. In this generation of computers, vacuum tubes were used as the basic components of CPU and memory. Also, they were mainly dependent on the batch operating systems and punch cards. Magnetic tape and paper tape were used as output and input devices. For example ENIAC, UNIVAC-1, EDVAC, etc.

Second Generation Computers

In the period of the year, 1957-1963 was referred to as the period of the second generation of computers. It was the time of the transistor computers. In the second generation of computers, transistors (which were cheap in cost) are used. Transistors are also compact and consume less power. Transistor computers are faster than first-generation computers. For primary memory, magnetic cores were used, and for secondary memory magnetic disc and tapes for storage purposes. In second-generation computers, COBOL and FORTRAN are used as Assembly language and programming languages, and Batch processing and multiprogramming operating systems were used in these computers.

For example IBM 1620, IBM 7094, CDC 1604, CDC 3600, etc.

Third Generation Computers

In the third generation of computers, integrated circuits (ICs) were used instead of transistors(in the second generation). A single IC consists of many transistors which increased the power of a computer and also reduced the cost. The third generation computers are more reliable, efficient, and smaller in size. It used remote processing, time-sharing, and multiprogramming as operating systems. FORTRON-II TO IV, COBOL, and PASCAL PL/1 were used which are high-level programming languages.

For example IBM-360 series, Honeywell-6000 series, IBM-370/168, etc.

Fourth Generation Computers

The period of 1971-1980 was mainly the time of fourth generation computers. It used VLSI(Very Large Scale Integrated) circuits. VLSI is a chip containing millions of transistors and other circuit elements and because of these chips, the computers of this generation are more compact, powerful, fast, and affordable(low in cost). Real-time, time-sharing and distributed operating system are used by these computers. C and C++ are used as the programming languages in this generation of computers.

For example STAR 1000, PDP 11, CRAY-1, CRAY-X-MP, etc.

Fifth Generation Computers

From 1980 – to till date these computers are used. The ULSI (Ultra Large Scale Integration) technology is used in fifth-generation computers instead of the VLSI technology of fourth-generation computers. Microprocessor chips with ten million electronic components are used in these computers. Parallel processing hardware and AI (Artificial Intelligence) software are also used in fifth-generation computers. The programming languages like C, C++, Java, .Net, etc. are used.

For example Desktop, Laptop, NoteBook, UltraBook, etc.

Sample Questions

Let us now see some sample questions on the History of computers:

Question 1: Arithmetic Machine or Adding Machine is used between ___________ years.

a. 1642 and 1644

b. Around 4000 years ago

c. 1946 – 1956

d. None of the above

Solution:  

a. 1642 and 1644 Explanation: Pascaline is also called as Arithmetic Machine or Adding Machine. A French mathematician-philosopher Blaise Pascal invented this between 1642 and 1644. 

Question 2: Who designed the Difference Engine?

a. Blaise Pascal

b. Gottfried Wilhelm Leibniz 

c. Vannevar Bush

d. Charles Babbage 

Solution: 

d. Charles Babbage  Explanation: Charles Babbage who is also known as “Father of Modern Computer” designed the Difference Engine in the early 1820s.

Question 3: In second generation computers _______________ are used as Assembly language and programming languages.

a. C and C++.

b. COBOL and FORTRAN 

c. C and .NET

d. None of the above.

b. COBOL and FORTRAN  Explanation: In second generation computers COBOL and FORTRAN are used as Assembly language and programming languages, and Batch processing and multiprogramming operating systems were used in these computers.

Question 4: ENIAC and UNIVAC-1 are examples of which generation of computers?

a. First generation of computers.

b. Second generation of computers. 

c. Third generation of computers. 

d. Fourth generation of computers.  

a. First-generation of computers. Explanation: ENIAC, UNIVAC-1, EDVAC, etc. are examples of the first generation of computers.

Question 5: The ______________ technology is used in fifth generation computers .

a. ULSI (Ultra Large Scale Integration)

b. VLSI( very large scale integrated)

c. vacuum tubes

d. All of the above

a. ULSI (Ultra Large Scale Integration) Explanation: From 1980 -to till date these computers are used. The ULSI (Ultra Large Scale Integration) technology is used in fifth generation computers. 

Please Login to comment...

Similar reads.

  • School Learning
  • School Programming
  • California Lawmakers Pass Bill to Limit AI Replicas
  • Best 10 IPTV Service Providers in Germany
  • Python 3.13 Releases | Enhanced REPL for Developers
  • IPTV Anbieter in Deutschland - Top IPTV Anbieter Abonnements
  • Content Improvement League 2024: From Good To A Great Article

Improve your Coding Skills with Practice

 alt=

What kind of Experience do you want to share?

Pardon Our Interruption

As you were browsing something about your browser made us think you were a bot. There are a few reasons this might happen:

  • You've disabled JavaScript in your web browser.
  • You're a power user moving through this website with super-human speed.
  • You've disabled cookies in your web browser.
  • A third-party browser plugin, such as Ghostery or NoScript, is preventing JavaScript from running. Additional information is available in this support article .

To regain access, please make sure that cookies and JavaScript are enabled before reloading the page.

arXiv's Accessibility Forum starts next month!

Help | Advanced Search

Computer Science > Computer Vision and Pattern Recognition

Title: evaluation framework for feedback generation methods in skeletal movement assessment.

Abstract: The application of machine-learning solutions to movement assessment from skeleton videos has attracted significant research attention in recent years. This advancement has made rehabilitation at home more accessible, utilizing movement assessment algorithms that can operate on affordable equipment for human pose detection and analysis from 2D or 3D videos. While the primary objective of automatic assessment tasks is to score movements, the automatic generation of feedback highlighting key movement issues has the potential to significantly enhance and accelerate the rehabilitation process. While numerous research works exist in the field of automatic movement assessment, only a handful address feedback generation. In this study, we propose terminology and criteria for the classification, evaluation, and comparison of feedback generation solutions. We discuss the challenges associated with each feedback generation approach and use our proposed criteria to classify existing solutions. To our knowledge, this is the first work that formulates feedback generation in skeletal movement assessment.
Comments: Accepted to xAI4Biometrics 2024 at ECCV 2024
Subjects: Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
Cite as: [cs.CV]
  (or [cs.CV] for this version)
  Focus to learn more arXiv-issued DOI via DataCite

Submission history

Access paper:.

  • HTML (experimental)
  • Other Formats

license icon

References & Citations

  • Google Scholar
  • Semantic Scholar

BibTeX formatted citation

BibSonomy logo

Bibliographic and Citation Tools

Code, data and media associated with this article, recommenders and search tools.

  • Institution

arXivLabs: experimental projects with community collaborators

arXivLabs is a framework that allows collaborators to develop and share new arXiv features directly on our website.

Both individuals and organizations that work with arXivLabs have embraced and accepted our values of openness, community, excellence, and user data privacy. arXiv is committed to these values and only works with partners that adhere to them.

Have an idea for a project that will add value for arXiv's community? Learn more about arXivLabs .

  • Collections
  • Publications
  • K-12 Students & Educators
  • Families & Community Groups
  • Colleges & Universities
  • Business & Government Leaders
  • Make a Plan
  • Exhibits at the Museum
  • Tours & Group Reservations
  • Customize It
  • This is CHM
  • Ways to Give
  • Donor Recognition
  • Institutional Partnerships
  • Upcoming Events
  • Hours & Directions
  • Subscribe Now
  • Terms of Use
  • By Category

Bell Laboratories scientist George Stibitz uses relays for a demonstration adder

research the history of computer

“Model K” Adder

Called the “Model K” Adder because he built it on his “Kitchen” table, this simple demonstration circuit provides proof of concept for applying Boolean logic to the design of computers, resulting in construction of the relay-based Model I Complex Calculator in 1939. That same year in Germany, engineer Konrad Zuse built his Z2 computer, also using telephone company relays.

Hewlett-Packard is founded

research the history of computer

Hewlett and Packard in their garage workshop

David Packard and Bill Hewlett found their company in a Palo Alto, California garage. Their first product, the HP 200A Audio Oscillator, rapidly became a popular piece of test equipment for engineers. Walt Disney Pictures ordered eight of the 200B model to test recording equipment and speaker systems for the 12 specially equipped theatres that showed the movie “Fantasia” in 1940.

The Complex Number Calculator (CNC) is completed

research the history of computer

Operator at Complex Number Calculator (CNC)

In 1939, Bell Telephone Laboratories completes this calculator, designed by scientist George Stibitz. In 1940, Stibitz demonstrated the CNC at an American Mathematical Society conference held at Dartmouth College. Stibitz stunned the group by performing calculations remotely on the CNC (located in New York City) using a Teletype terminal connected to New York over special telephone lines. This is likely the first example of remote access computing.

Konrad Zuse finishes the Z3 Computer

research the history of computer

The Zuse Z3 Computer

The Z3, an early computer built by German engineer Konrad Zuse working in complete isolation from developments elsewhere, uses 2,300 relays, performs floating point binary arithmetic, and has a 22-bit word length. The Z3 was used for aerodynamic calculations but was destroyed in a bombing raid on Berlin in late 1943. Zuse later supervised a reconstruction of the Z3 in the 1960s, which is currently on display at the Deutsches Museum in Munich.

The first Bombe is completed

research the history of computer

Bombe replica, Bletchley Park, UK

Built as an electro-mechanical means of decrypting Nazi ENIGMA-based military communications during World War II, the British Bombe is conceived of by computer pioneer Alan Turing and Harold Keen of the British Tabulating Machine Company. Hundreds of allied bombes were built in order to determine the daily rotor start positions of Enigma cipher machines, which in turn allowed the Allies to decrypt German messages. The basic idea for bombes came from Polish code-breaker Marian Rejewski's 1938 "Bomba."

The Atanasoff-Berry Computer (ABC) is completed

research the history of computer

The Atanasoff-Berry Computer

After successfully demonstrating a proof-of-concept prototype in 1939, Professor John Vincent Atanasoff receives funds to build a full-scale machine at Iowa State College (now University). The machine was designed and built by Atanasoff and graduate student Clifford Berry between 1939 and 1942. The ABC was at the center of a patent dispute related to the invention of the computer, which was resolved in 1973 when it was shown that ENIAC co-designer John Mauchly had seen the ABC shortly after it became functional.

The legal result was a landmark: Atanasoff was declared the originator of several basic computer ideas, but the computer as a concept was declared un-patentable and thus freely open to all. A full-scale working replica of the ABC was completed in 1997, proving that the ABC machine functioned as Atanasoff had claimed. The replica is currently on display at the Computer History Museum.

Bell Labs Relay Interpolator is completed

research the history of computer

George Stibitz circa 1940

The US Army asked Bell Laboratories to design a machine to assist in testing its M-9 gun director, a type of analog computer that aims large guns to their targets. Mathematician George Stibitz recommends using a relay-based calculator for the project. The result was the Relay Interpolator, later called the Bell Labs Model II. The Relay Interpolator used 440 relays, and since it was programmable by paper tape, was used for other applications following the war.

Curt Herzstark designs Curta calculator

research the history of computer

Curta Model 1 calculator

Curt Herzstark was an Austrian engineer who worked in his family’s manufacturing business until he was arrested by the Nazis in 1943. While imprisoned at Buchenwald concentration camp for the rest of World War II, he refines his pre-war design of a calculator featuring a modified version of Leibniz’s “stepped drum” design. After the war, Herzstark’s Curta made history as the smallest all-mechanical, four-function calculator ever built.

First Colossus operational at Bletchley Park

research the history of computer

The Colossus at work at Bletchley Park

Designed by British engineer Tommy Flowers, the Colossus is designed to break the complex Lorenz ciphers used by the Nazis during World War II. A total of ten Colossi were delivered, each using as many as 2,500 vacuum tubes. A series of pulleys transported continuous rolls of punched paper tape containing possible solutions to a particular code. Colossus reduced the time to break Lorenz messages from weeks to hours. Most historians believe that the use of Colossus machines significantly shortened the war by providing evidence of enemy intentions and beliefs. The machine’s existence was not made public until the 1970s.

Harvard Mark 1 is completed

research the history of computer

Conceived by Harvard physics professor Howard Aiken, and designed and built by IBM, the Harvard Mark 1 is a room-sized, relay-based calculator. The machine had a fifty-foot long camshaft running the length of machine that synchronized the machine’s thousands of component parts and used 3,500 relays. The Mark 1 produced mathematical tables but was soon superseded by electronic stored-program computers.

John von Neumann writes First Draft of a Report on the EDVAC

research the history of computer

John von Neumann

In a widely circulated paper, mathematician John von Neumann outlines the architecture of a stored-program computer, including electronic storage of programming information and data -- which eliminates the need for more clumsy methods of programming such as plugboards, punched cards and paper. Hungarian-born von Neumann demonstrated prodigious expertise in hydrodynamics, ballistics, meteorology, game theory, statistics, and the use of mechanical devices for computation. After the war, he concentrated on the development of Princeton´s Institute for Advanced Studies computer.

Moore School lectures take place

research the history of computer

The Moore School Building at the University of Pennsylvania

An inspiring summer school on computing at the University of Pennsylvania´s Moore School of Electrical Engineering stimulates construction of stored-program computers at universities and research institutions in the US, France, the UK, and Germany. Among the lecturers were early computer designers like John von Neumann, Howard Aiken, J. Presper Eckert and John Mauchly, as well as mathematicians including Derrick Lehmer, George Stibitz, and Douglas Hartree. Students included future computing pioneers such as Maurice Wilkes, Claude Shannon, David Rees, and Jay Forrester. This free, public set of lectures inspired the EDSAC, BINAC, and, later, IAS machine clones like the AVIDAC.

Project Whirlwind begins

research the history of computer

Whirlwind installation at MIT

During World War II, the US Navy approaches the Massachusetts Institute of Technology (MIT) about building a flight simulator to train bomber crews. Under the leadership of MIT's Gordon Brown and Jay Forrester, the team first built a small analog simulator, but found it inaccurate and inflexible. News of the groundbreaking electronic ENIAC computer that same year inspired the group to change course and attempt a digital solution, whereby flight variables could be rapidly programmed in software. Completed in 1951, Whirlwind remains one of the most important computer projects in the history of computing. Foremost among its developments was Forrester’s perfection of magnetic core memory, which became the dominant form of high-speed random access memory for computers until the mid-1970s.

Public unveiling of ENIAC

research the history of computer

Started in 1943, the ENIAC computing system was built by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering of the University of Pennsylvania. Because of its electronic, as opposed to electromechanical, technology, it is over 1,000 times faster than any previous computer. ENIAC used panel-to-panel wiring and switches for programming, occupied more than 1,000 square feet, used about 18,000 vacuum tubes and weighed 30 tons. It was believed that ENIAC had done more calculation over the ten years it was in operation than all of humanity had until that time.

First Computer Program to Run on a Computer

research the history of computer

Kilburn (left) and Williams in front of 'Baby'

University of Manchester researchers Frederic Williams, Tom Kilburn, and Geoff Toothill develop the Small-Scale Experimental Machine (SSEM), better known as the Manchester "Baby." The Baby was built to test a new memory technology developed by Williams and Kilburn -- soon known as the Williams Tube – which was the first high-speed electronic random access memory for computers. Their first program, consisting of seventeen instructions and written by Kilburn, ran on June 21st, 1948. This was the first program in history to run on a digital, electronic, stored-program computer.

SSEC goes on display

research the history of computer

IBM Selective Sequence Electronic Calculator (SSEC)

The Selective Sequence Electronic Calculator (SSEC) project, led by IBM engineer Wallace Eckert, uses both relays and vacuum tubes to process scientific data at the rate of 50 14 x 14 digit multiplications per second. Before its decommissioning in 1952, the SSEC produced the moon position tables used in early planning of the 1969 Apollo XII moon landing. These tables were later confirmed by using more modern computers for the actual flights. The SSEC was one of the last of the generation of 'super calculators' to be built using electromechanical technology.

CSIRAC runs first program

research the history of computer

While many early digital computers were based on similar designs, such as the IAS and its copies, others are unique designs, like the CSIRAC. Built in Sydney, Australia by the Council of Scientific and Industrial Research for use in its Radio physics Laboratory in Sydney, CSIRAC was designed by British-born Trevor Pearcey, and used unusual 12-hole paper tape. It was transferred to the Department of Physics at the University of Melbourne in 1955 and remained in service until 1964.

EDSAC completed

research the history of computer

The first practical stored-program computer to provide a regular computing service, EDSAC is built at Cambridge University using vacuum tubes and mercury delay lines for memory. The EDSAC project was led by Cambridge professor and director of the Cambridge Computation Laboratory, Maurice Wilkes. Wilkes' ideas grew out of the Moore School lectures he had attended three years earlier. One major advance in programming was Wilkes' use of a library of short programs, called “subroutines,” stored on punched paper tapes and used for performing common repetitive calculations within a larger program.

MADDIDA developed

research the history of computer

MADDIDA (Magnetic Drum Digital Differential Analyzer) prototype

MADDIDA is a digital drum-based differential analyzer. This type of computer is useful in performing many of the mathematical equations scientists and engineers encounter in their work. It was originally created for a nuclear missile design project in 1949 by a team led by Fred Steele. It used 53 vacuum tubes and hundreds of germanium diodes, with a magnetic drum for memory. Tracks on the drum did the mathematical integration. MADDIDA was flown across the country for a demonstration to John von Neumann, who was impressed. Northrop was initially reluctant to make MADDIDA a commercial product, but by the end of 1952, six had sold.

Manchester Mark I completed

research the history of computer

Manchester Mark I

Built by a team led by engineers Frederick Williams and Tom Kilburn, the Mark I serves as the prototype for Ferranti’s first computer – the Ferranti Mark 1. The Manchester Mark I used more than 1,300 vacuum tubes and occupied an area the size of a medium room. Its “Williams-Kilburn tube” memory system was later adopted by several other early computer systems around the world.

ERA 1101 introduced

research the history of computer

One of the first commercially produced computers, the company´s first customer was the US Navy. The 1101, designed by ERA but built by Remington-Rand, was intended for high-speed computing and stored 1 million bits on its magnetic drum, one of the earliest magnetic storage devices and a technology which ERA had done much to perfect in its own laboratories. Many of the 1101’s basic architectural details were used again in later Remington-Rand computers until the 1960s.

NPL Pilot ACE completed

research the history of computer

Based on ideas from Alan Turing, Britain´s Pilot ACE computer is constructed at the National Physical Laboratory. "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus," Turing said at a symposium on large-scale digital calculating machinery in 1947 in Cambridge, Massachusetts. The design packed 800 vacuum tubes into a relatively compact 12 square feet.

Plans to build the Simon 1 relay logic machine are published

research the history of computer

Simon featured on the November 1950 Scientific American cover

The hobbyist magazine Radio Electronics publishes Edmund Berkeley's design for the Simon 1 relay computer from 1950 to 1951. The Simon 1 used relay logic and cost about $600 to build. In his book Giant Brains , Berkeley noted - “We shall now consider how we can design a very simple machine that will think. Let us call it Simon, because of its predecessor, Simple Simon... Simon is so simple and so small in fact that it could be built to fill up less space than a grocery-store box; about four cubic feet.”

SEAC and SWAC completed

research the history of computer

The Standards Eastern Automatic Computer (SEAC) is among the first stored program computers completed in the United States. It was built in Washington DC as a test-bed for evaluating components and systems as well as for setting computer standards. It was also one of the first computers to use all-diode logic, a technology more reliable than vacuum tubes. The world's first scanned image was made on SEAC by engineer Russell Kirsch in 1957.

The NBS also built the Standards Western Automatic Computer (SWAC) at the Institute for Numerical Analysis on the UCLA campus. Rather than testing components like the SEAC, the SWAC was built using already-developed technology. SWAC was used to solve problems in numerical analysis, including developing climate models and discovering five previously unknown Mersenne prime numbers.

Ferranti Mark I sold

research the history of computer

Ferranti Mark 1

The title of “first commercially available general-purpose computer” probably goes to Britain’s Ferranti Mark I for its sale of its first Mark I computer to Manchester University. The Mark 1 was a refinement of the experimental Manchester “Baby” and Manchester Mark 1 computers, also at Manchester University. A British government contract spurred its initial development but a change in government led to loss of funding and the second and only other Mark I was sold at a major loss to the University of Toronto, where it was re-christened FERUT.

First Univac 1 delivered to US Census Bureau

research the history of computer

Univac 1 installation

The Univac 1 is the first commercial computer to attract widespread public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as “the IBM Univac." Univac computers were used in many different applications but utilities, insurance companies and the US military were major customers. One biblical scholar even used a Univac 1 to compile a concordance to the King James version of the Bible. Created by Presper Eckert and John Mauchly -- designers of the earlier ENIAC computer -- the Univac 1 used 5,200 vacuum tubes and weighed 29,000 pounds. Remington Rand eventually sold 46 Univac 1s at more than $1 million each.

J. Lyons & Company introduce LEO-1

research the history of computer

Modeled after the Cambridge University EDSAC computer, the president of Lyons Tea Co. has the LEO built to solve the problem of production scheduling and delivery of cakes to the hundreds of Lyons tea shops around England. After the success of the first LEO, Lyons went into business manufacturing computers to meet the growing need for data processing systems in business. The LEO was England’s first commercial computer and was performing useful work before any other commercial computer system in the world.

IAS computer operational

research the history of computer

MANIAC at Los Alamos

The Institute of Advanced Study (IAS) computer is a multi-year research project conducted under the overall supervision of world-famous mathematician John von Neumann. The notion of storing both data and instructions in memory became known as the ‘stored program concept’ to distinguish it from earlier methods of instructing a computer. The IAS computer was designed for scientific calculations and it performed essential work for the US atomic weapons program. Over the next few years, the basic design of the IAS machine was copied in at least 17 places and given similar-sounding names, for example, the MANIAC at Los Alamos Scientific Laboratory; the ILLIAC at the University of Illinois; the Johnniac at The Rand Corporation; and the SILLIAC in Australia.

Grimsdale and Webb build early transistorized computer

research the history of computer

Manchester transistorized computer

Working under Tom Kilburn at England’s Manchester University, Richard Grimsdale and Douglas Webb demonstrate a prototype transistorized computer, the "Manchester TC", on November 16, 1953. The 48-bit machine used 92 point-contact transistors and 550 diodes.

IBM ships its Model 701 Electronic Data Processing Machine

research the history of computer

Cuthbert Hurd (standing) and Thomas Watson, Sr. at IBM 701 console

During three years of production, IBM sells 19 701s to research laboratories, aircraft companies, and the federal government. Also known inside IBM as the “Defense Calculator," the 701 rented for $15,000 a month. Programmer Arthur Samuels used the 701 to write the first computer program designed to play checkers. The 701 introduction also marked the beginning of IBM’s entry into the large-scale computer market, a market it came to dominate in later decades.

RAND Corporation completes Johnniac computer

research the history of computer

RAND Corporation’s Johnniac

The Johnniac computer is one of 17 computers that followed the basic design of Princeton's Institute of Advanced Study (IAS) computer. It was named after John von Neumann, a world famous mathematician and computer pioneer of the day. Johnniac was used for scientific and engineering calculations. It was also repeatedly expanded and improved throughout its 13-year lifespan. Many innovative programs were created for Johnniac, including the time-sharing system JOSS that allowed many users to simultaneously access the machine.

IBM 650 magnetic drum calculator introduced

research the history of computer

IBM establishes the 650 as its first mass-produced computer, with the company selling 450 in just one year. Spinning at 12,500 rpm, the 650´s magnetic data-storage drum allowed much faster access to stored information than other drum-based machines. The Model 650 was also highly popular in universities, where a generation of students first learned programming.

English Electric DEUCE introduced

research the history of computer

English Electric DEUCE

A commercial version of Alan Turing's Pilot ACE, called DEUCE—the Digital Electronic Universal Computing Engine -- is used mostly for science and engineering problems and a few commercial applications. Over 30 were completed, including one delivered to Australia.

Direct keyboard input to computers

research the history of computer

Joe Thompson at Whirlwind console, ca. 1951

At MIT, researchers begin experimenting with direct keyboard input to computers, a precursor to today´s normal mode of operation. Typically, computer users of the time fed their programs into a computer using punched cards or paper tape. Doug Ross wrote a memo advocating direct access in February. Ross contended that a Flexowriter -- an electrically-controlled typewriter -- connected to an MIT computer could function as a keyboard input device due to its low cost and flexibility. An experiment conducted five months later on the MIT Whirlwind computer confirmed how useful and convenient a keyboard input device could be.

Librascope LGP-30 introduced

research the history of computer

Physicist Stan Frankel, intrigued by small, general-purpose computers, developed the MINAC at Caltech. The Librascope division of defense contractor General Precision buys Frankel’s design, renaming it the LGP-30 in 1956. Used for science and engineering as well as simple data processing, the LGP-30 was a “bargain” at less than $50,000 and an early example of a ‘personal computer,’ that is, a computer made for a single user.

MIT researchers build the TX-0

research the history of computer

TX-0 at MIT

The TX-0 (“Transistor eXperimental - 0”) is the first general-purpose programmable computer built with transistors. For easy replacement, designers placed each transistor circuit inside a "bottle," similar to a vacuum tube. Constructed at MIT´s Lincoln Laboratory, the TX-0 moved to the MIT Research Laboratory of Electronics, where it hosted some early imaginative tests of programming, including writing a Western movie shown on television, 3-D tic-tac-toe, and a maze in which a mouse found martinis and became increasingly inebriated.

Digital Equipment Corporation (DEC) founded

research the history of computer

The Maynard mill

DEC is founded initially to make electronic modules for test, measurement, prototyping and control markets. Its founders were Ken and Stan Olsen, and Harlan Anderson. Headquartered in Maynard, Massachusetts, Digital Equipment Corporation, took over 8,680 square foot leased space in a nineteenth century mill that once produced blankets and uniforms for soldiers who fought in the Civil War. General Georges Doriot and his pioneering venture capital firm, American Research and Development, invested $70,000 for 70% of DEC’s stock to launch the company in 1957. The mill is still in use today as an office park (Clock Tower Place) today.

RCA introduces its Model 501 transistorized computer

research the history of computer

RCA 501 brochure cover

The 501 is built on a 'building block' concept which allows it to be highly flexible for many different uses and could simultaneously control up to 63 tape drives—very useful for large databases of information. For many business users, quick access to this huge storage capability outweighed its relatively slow processing speed. Customers included US military as well as industry.

SAGE system goes online

research the history of computer

SAGE Operator Station

The first large-scale computer communications network, SAGE connects 23 hardened computer sites in the US and Canada. Its task was to detect incoming Soviet bombers and direct interceptor aircraft to destroy them. Operators directed actions by touching a light gun to the SAGE airspace display. The air defense system used two AN/FSQ-7 computers, each of which used a full megawatt of power to drive its 55,000 vacuum tubes, 175,000 diodes and 13,000 transistors.

DEC PDP-1 introduced

research the history of computer

Ed Fredkin at DEC PDP-1

The typical PDP-1 computer system, which sells for about $120,000, includes a cathode ray tube graphic display, paper tape input/output, needs no air conditioning and requires only one operator; all of which become standards for minicomputers. Its large scope intrigued early hackers at MIT, who wrote the first computerized video game, SpaceWar! , as well as programs to play music. More than 50 PDP-1s were sold.

NEAC 2203 goes online

research the history of computer

NEAC 2203 transistorized computer

An early transistorized computer, the NEAC (Nippon Electric Automatic Computer) includes a CPU, console, paper tape reader and punch, printer and magnetic tape units. It was sold exclusively in Japan, but could process alphabetic and Japanese kana characters. Only about thirty NEACs were sold. It managed Japan's first on-line, real-time reservation system for Kinki Nippon Railways in 1960. The last one was decommissioned in 1979.

IBM 7030 (“Stretch”) completed

research the history of computer

IBM Stretch

IBM´s 7000 series of mainframe computers are the company´s first to use transistors. At the top of the line was the Model 7030, also known as "Stretch." Nine of the computers, which featured dozens of advanced design innovations, were sold, mainly to national laboratories and major scientific users. A special version, known as HARVEST, was developed for the US National Security Agency (NSA). The knowledge and technologies developed for the Stretch project played a major role in the design, management, and manufacture of the later IBM System/360--the most successful computer family in IBM history.

IBM Introduces 1400 series

research the history of computer

The 1401 mainframe, the first in the series, replaces earlier vacuum tube technology with smaller, more reliable transistors. Demand called for more than 12,000 of the 1401 computers, and the machine´s success made a strong case for using general-purpose computers rather than specialized systems. By the mid-1960s, nearly half of all computers in the world were IBM 1401s.

Minuteman I missile guidance computer developed

research the history of computer

Minuteman Guidance computer

Minuteman missiles use transistorized computers to continuously calculate their position in flight. The computer had to be rugged and fast, with advanced circuit design and reliable packaging able to withstand the forces of a missile launch. The military’s high standards for its transistors pushed manufacturers to improve quality control. When the Minuteman I was decommissioned, some universities received these computers for use by students.

Naval Tactical Data System introduced

research the history of computer

Naval Tactical Data System (NTDS)

The US Navy Tactical Data System uses computers to integrate and display shipboard radar, sonar and communications data. This real-time information system began operating in the early 1960s. In October 1961, the Navy tested the NTDS on the USS Oriskany carrier and the USS King and USS Mahan frigates. After being successfully used for decades, NTDS was phased out in favor of the newer AEGIS system in the 1980s.

MIT LINC introduced

research the history of computer

Wesley Clark with LINC

The LINC is an early and important example of a ‘personal computer,’ that is, a computer designed for only one user. It was designed by MIT Lincoln Laboratory engineer Wesley Clark. Under the auspices of a National Institutes of Health (NIH) grant, biomedical research faculty from around the United States came to a workshop at MIT to build their own LINCs, and then bring them back to their home institutions where they would be used. For research, Digital Equipment Corporation (DEC) supplied the components, and 50 original LINCs were made. The LINC was later commercialized by DEC and sold as the LINC-8.

The Atlas Computer debuts

research the history of computer

Chilton Atlas installation

A joint project of England’s Manchester University, Ferranti Computers, and Plessey, Atlas comes online nine years after Manchester’s computer lab begins exploring transistor technology. Atlas was the fastest computer in the world at the time and introduced the concept of “virtual memory,” that is, using a disk or drum as an extension of main memory. System control was provided through the Atlas Supervisor, which some consider to be the first true operating system.

CDC 6600 supercomputer introduced

research the history of computer

The Control Data Corporation (CDC) 6600 performs up to 3 million instructions per second —three times faster than that of its closest competitor, the IBM 7030 supercomputer. The 6600 retained the distinction of being the fastest computer in the world until surpassed by its successor, the CDC 7600, in 1968. Part of the speed came from the computer´s design, which used 10 small computers, known as peripheral processing units, to offload the workload from the central processor.

Digital Equipment Corporation introduces the PDP-8

research the history of computer

PDP-8 advertisement

The Canadian Chalk River Nuclear Lab needed a special device to monitor a reactor. Instead of designing a custom controller, two young engineers from Digital Equipment Corporation (DEC) -- Gordon Bell and Edson de Castro -- do something unusual: they develop a small, general purpose computer and program it to do the job. A later version of that machine became the PDP-8, the first commercially successful minicomputer. The PDP-8 sold for $18,000, one-fifth the price of a small IBM System/360 mainframe. Because of its speed, small size, and reasonable cost, the PDP-8 was sold by the thousands to manufacturing plants, small businesses, and scientific laboratories around the world.

IBM announces System/360

research the history of computer

IBM 360 Model 40

System/360 is a major event in the history of computing. On April 7, IBM announced five models of System/360, spanning a 50-to-1 performance range. At the same press conference, IBM also announced 40 completely new peripherals for the new family. System/360 was aimed at both business and scientific customers and all models could run the same software, largely without modification. IBM’s initial investment of $5 billion was quickly returned as orders for the system climbed to 1,000 per month within two years. At the time IBM released the System/360, the company had just made the transition from discrete transistors to integrated circuits, and its major source of revenue began to move from punched card equipment to electronic computer systems.

SABRE comes on-line

research the history of computer

Airline reservation agents working with SABRE

SABRE is a joint project between American Airlines and IBM. Operational by 1964, it was not the first computerized reservation system, but it was well publicized and became very influential. Running on dual IBM 7090 mainframe computer systems, SABRE was inspired by IBM’s earlier work on the SAGE air-defense system. Eventually, SABRE expanded, even making airline reservations available via on-line services such as CompuServe, Genie, and America Online.

Teletype introduced its ASR-33 Teletype

research the history of computer

Student using ASR-33

At a cost to computer makers of roughly $700, the ASR-33 Teletype is originally designed as a low cost terminal for the Western Union communications network. Throughout the 1960s and ‘70s, the ASR-33 was a popular and inexpensive choice of input and output device for minicomputers and many of the first generation of microcomputers.

3C DDP-116 introduced

research the history of computer

DDP-116 General Purpose Computer

Designed by engineer Gardner Hendrie for Computer Control Corporation (CCC), the DDP-116 is announced at the 1965 Spring Joint Computer Conference. It was the world's first commercial 16-bit minicomputer and 172 systems were sold. The basic computer cost $28,500.

Olivetti Programma 101 is released

research the history of computer

Olivetti Programma 101

Announced the year previously at the New York World's Fair the Programma 101 goes on sale. This printing programmable calculator was made from discrete transistors and an acoustic delay-line memory. The Programma 101 could do addition, subtraction, multiplication, and division, as well as calculate square roots. 40,000 were sold, including 10 to NASA for use on the Apollo space project.

HP introduces the HP 2116A

research the history of computer

HP 2116A system

The 2116A is HP’s first computer. It was developed as a versatile instrument controller for HP's growing family of programmable test and measurement products. It interfaced with a wide number of standard laboratory instruments, allowing customers to computerize their instrument systems. The 2116A also marked HP's first use of integrated circuits in a commercial product.

ILLIAC IV project begins

research the history of computer

A large parallel processing computer, the ILLIAC IV does not operate until 1972. It was eventually housed at NASA´s Ames Research Center in Mountain View, California. The most ambitious massively parallel computer at the time, the ILLIAC IV was plagued with design and production problems. Once finally completed, it achieved a computational speed of 200 million instructions per second and 1 billion bits per second of I/O transfer via a unique combination of its parallel architecture and the overlapping or "pipelining" structure of its 64 processing elements.

RCA announces its Spectra series of computers

research the history of computer

Image from RCA Spectra-70 brochure

The first large commercial computers to use integrated circuits, RCA highlights the IC's advantage over IBM’s custom SLT modules. Spectra systems were marketed on the basis of their compatibility with the IBM System/360 series of computer since it implemented the IBM 360 instruction set and could run most IBM software with little or no modification.

Apollo Guidance Computer (AGC) makes its debut

research the history of computer

DSKY interface for the Apollo Guidance Computer

Designed by scientists and engineers at MIT’s Instrumentation Laboratory, the Apollo Guidance Computer (AGC) is the culmination of years of work to reduce the size of the Apollo spacecraft computer from the size of seven refrigerators side-by-side to a compact unit weighing only 70 lbs. and taking up a volume of less than 1 cubic foot. The AGC’s first flight was on Apollo 7. A year later, it steered Apollo 11 to the lunar surface. Astronauts communicated with the computer by punching two-digit codes into the display and keyboard unit (DSKY). The AGC was one of the earliest uses of integrated circuits, and used core memory, as well as read-only magnetic rope memory. The astronauts were responsible for entering more than 10,000 commands into the AGC for each trip between Earth and the Moon.

Data General Corporation introduces the Nova Minicomputer

research the history of computer

Edson deCastro with a Data General Nova

Started by a group of engineers that left Digital Equipment Corporation (DEC), Data General designs the Nova minicomputer. It had 32 KB of memory and sold for $8,000. Ed de Castro, its main designer and co-founder of Data General, had earlier led the team that created the DEC PDP-8. The Nova line of computers continued through the 1970s, and influenced later systems like the Xerox Alto and Apple 1.

Amdahl Corporation introduces the Amdahl 470

research the history of computer

Gene Amdahl with 470V/6 model

Gene Amdahl, father of the IBM System/360, starts his own company, Amdahl Corporation, to compete with IBM in mainframe computer systems. The 470V/6 was the company’s first product and ran the same software as IBM System/370 computers but cost less and was smaller and faster.

First Kenbak-1 is sold

research the history of computer

One of the earliest personal computers, the Kenbak-1 is advertised for $750 in Scientific American magazine. Designed by John V. Blankenbaker using standard medium-- and small-scale integrated circuits, the Kenbak-1 relied on switches for input and lights for output from its 256-byte memory. In 1973, after selling only 40 machines, Kenbak Corporation closed its doors.

Hewlett-Packard introduces the HP-35

research the history of computer

HP-35 handheld calculator

Initially designed for internal use by HP employees, co-founder Bill Hewlett issues a challenge to his engineers in 1971: fit all of the features of their desktop scientific calculator into a package small enough for his shirt pocket. They did. Marketed as “a fast, extremely accurate electronic slide rule” with a solid-state memory similar to that of a computer, the HP-35 distinguished itself from its competitors by its ability to perform a broad variety of logarithmic and trigonometric functions, to store more intermediate solutions for later use, and to accept and display entries in a form similar to standard scientific notation. The HP-35 helped HP become one of the most dominant companies in the handheld calculator market for more than two decades.

Intel introduces the first microprocessor

research the history of computer

Advertisement for Intel's 4004

Computer History Museum

The first advertisement for a microprocessor, the Intel 4004, appears in Electronic News. Developed for Busicom, a Japanese calculator maker, the 4004 had 2250 transistors and could perform up to 90,000 operations per second in four-bit chunks. Federico Faggin led the design and Ted Hoff led the architecture.

Laser printer invented at Xerox PARC

research the history of computer

Dover laser printer

Xerox PARC physicist Gary Starkweather realizes in 1967 that exposing a copy machine’s light-sensitive drum to a paper original isn’t the only way to create an image. A computer could “write” it with a laser instead. Xerox wasn’t interested. So in 1971, Starkweather transferred to Xerox Palo Alto Research Center (PARC), away from corporate oversight. Within a year, he had built the world’s first laser printer, launching a new era in computer printing, generating billions of dollars in revenue for Xerox. The laser printer was used with PARC’s Alto computer, and was commercialized as the Xerox 9700.

IBM SCAMP is developed

research the history of computer

Dr. Paul Friedl with SCAMP prototype

Under the direction of engineer Dr. Paul Friedl, the Special Computer APL Machine Portable (SCAMP) personal computer prototype is developed at IBM's Los Gatos and Palo Alto, California laboratories. IBM’s first personal computer, the system was designed to run the APL programming language in a compact, briefcase-like enclosure which comprised a keyboard, CRT display, and cassette tape storage. Friedl used the SCAMP prototype to gain approval within IBM to promote and develop IBM’s 5100 family of computers, including the most successful, the 5150, also known as the IBM Personal Computer (PC), introduced in 1981. From concept to finished system, SCAMP took only six months to develop.

Micral is released

research the history of computer

Based on the Intel 8008 microprocessor, the Micral is one of the earliest commercial, non-kit personal computers. Designer Thi Truong developed the computer while Philippe Kahn wrote the software. Truong, founder and president of the French company R2E, created the Micral as a replacement for minicomputers in situations that did not require high performance, such as process control and highway toll collection. Selling for $1,750, the Micral never penetrated the U.S. market. In 1979, Truong sold R2E to Bull.

The TV Typewriter plans are published

research the history of computer

TV Typewriter

Designed by Don Lancaster, the TV Typewriter is an easy-to-build kit that can display alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of hobbyist magazine Radio Electronics . The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A cassette tape interface provided supplementary storage for text. The TV Typewriter was used by many small television stations well in the 1990s.

Wang Laboratories releases the Wang 2200

research the history of computer

Wang was a successful calculator manufacturer, then a successful word processor company. The 1973 Wang 2200 makes it a successful computer company, too. Wang sold the 2200 primarily through Value Added Resellers, who added special software to solve specific customer problems. The 2200 used a built-in CRT, cassette tape for storage, and ran the programming language BASIC. The PC era ended Wang’s success, and it filed for bankruptcy in 1992.

Scelbi advertises its 8H computer

research the history of computer

The first commercially advertised US computer based on a microprocessor (the Intel 8008,) the Scelbi has 4 KB of internal memory and a cassette tape interface, as well as Teletype and oscilloscope interfaces. Scelbi aimed the 8H, available both in kit form and fully assembled, at scientific, electronic, and biological applications. In 1975, Scelbi introduced the 8B version with 16 KB of memory for the business market. The company sold about 200 machines, losing $500 per unit.

The Mark-8 appears in the pages of Radio-Electronics

research the history of computer

Mark-8 featured on Radio-Electronics July 1974 cover

The Mark-8 “Do-It-Yourself” kit is designed by graduate student John Titus and uses the Intel 8008 microprocessor. The kit was the cover story of hobbyist magazine Radio-Electronics in July 1974 – six months before the MITS Altair 8800 was in rival Popular Electronics magazine. Plans for the Mark-8 cost $5 and the blank circuit boards were available for $50.

Xerox PARC Alto introduced

research the history of computer

The Alto is a groundbreaking computer with wide influence on the computer industry. It was based on a graphical user interface using windows, icons, and a mouse, and worked together with other Altos over a local area network. It could also share files and print out documents on an advanced Xerox laser printer. Applications were also highly innovative: a WYSIWYG word processor known as “Bravo,” a paint program, a graphics editor, and email for example. Apple’s inspiration for the Lisa and Macintosh computers came from the Xerox Alto.

MITS Altair 8800 kit appears in Popular Electronics

research the history of computer

Altair 8800

For its January issue, hobbyist magazine Popular Electronics runs a cover story of a new computer kit – the Altair 8800. Within weeks of its appearance, customers inundated its maker, MITS, with orders. Bill Gates and Paul Allen licensed their BASIC programming language interpreter to MITS as the main language for the Altair. MITS co-founder Ed Roberts invented the Altair 8800 — which sold for $297, or $395 with a case — and coined the term “personal computer”. The machine came with 256 bytes of memory (expandable to 64 KB) and an open 100-line bus structure that evolved into the “S-100” standard widely used in hobbyist and personal computers of this era. In 1977, MITS was sold to Pertec, which continued producing Altairs in 1978.

MOS 6502 is introduced

research the history of computer

MOS 6502 ad from IEEE Computer, Sept. 1975

Chuck Peddle leads a small team of former Motorola employees to build a low-cost microprocessor. The MOS 6502 was introduced at a conference in San Francisco at a cost of $25, far less than comparable processors from Intel and Motorola, leading some attendees to believe that the company was perpetrating a hoax. The chip quickly became popular with designers of early personal computers like the Apple II and Commodore PET, as well as game consoles like the Nintendo Entertainment System. The 6502 and its progeny are still used today, usually in embedded applications.

Southwest Technical Products introduces the SWTPC 6800

research the history of computer

Southwest Technical Products 6800

Southwest Technical Products is founded by Daniel Meyer as DEMCO in the 1960s to provide a source for kit versions of projects published in electronics hobbyist magazines. SWTPC introduces many computer kits based on the Motorola 6800, and later, the 6809. Of the dozens of different SWTP kits available, the 6800 proved the most popular.

Tandem Computers releases the Tandem-16

research the history of computer

Dual-processor Tandem 16 system

Tailored for online transaction processing, the Tandem-16 is one of the first commercial fault-tolerant computers. The banking industry rushed to adopt the machine, built to run during repair or expansion. The Tandem-16 eventually led to the “Non-Stop” series of systems, which were used for early ATMs and to monitor stock trades.

The Video Display Module (VDM-1)

research the history of computer

The Video Display Module (VDM)

Designed by computer pioneer Lee Felsenstein, the Video Display Module (VDM-1) marks the earliest implementation of a memory-mapped alphanumeric video display for personal computers. Introduced at the Altair Convention in Albuquerque in March 1976, it was a much-needed input device for hobbyists building their own microcomputer systems at the time and became the basis of the SOL-20 computer.

Cray-1 supercomputer introduced

research the history of computer

Cray I 'Self-portrait'

The fastest machine of its day, The Cray-1's speed comes partly from its shape, a "C," which reduces the length of wires and thus the time signals need to travel across them. High packaging density of integrated circuits and a novel Freon cooling system also contributed to its speed. Each Cray-1 took a full year to assemble and test and cost about $10 million. Typical applications included US national defense work, including the design and simulation of nuclear weapons, and weather forecasting.

Intel 8080 and Zilog Z-80

research the history of computer

Zilgo Z-80 microprocessor

Image by Gennadiy Shvets

Intel and Zilog introduced new microprocessors. Five times faster than its predecessor, the 8008, the Intel 8080 could address four times as many bytes for a total of 64 kilobytes. The Zilog Z-80 could run any program written for the 8080 and included twice as many built-in machine instructions.

Steve Wozniak completes the Apple-1

research the history of computer

Designed by Sunnyvale, California native Steve Wozniak, and marketed by his friend Steve Jobs, the Apple-1 is a single-board computer for hobbyists. With an order for 50 assembled systems from Mountain View, California computer store The Byte Shop in hand, the pair started a new company, naming it Apple Computer, Inc. In all, about 200 of the boards were sold before Apple announced the follow-on Apple II a year later as a ready-to-use computer for consumers, a model which sold in the millions for nearly two decades.

Apple II introduced

research the history of computer

Sold complete with a main logic board, switching power supply, keyboard, case, manual, game paddles, and cassette tape containing the game Breakout , the Apple-II finds popularity far beyond the hobbyist community which made up Apple’s user community until then. When connected to a color television set, the Apple II produced brilliant color graphics for the time. Millions of Apple IIs were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to school, giving a new generation their first access to personal computers.

Tandy Radio Shack introduces its TRS-80

research the history of computer

Performing far better than the company projections of 3,000 units for the first year, in the first month after its release Tandy Radio Shack´s first desktop computer — the TRS-80 — sells 10,000 units. The TRS-80 was priced at $599.95, included a Z80 microprocessor, video display, 4 KB of memory, a built-in BASIC programming language interpreter, cassette storage, and easy-to-understand manuals that assumed no prior knowledge on the part of the user. The TRS-80 proved popular with schools, as well as for home use. The TRS-80 line of computers later included color, portable, and handheld versions before being discontinued in the early 1990s.

The Commodore PET (Personal Electronic Transactor) introduced

research the history of computer

Commodore PET

The first of several personal computers released in 1977, the PET comes fully assembled with either 4 or 8 KB of memory, a built-in cassette tape drive, and a 'chiclet' keyboard. The PET was popular with schools and for use as a home computer. It used a MOS Technologies 6502 microprocessor running at 1 MHz. After the success of the PET, Commodore remained a major player in the personal computer market into the 1990s.

The DEC VAX introduced

research the history of computer

DEC VAX 11/780

Beginning with the VAX-11/780, the Digital Equipment Corporation (DEC) VAX family of computers rivals much more expensive mainframe computers in performance and features the ability to address over 4 GB of virtual memory, hundreds of times the capacity of most minicomputers. Called a “complex instruction set computer,” VAX systems were backward compatible and so preserved the investment owners of previous DEC computers had in software. The success of the VAX family of computers transformed DEC into the second-largest computer company in the world, as VAX systems became the de facto standard computing system for industry, the sciences, engineering, and research.

Atari introduces its Model 400 and 800 computers

research the history of computer

Early Atari 400/800 advertisement

Shortly after delivery of the Atari VCS game console, Atari designs two microcomputers with game capabilities: the Model 400 and Model 800. The 400 served primarily as a game console, while the 800 was more of a home computer. Both faced strong competition from the Apple II, Commodore PET, and TRS-80 computers. Atari's 8-bit computers were influential in the arts, especially in the emerging DemoScene culture of the 1980s and '90s.

Motorola introduces the 68000 microprocessor

research the history of computer

Die shot of Motorola 68000

Image by Pauli Rautakorpi

The Motorola 68000 microprocessor exhibited a processing speed far greater than its contemporaries. This high performance processor found its place in powerful work stations intended for graphics-intensive programs common in engineering.

Texas Instruments TI 99/4 is released

research the history of computer

Texas Instruments TI 99/4 microcomputer

Based around the Texas Instruments TMS 9900 microprocessor running at 3 MHz, the TI 99/4 has one of the fastest CPUs available in a home computer. The TI99/4 had a wide variety of expansion boards, with an especially popular speech synthesis system that could also be used with TI's Speak & Spell educational game. The TI 99/4 sold well and led to a series of TI follow-on machines.

Commodore introduces the VIC-20

research the history of computer

Commodore VIC-20

Commodore releases the VIC-20 home computer as the successor to the Commodore PET personal computer. Intended to be a less expensive alternative to the PET, the VIC-20 was highly successful, becoming the first computer to sell more than a million units. Commodore even used Star Trek television star William Shatner in advertisements.

The Sinclair ZX80 introduced

research the history of computer

Sinclair ZX80

This very small home computer is available in the UK as a kit for £79 or pre-assembled for £99. Inside was a Z80 microprocessor and a built-in BASIC language interpreter. Output was displayed on the user’s home TV screen through use of an adapter. About 50,000 were sold in Britain, primarily to hobbyists, and initially there was a long waiting list for the system.

The Computer Programme debuts on the BBC

research the history of computer

Title card- BBC’s The Computer Programme

The British Broadcasting Corporation’s Computer Literacy Project hoped “to introduce interested adults to the world of computers.” Acorn produces a popular computer, the BBC Microcomputer System, so viewers at home could follow along on their own home computers as they watched the program. The machine was expandable, with ports for cassette storage, serial interface and rudimentary networking. A large amount of software was created for the “BBC Micro,” including educational, productivity, and game programs.

Apollo Computer unveils its first workstation, its DN100

research the history of computer

Apollo DN100

The DN100 is based on the Motorola 68000 microprocessor, high-resolution display and built-in networking - the three basic features of all workstations. Apollo and its main competitor, Sun Microsystems, optimized their machines to run the computer-intensive graphics programs common in engineering and scientific applications. Apollo was a leading innovator in the workstation field for more than a decade, and was acquired by Hewlett-Packard in 1989.

IBM introduces its Personal Computer (PC)

research the history of computer

IBM's brand recognition, along with a massive marketing campaign, ignites the fast growth of the personal computer market with the announcement of its own personal computer (PC). The first IBM PC, formally known as the IBM Model 5150, was based on a 4.77 MHz Intel 8088 microprocessor and used Microsoft´s MS-DOS operating system. The IBM PC revolutionized business computing by becoming the first PC to gain widespread adoption by industry. The IBM PC was widely copied (“cloned”) and led to the creation of a vast “ecosystem” of software, peripherals, and other commodities for use with the platform.

Osborne 1 introduced

research the history of computer

Weighing 24 pounds and costing $1,795, the Osborne 1 is the first mass-produced portable computer. Its price was especially attractive as the computer included very useful productivity software worth about $1,500 alone. It featured a 5-inch display, 64 KB of memory, a modem, and two 5.25-inch floppy disk drives.

Commodore introduces the Commodore 64

research the history of computer

Commodore 64 system

The C64, as it is better known, sells for $595, comes with 64 KB of RAM and features impressive graphics. Thousands of software titles were released over the lifespan of the C64 and by the time it was discontinued in 1993, it had sold more than 22 million units. It is recognized by the 2006 Guinness Book of World Records as the greatest selling single computer of all time.

Franklin releases Apple II “clones”

research the history of computer

Franklin Ace 100 microcomputer

Created almost five years after the original Apple II, Franklin's Ace 1000 main logic board is nearly identical to that in the Apple II+ computer, and other models were later cloned as well. Franklin was able to undercut Apple's pricing even while offering some features not available on the original. Initially, Franklin won a court victory allowing them to continue cloning the machines, but in 1988, Apple won a copyright lawsuit against Franklin, forcing them to stop making Apple II “clones.”

Sun Microsystems is founded

research the history of computer

Sun-1 workstation

When Xerox PARC loaned the Stanford Engineering Department an entire Alto Ethernet network with laser printer, graduate student Andy Bechtolsheim re-designed it into a prototype that he then attached to Stanford’s computer network. Sun Microsystems grows out of this prototype. The roots of the company’s name came from the acronym for Stanford University Network (SUN). The company was incorporated by three 26-year-old Stanford alumni: Bechtolsheim, Vinod Khosla and Scott McNealy. The trio soon attracted UC Berkeley UNIX guru Bill Joy, who led software development. Sun helped cement the model of a workstation having an Ethernet interface as well as high-resolution graphics and the UNIX operating system.

Apple introduces the Lisa computer

research the history of computer

Lisa is the first commercial personal computer with a graphical user interface (GUI). It was thus an important milestone in computing as soon Microsoft Windows and the Apple Macintosh would soon adopt the GUI as their user interface, making it the new paradigm for personal computing. The Lisa ran on a Motorola 68000 microprocessor and came equipped with 1 MB of RAM, a 12-inch black-and-white monitor, dual 5.25-inch floppy disk drives and a 5 MB “Profile” hard drive. Lisa itself, and especially its GUI, were inspired by earlier work at the Xerox Palo Alto Research Center.

Compaq Computer Corporation introduces the Compaq Portable

research the history of computer

Compaq Portable

Advertised as the first 100% IBM PC-compatible computer, the Compaq Portable can run the same software as the IBM PC. With the success of the clone, Compaq recorded first-year sales of $111 million, the most ever by an American business in a single year. The success of the Portable inspired many other early IBM-compatible computers. Compaq licensed the MS-DOS operating system from Microsoft and legally reverse-engineered IBM’s BIOS software. Compaq's success launched a market for IBM-compatible computers that by 1996 had achieved an 83-percent share of the personal computer market.

Apple Computer launches the Macintosh

research the history of computer

Apple Macintosh

Apple introduces the Macintosh with a television commercial during the 1984 Super Bowl, which plays on the theme of totalitarianism in George Orwell´s book 1984 . The ad featured the destruction of “Big Brother” – a veiled reference to IBM -- through the power of personal computing found in a Macintosh. The Macintosh was the first successful mouse-driven computer with a graphical user interface and was based on the Motorola 68000 microprocessor. Its price was $2,500. Applications that came as part of the package included MacPaint, which made use of the mouse, and MacWrite, which demonstrated WYSIWYG (What You See Is What You Get) word processing.

IBM releases its PC Jr. and PC/AT

research the history of computer

The PC Jr. is marketed as a home computer but is too expensive and limited in performance to compete with many of the other machines in that market. It’s “chiclet” keyboard was also criticized for poor ergonomics. While the PC Jr. sold poorly, the PC/AT sold in the millions. It offered increased performance and storage capacity over the original IBM PC and sold for about $4,000. It also included more memory and accommodated high-density 1.2-megabyte 5 1/4-inch floppy disks.

PC's Limited is founded

research the history of computer

PC’s Limited founder Michael Dell

In 1984, Michael Dell creates PC's Limited while still a student of the University of Texas at Austin. The dorm-room headquartered company sold IBM PC-compatible computers built from stock components. Dell dropped out of school to focus on his business and in 1985, the company produced the first computer of its own design, the Turbo PC, which sold for $795. By the early 1990s, Dell became one of the leading computer retailers.

The Amiga 1000 is released

research the history of computer

Music composition on the Amiga 1000

Commodore’s Amiga 1000 is announced with a major event at New York's Lincoln Center featuring celebrities like Andy Warhol and Debbie Harry of the musical group Blondie. The Amiga sold for $1,295 (without monitor) and had audio and video capabilities beyond those found in most other personal computers. It developed a very loyal following while add-on components allowed it to be upgraded easily. The inside of the Amiga case is engraved with the signatures of the Amiga designers, including Jay Miner as well as the paw print of his dog Mitchy.

Compaq introduces the Deskpro 386 system

research the history of computer

Promotional shot of the Compaq Deskpro 386s,

Compaq beats IBM to the market when it announces the Deskpro 386, the first computer on the market to use Intel´s new 80386 chip, a 32-bit microprocessor with 275,000 transistors on each chip. At 4 million operations per second and 4 kilobytes of memory, the 80386 gave PCs as much speed and power as older mainframes and minicomputers.

The 386 chip brought with it the introduction of a 32-bit architecture, a significant improvement over the 16-bit architecture of previous microprocessors. It had two operating modes, one that mirrored the segmented memory of older x86 chips, allowing full backward compatibility, and one that took full advantage of its more advanced technology. The new chip made graphical operating environments for IBM PC and PC-compatible computers practical. The architecture that allowed Windows and IBM OS/2 has remained in subsequent chips.

IBM releases the first commercial RISC-based workstation

research the history of computer

Reduced instruction set computers (RISC) grow out of the observation that the simplest 20 percent of a computer´s instruction set does 80 percent of the work. The IBM PC-RT had 1 MB of RAM, a 1.2-megabyte floppy disk drive, and a 40 MB hard drive. It performed 2 million instructions per second, but other RISC-based computers worked significantly faster.

The Connection Machine is unveiled

research the history of computer

Connection Machine CM-1

Daniel Hillis of Thinking Machines Corporation moves artificial intelligence a step forward when he develops the controversial concept of massive parallelism in the Connection Machine CM-1. The machine used up to 65,536 one-bit processors and could complete several billion operations per second. Each processor had its own small memory linked with others through a flexible network that users altered by reprogramming rather than rewiring. The machine´s system of connections and switches let processors broadcast information and requests for help to other processors in a simulation of brain-like associative recall. Using this system, the machine could work faster than any other at the time on a problem that could be parceled out among the many processors.

Acorn Archimedes is released

research the history of computer

Acorn Archimedes microcomputer

Acorn's ARM RISC microprocessor is first used in the company's Archimedes computer system. One of Britain's leading computer companies, Acorn continued the Archimedes line, which grew to nearly twenty different models, into the 1990s. Acorn spun off ARM as its own company to license microprocessor designs, which in turn has transformed mobile computing with ARM’s low power, high-performance processors and systems-on-chip (SoC).

IBM introduces its Personal System/2 (PS/2) machines

research the history of computer

The first IBM system to include Intel´s 80386 chip, the company ships more than 1 million units by the end of the first year. IBM released a new operating system, OS/2, at the same time, allowing the use of a mouse with IBM PCs for the first time. Many credit the PS/2 for making the 3.5-inch floppy disk drive and video graphics array (VGA) standard for IBM computers. The system was IBM's response to losing control of the PC market with the rise of widespread copying of the original IBM PC design by “clone” makers.

Apple co-founder Steve Jobs unveils the NeXT Cube

research the history of computer

Steve Jobs, forced out of Apple in 1985, founds a new company – NeXT. The computer he created, an all-black cube was an important innovation. The NeXT had three Motorola microprocessors and 8 MB of RAM. Its base price was $6,500. Some of its other innovations were the inclusion of a magneto-optical (MO) disk drive, a digital signal processor and the NeXTSTEP programming environment (later released as OPENSTEP). This object-oriented multitasking operating system was groundbreaking in its ability to foster rapid development of software applications. OPENSTEP was used as one of the foundations for the new Mac OS operating system soon after NeXT was acquired by Apple in 1996.

Laser 128 is released

research the history of computer

Laser 128 Apple II clone

VTech, founded in Hong Kong, had been a manufacturer of Pong-like games and educational toys when they introduce the Laser 128 computer. Instead of simply copying the basic input output system (BIOS) of the Apple II as Franklin Computer had done, they reversed engineered the system and sold it for US $479, a much lower price than the comparable Apple II. While Apple sued to remove the Laser 128 from the market, they were unsuccessful and the Laser remained one of the very few Apple “clones” for sale.

Intel introduces the 80486 microprocessor

research the history of computer

Intel 80486 promotional photo

Intel released the 80486 microprocessor and the i860 RISC/coprocessor chip, each of which contained more than 1 million transistors. The RISC microprocessor had a 32-bit integer arithmetic and logic unit (the part of the CPU that performs operations such as addition and subtraction), a 64-bit floating-point unit, and a clock rate of 33 MHz.

The 486 chips remained similar in structure to their predecessors, the 386 chips. What set the 486 apart was its optimized instruction set, with an on-chip unified instruction and data cache and an optional on-chip floating-point unit. Combined with an enhanced bus interface unit, the microprocessor doubled the performance of the 386 without increasing the clock rate.

Macintosh Portable is introduced

research the history of computer

Macintosh Portable

Apple had initially included a handle in their Macintosh computers to encourage users to take their Macs on the go, though not until five years after the initial introduction does Apple introduce a true portable computer. The Macintosh Portable was heavy, weighing sixteen pounds, and expensive (US$6,500). Sales were weaker than projected, despite being widely praised by the press for its active matrix display, removable trackball, and high performance. The line was discontinued less than two years later.

Intel's Touchstone Delta supercomputer system comes online

research the history of computer

Intel Touchstone Delta supercomputer

Reaching 32 gigaflops (32 billion floating point operations per second), Intel’s Touchstone Delta has 512 processors operating independently, arranged in a two-dimensional communications “mesh.” Caltech researchers used this supercomputer prototype for projects such as real-time processing of satellite images, and for simulating molecular models in AIDS research. It would serve as the model for several other significant multi-processor systems that would be among the fastest in the world.

Babbage's Difference Engine #2 is completed

research the history of computer

The Difference Engine #2 at the Science Museum, London

Based on Charles Babbage's second design for a mechanical calculating engine, a team at the Science Museum in London sets out to prove that the design would have worked as planned. Led by curator Doron Swade the team built Babbage’s machine in six years, using techniques that would have been available to Babbage at the time, proving that Babbage’s design was accurate and that it could have been built in his day.

PowerBook series of laptops is introduced

research the history of computer

PowerBook 100 laptop computer

Apple's Macintosh Portable meets with little success in the marketplace and leads to a complete redesign of Apple's line of portable computers. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. The PowerBook 100 was the entry-level machine, while the PowerBook 140 was more powerful and had a larger memory. The PowerBook 170 was the high-end model, featuring an active matrix display, faster processor, as well as a floating point unit. The PowerBook line of computers was discontinued in 2006.

DEC announces Alpha chip architecture

research the history of computer

DEC Alpha chip die-shot

Designed to replace the 32-bit VAX architecture, the Alpha is a 64-bit reduced instruction set computer (RISC) microprocessor. It was widely used in DEC's workstations and servers, as well as several supercomputers like the Chinese Sunway Blue Light system, and the Swiss Gigabooster. The Alpha processor designs were eventually acquired by Compaq, which, along with Intel, phased out the Alpha architecture in favor of the HP/Itanium microprocessor.

Intel Paragon is operational

research the history of computer

Intel Paragon system

Based on the Touchstone Delta computer Intel had built at Caltech, the Paragon is a parallel supercomputer that uses 2,048 (later increased to more than four thousand) Intel i860 processors. More than one hundred Paragons were installed over the lifetime of the system, each costing as much as five million dollars. The Paragon at Caltech was named the fastest supercomputer in the world in 1992. Paragon systems were used in many scientific areas, including atmospheric and oceanic flow studies, and energy research.

Apple ships the first Newton

research the history of computer

The Apple Newton Personal Digital Assistant

Apple enters the handheld computer market with the Newton. Dubbed a “Personal Digital Assistant” by Apple President John Sculley in 1992, the Newton featured many of the features that would define handheld computers in the following decades. The handwriting recognition software was much maligned for inaccuracy. The Newton line never performed as well as hoped and was discontinued in 1998.

Intel's Pentium microprocessor is released

research the history of computer

HP Netserver LM, one of the first to use Intel's Pentium

The Pentium is the fifth generation of the ‘x86’ line of microprocessors from Intel, the basis for the IBM PC and its clones. The Pentium introduced several advances that made programs run faster such as the ability to execute several instructions at the same time and support for graphics and music.

RISC PC is released

research the history of computer

Acorn RISC PC

Replacing their Archimedes computer, the RISC PC from UK's Acorn Computers uses the ARMv3 RISC microprocessor. Though it used a proprietary operating system, RISC OS, the RISC PC could run PC-compatible software using the Acorn PC Card. The RISC PC was used widely in UK broadcast television and in music production.

BeBox is released

research the history of computer

BeBox computer

Be, founded by former Apple executive Jean Louis Gassée and a number of former Apple, NeXT and SUN employees, releases their only product – the BeBox. Using dual PowerPC 603 CPUs, and featuring a large variety of peripheral ports, the first devices were used for software development. While it did not sell well, the operating system, Be OS, retained a loyal following even after Be stopped producing hardware in 1997 after less than 2,000 machines were produced.

IBM releases the ThinkPad 701C

research the history of computer

IBM ThinkPad 701C

Officially known as the Track Write, the automatically expanding full-sized keyboard used by the ThinkPad 701 is designed by inventor John Karidis. The keyboard was comprised of three roughly triangular interlocking pieces, which formed a full-sized keyboard when the laptop was opened -- resulting in a keyboard significantly wider than the case. This keyboard design was dubbed “the Butterfly.” The need for such a design was lessened as laptop screens grew wider.

Palm Pilot is introduced

research the history of computer

Ed Colligan, Donna Dubinsky, and Jeff Hawkins

Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, originally created software for the Casio Zoomer personal data assistant. The first generation of Palm-produced devices, the Palm 1000 and 5000, are based around a Motorola microprocessor running at 16MHz, and uses a special gestural input language called “Graffiti,” which is quick to learn and fast. Palm could be connected to a PC or Mac using a serial port to synchronize – “sync” – both computer and Palm. The company called it a ‘connected organizer’ rather than a PDA to emphasize this ability.

Sony Vaio series is begun

research the history of computer

Sony Vaio laptop

Sony had manufactured and sold computers in Japan, but the VAIO signals their entry into the global computer market. The first VAIO, a desktop computer, featured an additional 3D interface on top of the Windows 95 operating system as a way of attracting new users. The VAIO line of computers would be best known for laptops were designed with communications and audio-video capabilities at the forefront, including innovative designs that incorporated TV and radio tuners, web cameras, and handwriting recognition. The line was discontinued in 2014.

ASCI Red is operational

research the history of computer

ASCI Red supercomputers

The Advanced Strategic Computing Initiative (ASCI) needed a supercomputer to help with the maintenance of the US nuclear arsenal following the ban on underground nuclear testing. The ASCI Red, based on the design of the Intel Paragon, was built by IBM and delivered to Sandia National Laboratories. Until the year 2000, it was the world's fastest supercomputer, able to achieve peak performance of 1.3 teraflops, (about 1.3 trillion calculations per second).

Linux-based Supercomputing

research the history of computer

Linux Supercomputer

The first supercomputer using the Linux operating system, consumer, off-the shelf parts, and a high-speed, low-latency interconnection network, was developed by David A. Bader while at the University of New Mexico. From this successful prototype design, Bader led the development of “RoadRunner”, the first Linux supercomputer for open use by the national science and engineering community via the National Science Foundation's National Technology Grid. RoadRunner was put into production use in April 1999. Within a decade this design became the predominant architecture for all major supercomputers in the world.

The iMac, a range of all-in-one Macintosh desktop computers, is launched

research the history of computer

iMac poster

Apple makes a splash with its Bondi Blue iMac, which sells for about $1,300. Customers got a machine with a 233-MHz G3 processor, 4GB hard drive, 32MB of RAM, a CD-ROM drive, and a 15" monitor. The machine was noted for its ease-of-use and included a 'manual' that contained only a few pictures and less than 20 words. As Apple’s first new product under the leadership of a returning Steve Jobs, many consider this the most significant step in Apple's return from near-bankruptcy in the middle 1990s.

First camera phone introduced

research the history of computer

Sony-built J-Phone J-SH04

Japan's SoftBank introduces the first camera phone, the J-Phone J-SH04; a Sharp-manufactured digital phone with integrated camera. The camera had a maximum resolution of 0.11 megapixels a 256-color display, and photos could be shared wirelessly. The J-Phone line would quickly expand, releasing a flip-phone version just a month later. Cameras would become a significant part of most phones within a year, and several countries have even passed laws regulating their use.

Earth Simulator is world's fastest supercomputer

research the history of computer

Earth Simulator Supercomputer

Developed by the Japanese government to create global climate models, the Earth Simulator is a massively parallel, vector-based system that costs nearly 60 billion yen (roughly $600 million at the time). A consortium of aerospace, energy, and marine science agencies undertook the project, and the system was built by NEC around their SX-6 architecture. To protect it from earthquakes, the building housing it was built using a seismic isolation system that used rubber supports. The Earth Simulator was listed as the fastest supercomputer in the world from 2002 to 2004.

Handspring Treo is released

research the history of computer

Colligan, Dubinsky, Hawkins (left to right)

Leaving Palm Inc., Ed Colligan, Donna Dubinsky, and Jeff Hawkins found Handspring. After retiring their initial Visor series of PDAs, Handspring introduced the Treo line of smartphones, designed with built-in keyboards, cameras, and the Palm operating system. The Treo sold well, and the line continued until Handspring was purchased by Palm in 2003.

PowerMac G5 is released

research the history of computer

PowerMac G5 tower computer

With a distinctive anodized aluminum case, and hailed as the first true 64-bit personal computer, the Apple G5 is the most powerful Macintosh ever released to that point. While larger than the previous G4 towers, the G5 had comparatively limited space for expansion. Virginia Tech used more than a thousand PowerMac G5s to create the System X cluster supercomputer, rated #3 in November of that year on the world’s TOP500 fastest computers.

research the history of computer

Arduino starter kit

Harkening back to the hobbyist era of personal computing in the 1970s, Arduino begins as a project of the Interaction Design Institute, Ivrea, Italy. Each credit card-sized Arduino board consisted of an inexpensive microcontroller and signal connectors which made Arduinos ideal for use in any application connecting to or monitoring the outside world. The Arduino used a Java-based integrated development environment and users could access a library of programs, called “Wiring,” that allowed for simplified programming. Arduino soon became the main computer platform of the worldwide “Maker” movement.

Lenovo acquires IBM's PC business

research the history of computer

IBM and Lenovo logos

Nearly a quarter century after IBM launched their PC in 1981, they had become merely another player in a crowded marketplace. Lenovo, China's largest manufacturer of PCs, purchased IBM's personal computer business in 2005, largely to gain access to IBM's ThinkPad line of computers and sales force. Lenovo became the largest manufacturer of PCs in the world with the acquisition, later also acquiring IBM's server line of computers.

NASA Ames Research Center supercomputer Columbia

research the history of computer

Columbia Supercomputer system made up of SGI Altix

Named in honor of the space shuttle which broke-up on re-entry, the Columbia supercomputer is an important part of NASA's return to manned spaceflight after the 2003 disaster. Columbia was used in space vehicle analysis, including studying the Columbia disaster, but also in astrophysics, weather and ocean modeling. At its introduction, it was listed as the second fastest supercomputer in the world and this single system increased NASA's supercomputing capacity 10-fold. The system was kept at NASA Ames Research Center until 2013, when it was removed to make way for two new supercomputers.

One Laptop Per Child initiative begins

research the history of computer

OLPC XO laptop computer

At the 2006 World Economic Forum in Davos, Switzerland, the United Nations Development Program (UNDP) announces it will create a program to deliver technology and resources to targeted schools in the least developed countries. The project became the One Laptop per Child Consortium (OLPC) founded by Nicholas Negroponte, the founder of MIT's Media Lab. The first offering to the public required the buyer to purchase one to be given to a child in the developing world as a condition of acquiring a machine for themselves. By 2011, over 2.4 million laptops had been shipped.

The Amazon Kindle is released

research the history of computer

Amazon Kindle

Many companies have attempted to release electronic reading systems dating back to the early 1990s. Online retailer Amazon released the Kindle, one of the first to gain a large following among consumers. The first Kindle featured wireless access to content via Amazon.com, along with an SD card slot allowing increased storage. The first release proved so popular there was a long delay in delivering systems on release. Follow-on versions of the Kindle added further audio-video capabilities.

The Apple iPhone is released

research the history of computer

Apple iPhone

Apple launches the iPhone - a combination of web browser, music player and cell phone - which could download new functionality in the form of "apps" (applications) from the online Apple store. The touchscreen enabled smartphone also had built-in GPS navigation, high-definition camera, texting, calendar, voice dictation, and weather reports.

The MacBook Air is released

research the history of computer

Steve Jobs introducing MacBook Air

Apple introduces their first ultra notebook – a light, thin laptop with high-capacity battery. The Air incorporated many of the technologies that had been associated with Apple's MacBook line of laptops, including integrated camera, and Wi-Fi capabilities. To reduce its size, the traditional hard drive was replaced with a solid-state disk, the first mass-market computer to do so.

IBM's Roadrunner supercomputer is completed

research the history of computer

Computer-enhanced image of IBM’s Roadrunner

The Roadrunner is the first computer to reach a sustained performance of 1 petaflop (one thousand trillion floating point operations per second). It used two different microprocessors: an IBM POWER XCell L8i and AMD Opteron. It was used to model the decay of the US nuclear arsenal, analyze financial data, and render 3D medical images in real-time. An offshoot of the POWER XCell8i chip was used as the main processor in the Sony PlayStation 3 game console.

Jaguar Supercomputer at Oak Ridge upgraded

Originally a Cray XT3 system, the Jaguar is a massively parallel supercomputer at Oak Ridge National Laboratory, a US science and energy research facility. The system cost more than $100 million to create and ran a variation of the Linux operating system with up to 10 petabytes of storage. The Jaguar was used to study climate science, seismology, and astrophysics applications. It was the fastest computer in the world from November 2009 to June 2010.

Apple Retina Display

research the history of computer

Introduction of the iPhone 4 with retina display

Since the release of the Macintosh in 1984, Apple has placed emphasis on high-resolution graphics and display technologies. In 2012, Apple introduced the Retina display for the MacBook Pro laptop and iPad tablet. With a screen resolution of up to 400 pixels-per-inch (PPI), Retina displays approached the limit of pixel visibility to the human eye. The display also used In Plane Switching (IPS) technology, which allowed for a wider viewing angle and improved color accuracy. The Retina display became standard on most of the iPad, iPhone, MacBook, and Apple Watch product lines.

China's Tianhe supercomputers are operational

research the history of computer

Tianhe-1A Supercomputer

With a peak speed of over a petaflop (one thousand trillion calculations per second), the Tianhe 1 (translation: Milky Way 1) is developed by the Chinese National University of Defense Technology using Intel Xeon processors combined with AMD graphic processing units (GPUs). The upgraded and faster Tianhe-1A used Intel Xeon CPUs as well, but switched to nVidia's Tesla GPUs and added more than 2,000 Fei-Tang (SPARC-based) processors. The machines were used by the Chinese Academy of Sciences to run massive solar energy simulations, as well as some of the most complex molecular studies ever undertaken.

The Apple iPad is released

research the history of computer

Steve Jobs introducing the iPad

The iPad combines many of the popular capabilities of the iPhone, such as built-in high-definition camera, access to the iTunes Store, and audio-video capabilities, but with a nine-inch screen and without the phone. Apps, games, and accessories helped spur the popularity of the iPad and led to its adoption in thousands of different applications from movie making, creating art, making music, inventory control and point-of-sale systems, to name but a few.

IBM Sequoia is delivered to Lawrence Livermore Labs

Built by IBM using their Blue Gene/Q supercomputer architecture, the Sequoia system is the world's fastest supercomputer in 2012. Despite using 98,304 PowerPC chips, Sequoia's relatively low power usage made it unusually efficient. Scientific and defense applications included studies of human electrophysiology, nuclear weapon simulation, human genome mapping, and global climate change.

Nest Learning Thermostat is Introduced

research the history of computer

Nest Learning Thermostat

The Nest Learning Thermostat is an early product made for the emerging “Internet of Things,” which envisages a world in which common everyday devices have network connectivity and can exchange information or be controlled. The Nest allowed for remote access to a user’s home’s thermostat by using a smartphone or tablet and could also send monthly power consumption reports to help save on energy bills. The Nest would remember what temperature users preferred by ‘training’ itself to monitor daily use patterns for a few days then adopting that pattern as its new way of controlling home temperature.

Raspberry Pi, a credit-card-size single board computer, is released as a tool to promote science education

research the history of computer

Raspberry Pi computer

Conceived in the UK by the Raspberry Pi Foundation, this credit card-sized computer features ease of use and simplicity making it highly popular with students and hobbyists. In October 2013, the one millionth Raspberry Pi was shipped. Only one month later, another one million Raspberry Pis were delivered. The Pi weighed only 45 grams and initially sold for only $25-$35 U.S. Dollars.

University of Michigan Micro Mote is Completed

research the history of computer

The University of Michigan Micro Mote is the smallest computer in the world. The motes, measuring just over 1 cubic millimeter, were powered by a tiny battery and could collect sunlight through a photocell, enough to supply the tiny amount of energy a mote consumes -- about one trillionth of a watt. Three types of mote were initially introduced: two that measure temperature or pressure, and one that could take images. Since then, motes that recognize different sounds, measure light, track position, and record brain activity have been developed.

Motes are also known as "smart dust," since their tiny size and low cost make them inexpensive enough to "sprinkle" in the real world as sensors. An ecologist, for example, could sprinkle thousands of motes from the air onto a field and measure soil and air temperature, moisture, and sunlight, yielding accurate real-time data about the environment.

Apple Watch

research the history of computer

Apple Store’s display of newly introduced Apple Watches

Building a computer into the watch form factor has been attempted many times but the release of the Apple Watch leads to a new level of excitement. Incorporating a version of Apple's iOS operating system, as well as sensors for environmental and health monitoring, the Apple Watch was designed to be incorporated into the Apple environment with compatibility with iPhones and Mac Books. Almost a million units were ordered on the day of release. The Watch was received with great enthusiasm, but critics took issue with the somewhat limited battery life and high price.

Exhibit Design and Development Team

Exhibit content team.

IMAGES

  1. Short History of Computing

    research the history of computer

  2. History Of Computers: Timeline, I/O Devices and Networking

    research the history of computer

  3. The brief description of history of computer

    research the history of computer

  4. (PDF) History of the Computer

    research the history of computer

  5. A Brief History of Computers [Infographic]

    research the history of computer

  6. HISTORY OF COMPUTER- A Timeline|A Brief History of the Computer

    research the history of computer

VIDEO

  1. Master Computer Basics: History, Generations & Components

  2. History of Computer Hardware

  3. Generation of computer

  4. History of Computer in Pharmaceutical Research & Statistical Modelling|Computer Aided Drug Delivery

  5. Computer History / कंप्यूटर का इतिहास

  6. Computer

COMMENTS

  1. Computer

    The history of the solving of these problems is the history of the computer. That history is covered in this section, and links are provided to entries on many of the individuals and companies mentioned. In addition, see the articles computer science and supercomputer.

  2. History of computers: A brief timeline

    The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.

  3. Computer

    A computer is a programmable device for processing, storing, and displaying information. Learn more in this article about modern digital electronic computers and their design, constituent parts, and applications as well as about the history of computing.

  4. Computers

    The success of the VAX family of computers transformed DEC into the second-largest computer company in the world, as VAX systems became the de facto standard computing system for industry, the sciences, engineering, and research.

  5. History of computers

    Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations ...

  6. History of Computers: A Brief Timeline

    History of Computers: A Brief Timeline Discover the fascinating history with our history of computers timeline, featuring key hardware breakthroughs from the earliest developments to recent innovations. Explore milestones such as Charles Babbage's Analytical Engine, ENIAC, the transistor's invention, the IBM PC's introduction, and the revolutionary impact of artificial intelligence. This ...

  7. History of computing

    The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables.

  8. History of computer science

    The history of computer science began long before the modern discipline of computer science, usually appearing in forms like mathematics or physics. Developments in previous centuries alluded to the discipline that we now know as computer science. [ 1] This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of ...

  9. The Origin of Computing

    The Origin of Computing. The information age began with the realization that machines could emulate the power of minds. In the standard story, the computer's evolution has been brisk and short ...

  10. History Of Computers With Timeline [2023 Update]

    The history of computers began thousands of years ago with analog computing devices. Today, you won't believe what computers are capable of.

  11. The Modern History of Computing

    Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term computing machine, used increasingly from the 1920s, refers to any machine that does the work of a ...

  12. History Of Computers: Timeline, I/O Devices and Networking

    Explore the fascinating history of computers from earlier mechanical technology days to modern self-assist tech and build an expertise in your IT domain.

  13. History of Computers

    and S. Levy, Hackers: Heroes of the Computer Revolution, Doubleday & Co., 1984. Google Scholar Papers and articles dealing with the history of computing appear from time to time. AFIPS publishes a journal, the Annals of the History of Computing, and the ACM has sponsored R. L. Wexelblat (ed) A History of Programming Languages, Academic Press, 1981.

  14. The history of computing is both evolution and revolution

    The first in our series looking at the changes that have been made in computing and other areas in the 60 years since the first computer in an Australian university was switched on.

  15. A Short History of Computers

    The history of computers is weird and wonderful. What started as an abstract philosophical quest ended up setting the course for society for over a century and continues to be one of the most profound parts of modern life. The goal of this chapter is to trace an...

  16. A Look at the History of Computers

    Prior to the advent of microprocessors, a number of notable scientists and mathematicians helped lay the groundwork for the computers we use today.

  17. Invention of the PC

    Invention of the PC: Postwar Innovations ENIAC and other early computers proved to many universities and corporations that the machines were worth the tremendous investment of money, space and ...

  18. History of Computing

    Abstract. This chapter gives an overview of the history of computing science in hardware, software, and networking, covering prehistoric (prior to 1946) computing devices and computing pioneers since the Abacus. The emergency of social and ethical problems in computing is discussed via the history of computer crimes which started with the ...

  19. The History of Computing: A Very Short Introduction

    This book describes the central events, machines, and people in the history of computing, and traces how innovation has brought us from pebbles used for counting, to the modern age of the computer. It has a strong historiographical theme that offers a new perspective on how to understand the historical narratives we have constructed, and ...

  20. History of Computers

    In the year 1937, major changes began in the history of computers when Howard Aiken planned to develop a machine that could perform large calculations or calculations involving large numbers. In the year 1944, Mark I computer was built as a partnership between IBM and Harvard.

  21. History of Computers

    Charles Babbage is known as the Father of the modern computer (even though none of his computers worked or were even constructed in their entirety). He first designed plans to build, what he called the Automatic Difference Engine. It was designed to help in the construction of mathematical tables for navigation.

  22. Explore

    Explore Our Collection CHM stewards the world's leading collection and archive chronicling the history and impact of computing and technological innovation on the human experience. From machines to source code, lab notebooks to business plans, our collecting scope is expansive with unique perspectives into the online world, entrepreneurship, and software.

  23. History of computer and its generations.

    The history of computer dated back to the period of scientific revolution (i.e. 1543 - 1678). The calculating machine invented by Blaise Pascal in 1642 and. that of Goffried Liebnits marked the ...

  24. Unveiling the Evolution of Computers: A Journey Through Time

    Significance of Computers in Modern Society In the vast tapestry of human innovation and progress, few inventions have had as profound an impact as the computer. A marvel of human ingenuity, the computer stands as a testament to humanity's ceaseless pursuit of knowledge, problem-solving, and the quest for technological advancement. From its humble origins as rudimentary calculating devices to ...

  25. History of artificial intelligence

    The IBM 702: a computer used by the first generation of AI researchers. The earliest research into thinking machines was inspired by a confluence of ideas that became prevalent in the late 1930s, 1940s, and early 1950s. Recent research in neurology had shown that the brain was an electrical network of neurons that fired in all-or-nothing pulses.

  26. Buy the Dip or Stay Away? Top Analysts Weigh in on Super Micro Computer

    Super Micro Computer (NASDAQ:SMCI) shares had been on a stellar run, capitalizing on the AI boom and turning the company into one of the past year's biggest success stories. However, two events ...

  27. Evaluation Framework for Feedback Generation Methods in Skeletal

    The application of machine-learning solutions to movement assessment from skeleton videos has attracted significant research attention in recent years. This advancement has made rehabilitation at home more accessible, utilizing movement assessment algorithms that can operate on affordable equipment for human pose detection and analysis from 2D or 3D videos. While the primary objective of ...

  28. Computers

    The Institute of Advanced Study (IAS) computer is a multi-year research project conducted under the overall supervision of world-famous mathematician John von Neumann. The notion of storing both data and instructions in memory became known as the 'stored program concept' to distinguish it from earlier methods of instructing a computer.