Who Invented the First Computer?

Key Takeaways

  • The concept of a computer dates back to Charles Babbage's mechanical difference and analytical engines, but the first electronic computer was the brainchild of Dr. John Vincent Atanasoff and his graduate student Clifford Berry, resulting in the Atanasoff-Berry Computer (ABC) by 1942.
  • World War II accelerated computer development, leading to machines like ENIAC for artillery calculations and Colossus for code-breaking; by 1951, the first commercial computer, UNIVAC, was built for the U.S. Census Bureau.
  • The evolution of personal computers began with prototypes like Hewlett-Packard's HP 9100A scientific calculator, Apple's Apple I and Apple II, and culminated in IBM's 5150 Personal Computer in 1981, which became a staple in businesses globally.

We could argue that the first computer was the abacus or its descendant, the slide rule, invented by William Oughtred in 1622. But many people consider English mathematician Charles Babbage's analytical engine to be the first computer resembling today's modern machines.

Before Babbage came along, a "computer" was a person, someone who literally sat around all day, adding and subtracting numbers and entering the results into tables. The tables then appeared in books, so other people could use them to complete tasks, such as launching artillery shells accurately or calculating taxes.

In fact, Babbage wrote that he was daydreaming over logarithmic tables during his time at Cambridge, sometime around 1812-1813, when he first imagined that a machine could do the job of a human computer. In July 1822, Babbage wrote a letter to the Royal Society proposing the idea that machines could do calculations based on a "method of differences." The Royal Society was intrigued and agreed to fund development on the idea. The first machine design that came out of these efforts was Babbage's first difference engine.

It was, in fact, a mammoth number-crunching project that inspired Babbage in the first place. In 1792 the French government had appointed Gaspard de Prony to supervise the creation of the Cadastre, a set of logarithmic and trigonometric tables. The French wanted to standardize measurements in the country and planned to use the tables to aid in those efforts to convert to the metric system. De Prony was in turn inspired by Adam Smith's famous work "Wealth of Nations." Smith wrote about how the division of labor improved efficiency when manufacturing pins. De Prony wanted to apply the division of labor to his mathematical project.

Unfortunately, although the 18 volumes of tables – plus one more describing mathematical procedures – were eventually completed, they were never published.

In 1819, Babbage visited the City of Light and viewed the unpublished manuscript with page after page of tables. If only, he wondered, there was a way to produce such tables faster, with less manpower and fewer mistakes. He thought of the many marvels generated by the Industrial Revolution. If creative and hardworking inventors could develop the cotton gin and the steam locomotive, then why not a machine to make calculations?

Babbage returned to England and decided to build just such a machine. His first vision was something he dubbed the difference engine, which worked on the principle of finite differences, or making complex mathematical calculations by repeated addition without using multiplication or division. He secured 1,500 pounds from the English government in 1823 and hired engineer Joseph Clement to begin construction on the difference engine.
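To make the idea of finite differences concrete, here is a minimal Python sketch of the procedure the difference engine mechanized: tabulating a polynomial using nothing but repeated addition. The polynomial f(x) = x^2 + x + 41 and the length of the table are arbitrary choices for illustration, not figures from Babbage's design.

    # Tabulate a polynomial by the method of finite differences:
    # once the first few values are known, every later value is produced
    # purely by cascaded additions, which is what the engine's gear columns did.
    def difference_table(seed_values, steps):
        # Build the column of leading differences from the seed values.
        diffs = []
        row = list(seed_values)
        while len(row) > 1:
            diffs.append(row[0])
            row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])  # constant top difference for a polynomial

        table = []
        for _ in range(steps):
            table.append(diffs[0])
            for i in range(len(diffs) - 1):   # addition only, no multiplication
                diffs[i] += diffs[i + 1]
        return table

    # Seed with f(0), f(1), f(2) for f(x) = x**2 + x + 41, then extend to ten values.
    seed = [41, 43, 47]
    print(difference_table(seed, 10))
    # [41, 43, 47, 53, 61, 71, 83, 97, 113, 131]

Note that once the seed values are in place, no multiplication appears anywhere in the loop; producing each new table entry by addition alone is the whole point of the method.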

Clement was a well-respected engineer and suggested improvements to Babbage, who allowed Clement to implement some of his ideas. Unfortunately, in 1833 the two had a falling out over the terms of their arrangement. Clement quit, ending his work on the difference engine.

But, as you might have guessed, the story doesn't end there.

Charles Babbage and the Analytical Engine

By the time Clement packed up his tools, Babbage had already started thinking of an even grander idea — the analytical engine, a new kind of mechanical computer that could make even more complex calculations, including multiplication and division. The British government, however, cut his funding, which was, after all, intended to produce the difference engine. The analytical engine is what so many people think of as the first computer.

The basic parts of the analytical engine resemble the components of any computer sold on the market today. It featured two hallmarks of any modern machine: a central processing unit, or CPU, and memory. Babbage, of course, didn't use those terms. He called the CPU the "mill." Memory was known as the "store." He also had a device — the "reader" — to input instructions, as well as a way to record, on paper, results generated by the machine. Babbage called this output device a printer, the precursor of inkjet and laser printers so common today.

Babbage's new invention existed almost entirely on paper. He kept voluminous notes and sketches about his computers — nearly 5,000 pages' worth — and although he never built a single production model of the analytical engine, he had a clear vision about how the machine would look and work. Borrowing the same technology used by the Jacquard loom, a weaving machine developed in 1804-05 that made it possible to create a variety of cloth patterns automatically, data would be entered on punched cards. Up to 1,000 50-digit numbers could be held in the computer's store. Punched cards would also carry the instructions, which the machine could execute out of sequential order. A single attendant would oversee the whole operation, but steam would power it, turning cranks, moving cams and rods and spinning gearwheels.

Unfortunately, the technology of the day couldn't deliver on Babbage's ambitious designs. It wasn't until 1991 that his particular ideas were finally translated into a functioning computer. That's when the Science Museum in London built, to Babbage's exact specifications, his difference engine. It stands 11 feet long and 7 feet tall (more than 3 meters long and 2 meters tall), contains 8,000 moving parts and weighs 5 tons (4.5 metric tons). A copy of the machine was built and shipped to the Computer History Museum in Mountain View, California, where it remained on display until December 2010. Neither device would fit on a desktop, but they are no doubt the first computers and precursors to the modern PC, and ultimately to everything computing has made possible since, including the World Wide Web.

If Charles Babbage was the genius behind the analytical engine, then Augusta Ada Byron, or by her married name Ada Lovelace, was the publicist (and, arguably, the very first computer programmer). She met Babbage at a party when she was 17 and became fascinated by the mathematician's computer engine. From that chance meeting grew a strong, dynamic relationship. Ada discussed Babbage's ideas with him and, because she was gifted in mathematics, offered her own insights. In 1843, she published an influential set of notes describing Babbage's analytical engine. Ada also added in some sage predictions, speculating that Babbage's mechanical computers might one day "act upon other things besides numbers" and "compose elaborate and scientific pieces of music of any degree of complexity."

There are many differences between Babbage's difference and analytical engines and the machine sitting on your desktop now. Those machines are mechanical and yours is electronic. So, who invented the first electronic computer? As with most inventions, the digital computer was the work of many different people.

Like Babbage before him, Iowa State College (now Iowa State University) mathematics and physics professor Dr. John Vincent Atanasoff required a lot of computing power for his work. Even though he had one of the best calculators of its day, it still took a long time to do calculations. Also like Babbage, Atanasoff wanted to see if he could do better. In 1937 he went for a long drive to clear his mind, and by the time he stopped for a drink he had decided what kind of device he would build. His machine would use electricity. And rather than the base-10 standard, his computer would use the binary system our modern computers use.

Iowa State provided funding for the machine, and Atanasoff hired an exceptionally talented graduate student, Clifford Berry, to help him realize his vision. They showed the prototype to Iowa State officials, who then granted Atanasoff and Berry funding to build the real thing. By 1942 the Atanasoff-Berry Computer (or ABC) was ready.

World War II spurred the creation of many new computers to solve specific problems. One was ENIAC, designed to compute artillery range tables. Another was Colossus, used to break German codes at Bletchley Park in the U.K. In 1949, the world's first practical stored-program computer, the EDSAC, entered history. Unlike earlier computers, which were built to perform one specific task, the EDSAC could do multiple tasks. In the early 1950s, engineers at the Massachusetts Institute of Technology (MIT) completed Whirlwind I, designed to train pilots. Project Whirlwind introduced magnetic core memory to the world.

The first commercial computer was 1951's UNIVAC (Universal Automatic Computer), built for the U.S. Census Bureau by the makers of ENIAC. It was huge, weighed 16,000 pounds (7,258 kilograms) and had 5,000 vacuum tubes. It rose to fame in 1952 when it correctly predicted Dwight D. Eisenhower's landslide presidential victory after only a small percentage of the votes had been counted. UNIVAC could do 1,000 calculations per second, an amazing feat at the time.

In 1956, IBM's 305 RAMAC (Random Access Method of Accounting and Control) became the first computer to use a hard disk drive. Piece by piece, the modern electronic computer was starting to come together.

In 1968, Douglas Engelbart demonstrated a prototype of the modern computer that included a mouse and a graphical user interface (windows, icons and menus). The demonstration showed that the computer could benefit not just academics and technical experts but the lay public as well.

Bill Hewlett and Dave Packard, two friends who met on a camping trip, began working in a garage in Palo Alto, California. Their first product was an oscillator for testing audio equipment. Hewlett-Packard's HP 9100A scientific calculator, released in 1968, used the phrase "personal computer" in its advertising. The HP-85, released in 1980, was the company's first true PC.

Steve Wozniak and Steve Jobs both had experience at Hewlett-Packard. While still in school, Jobs landed an internship by cold-calling Bill Hewlett. Wozniak not only worked for HP but also offered the design for the Apple I personal computer to the company five times, and was turned down each time.

Eventually, the two Steves left HP to start their own company in a garage, just as Hewlett and Packard had. The Apple I launched in 1976, followed by the Apple II in 1977. The Apple I was the first "fully assembled" personal computer, though buyers still needed a case, power supply, keyboard, and display to go along with the fully assembled circuit board. The Apple II included a case with a keyboard, plus more RAM and color graphics.

IBM's 5150 Personal Computer, released in 1981, put computers on the desktops of businesses around the world and came with a system unit, a keyboard and color/graphics capability. It used the MS-DOS operating system from Microsoft. Throughout the 1980s, computers got less expensive and included more features until they became indispensable to almost every home and business.

The Modern History of Computing

Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term computing machine, used increasingly from the 1920s, refers to any machine that does the work of a human computer, i.e., any machine that calculates in accordance with effective methods. During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’, initially usually with the prefix ‘electronic’ or ‘digital’. This entry surveys the history of these machines.

Charles Babbage was Lucasian Professor of Mathematics at Cambridge University from 1828 to 1839 (a post formerly held by Isaac Newton). Babbage's proposed Difference Engine was a special-purpose digital computing machine for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). The Difference Engine consisted entirely of mechanical components — brass gear wheels, rods, ratchets, pinions, etc. Numbers were represented in the decimal system by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model in 1822. He never completed the full-scale machine that he had designed but did complete several fragments. The largest — one ninth of the complete calculator — is on display in the London Science Museum. Babbage used it to perform serious computational work, calculating various mathematical tables. In 1991, Babbage's Difference Engine No. 2 was finally built from Babbage's designs and is also on display at the London Science Museum.

The Swedes Georg and Edvard Scheutz (father and son) constructed a modified version of Babbage's Difference Engine. Three were made, a prototype and two commercial models, one of these being sold to an observatory in Albany, New York, and the other to the Registrar-General's office in London, where it calculated and printed actuarial tables.

Babbage's proposed Analytical Engine, considerably more ambitious than the Difference Engine, was to have been a general-purpose mechanical digital computer. The Analytical Engine was to have had a memory store and a central processing unit (or ‘mill’) and would have been able to select from among alternative actions consequent upon the outcome of its previous actions (a facility nowadays known as conditional branching). The behaviour of the Analytical Engine would have been controlled by a program of instructions contained on punched cards connected together with ribbons (an idea that Babbage had adopted from the Jacquard weaving loom). Babbage emphasised the generality of the Analytical Engine, saying ‘the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine’ (Babbage [1994], p. 97).

Babbage worked closely with Ada Lovelace, daughter of the poet Byron, after whom the modern programming language ADA is named. Lovelace foresaw the possibility of using the Analytical Engine for non-numeric computation, suggesting that the Engine might even be capable of composing elaborate pieces of music.

A large model of the Analytical Engine was under construction at the time of Babbage's death in 1871 but a full-scale version was never built. Babbage's idea of a general-purpose calculating engine was never forgotten, especially at Cambridge, and was on occasion a lively topic of mealtime discussion at the war-time headquarters of the Government Code and Cypher School, Bletchley Park, Buckinghamshire, birthplace of the electronic digital computer.

Analog computers

The earliest computing machines in wide use were not digital but analog. In analog representation, properties of the representational medium ape (or reflect or model) properties of the represented state-of-affairs. (In obvious contrast, the strings of binary digits employed in digital representation do not represent by means of possessing some physical property — such as length — whose magnitude varies in proportion to the magnitude of the property that is being represented.) Analog representations form a diverse class. Some examples: the longer a line on a road map, the longer the road that the line represents; the greater the number of clear plastic squares in an architect's model, the greater the number of windows in the building represented; the higher the pitch of an acoustic depth meter, the shallower the water. In analog computers, numerical quantities are represented by, for example, the angle of rotation of a shaft or a difference in electrical potential. Thus the output voltage of the machine at a time might represent the momentary speed of the object being modelled.

As the case of the architect's model makes plain, analog representation may be discrete in nature (there is no such thing as a fractional number of windows). Among computer scientists, the term ‘analog’ is sometimes used narrowly, to indicate representation of one continuously-valued quantity by another (e.g., speed by voltage). As Brian Cantwell Smith has remarked:

‘Analog’ should … be a predicate on a representation whose structure corresponds to that of which it represents … That continuous representations should historically have come to be called analog presumably betrays the recognition that, at the levels at which it matters to us, the world is more foundationally continuous than it is discrete. (Smith [1991], p. 271)

James Thomson, brother of Lord Kelvin, invented the mechanical wheel-and-disc integrator that became the foundation of analog computation (Thomson [1876]). The two brothers constructed a device for computing the integral of the product of two given functions, and Kelvin described (although did not construct) general-purpose analog machines for integrating linear differential equations of any order and for solving simultaneous linear equations. Kelvin's most successful analog computer was his tide predicting machine, which remained in use at the port of Liverpool until the 1960s. Mechanical analog devices based on the wheel-and-disc integrator were in use during World War I for gunnery calculations. Following the war, the design of the integrator was considerably improved by Hannibal Ford (Ford [1919]).

Stanley Fifer reports that the first semi-automatic mechanical analog computer was built in England by the Manchester firm of Metropolitan Vickers prior to 1930 (Fifer [1961], p. 29); however, I have so far been unable to verify this claim. In 1931, Vannevar Bush, working at MIT, built the differential analyser, the first large-scale automatic general-purpose mechanical analog computer. Bush's design was based on the wheel and disc integrator. Soon copies of his machine were in use around the world (including, at Cambridge and Manchester Universities in England, differential analysers built out of kit-set Meccano, the once popular engineering toy).

It required a skilled mechanic equipped with a lead hammer to set up Bush's mechanical differential analyser for each new job. Subsequently, Bush and his colleagues replaced the wheel-and-disc integrators and other mechanical components by electromechanical, and finally by electronic, devices.

A differential analyser may be conceptualised as a collection of ‘black boxes’ connected together in such a way as to allow considerable feedback. Each box performs a fundamental process, for example addition, multiplication of a variable by a constant, and integration. In setting up the machine for a given task, boxes are connected together so that the desired set of fundamental processes is executed. In the case of electrical machines, this was done typically by plugging wires into sockets on a patch panel (computing machines whose function is determined in this way are referred to as ‘program-controlled’).
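As a rough illustration of this black-box picture, the following Python sketch connects a constant-multiplier box and an integrator box in a feedback loop to solve dy/dt = -k*y. The constant, step size and initial value are arbitrary, and the discrete stepping is only a stand-in for the continuous motion of a wheel-and-disc integrator.

    # Two "boxes" wired in a feedback loop: a multiplier (gain -k) and an
    # integrator. Together they solve dy/dt = -k*y, the kind of ordinary
    # differential equation a differential analyser was set up to handle.
    def multiply_by_constant(k, x):
        return k * x

    def integrate(derivative, y0, dt, t_end):
        y, t = y0, 0.0
        while t < t_end:
            y += derivative(y) * dt   # the integrator box accumulates its input
            t += dt
        return y

    # Feed the integrator's output back through the multiplier box.
    k = 0.5
    result = integrate(lambda y: multiply_by_constant(-k, y), y0=1.0, dt=0.001, t_end=5.0)
    print(round(result, 4))   # about 0.082, close to exp(-2.5)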

Since all the boxes work in parallel, an electronic differential analyser solves sets of equations very quickly. Against this has to be set the cost of massaging the problem to be solved into the form demanded by the analog machine, and of setting up the hardware to perform the desired computation. A major drawback of analog computation is the higher cost, relative to digital machines, of an increase in precision. During the 1960s and 1970s, there was considerable interest in ‘hybrid’ machines, where an analog section is controlled by and programmed via a digital section. However, such machines are now a rarity.

The Universal Turing Machine

In 1936, at Cambridge University, Turing invented the principle of the modern computer. He described an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols (Turing [1936]). The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols. This is Turing's stored-program concept, and implicit in it is the possibility of the machine operating on and modifying its own program. (In London in 1947, in the course of what was, so far as is known, the earliest public lecture to mention computer intelligence, Turing said, ‘What we want is a machine that can learn from experience’, adding that the ‘possibility of letting the machine alter its own instructions provides the mechanism for this’ (Turing [1947], p. 393).) Turing's computing machine of 1936 is now known simply as the universal Turing machine. Cambridge mathematician Max Newman remarked that right from the start Turing was interested in the possibility of actually building a computing machine of the sort that he had described (Newman in interview with Christopher Evans in Evans [197?]).
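The following Python sketch is a minimal rendering of the machine Turing described: a tape serving as the memory, a scanner that reads and writes one symbol at a time, and a table of instructions held as symbols. The tiny program, which appends a 1 to a block of 1s, is an invented example rather than one of Turing's own.

    # A bare-bones Turing-machine simulator: the "program" is a table mapping
    # (state, scanned symbol) to (symbol to write, move, next state).
    def run_turing_machine(program, tape, state="start", blank="_", max_steps=10000):
        cells = dict(enumerate(tape))          # sparse tape: position -> symbol
        pos = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(pos, blank)
            write, move, state = program[(state, symbol)]
            cells[pos] = write                 # write, then move the scanner
            pos += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Scan right past the 1s, write one more 1, then halt.
    program = {
        ("start", "1"): ("1", "R", "start"),
        ("start", "_"): ("1", "R", "halt"),
    }
    print(run_turing_machine(program, "111"))  # -> 1111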

From the start of the Second World War Turing was a leading cryptanalyst at the Government Code and Cypher School, Bletchley Park. Here he became familiar with Thomas Flowers' work involving large-scale high-speed electronic switching (described below). However, Turing could not turn to the project of building an electronic stored-program computing machine until the cessation of hostilities in Europe in 1945.

During the wartime years Turing did give considerable thought to the question of machine intelligence. Colleagues at Bletchley Park recall numerous off-duty discussions with him on the topic, and at one point Turing circulated a typewritten report (now lost) setting out some of his ideas. One of these colleagues, Donald Michie (who later founded the Department of Machine Intelligence and Perception at the University of Edinburgh), remembers Turing talking often about the possibility of computing machines (1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles (Michie in interview with Copeland, 1995). The modern term for the latter idea is ‘heuristic search’, a heuristic being any rule-of-thumb principle that cuts down the amount of searching required in order to find a solution to a problem. At Bletchley Park Turing illustrated his ideas on machine intelligence by reference to chess. Michie recalls Turing experimenting with heuristics that later became common in chess programming (in particular minimax and best-first).

Further information about Turing and the computer, including his wartime work on codebreaking and his thinking about artificial intelligence and artificial life, can be found in Copeland 2004.

With some exceptions — including Babbage's purely mechanical engines, and the finger-powered National Accounting Machine — early digital computing machines were electromechanical. That is to say, their basic components were small, electrically-driven, mechanical switches called ‘relays’. These operate relatively slowly, whereas the basic components of an electronic computer — originally vacuum tubes (valves) — have no moving parts save electrons and so operate extremely fast. Electromechanical digital computing machines were built before and during the second world war by (among others) Howard Aiken at Harvard University, George Stibitz at Bell Telephone Laboratories, Turing at Princeton University and Bletchley Park, and Konrad Zuse in Berlin. To Zuse belongs the honour of having built the first working general-purpose program-controlled digital computer. This machine, later called the Z3, was functioning in 1941. (A program-controlled computer, as opposed to a stored-program computer, is set up for a new task by re-routing wires, by means of plugs etc.)

Relays were too slow and unreliable a medium for large-scale general-purpose digital computation (although Aiken made a valiant effort). It was the development of high-speed digital techniques using vacuum tubes that made the modern computer possible.

The earliest extensive use of vacuum tubes for digital data-processing appears to have been by the engineer Thomas Flowers, working in London at the British Post Office Research Station at Dollis Hill. Electronic equipment designed by Flowers in 1934, for controlling the connections between telephone exchanges, went into operation in 1939, and involved between three and four thousand vacuum tubes running continuously. In 1938–1939 Flowers worked on an experimental electronic digital data-processing system, involving a high-speed data store. Flowers' aim, achieved after the war, was that electronic equipment should replace existing, less reliable, systems built from relays and used in telephone exchanges. Flowers did not investigate the idea of using electronic equipment for numerical calculation, but has remarked that at the outbreak of war with Germany in 1939 he was possibly the only person in Britain who realized that vacuum tubes could be used on a large scale for high-speed digital computation. (See Copeland 2006 for more information on Flowers' work.)

The earliest comparable use of vacuum tubes in the U.S. seems to have been by John Atanasoff at what was then Iowa State College (now University). During the period 1937–1942 Atanasoff developed techniques for using vacuum tubes to perform numerical calculations digitally. In 1939, with the assistance of his student Clifford Berry, Atanasoff began building what is sometimes called the Atanasoff-Berry Computer, or ABC, a small-scale special-purpose electronic digital machine for the solution of systems of linear algebraic equations. The machine contained approximately 300 vacuum tubes. Although the electronic part of the machine functioned successfully, the computer as a whole never worked reliably, errors being introduced by the unsatisfactory binary card-reader. Work was discontinued in 1942 when Atanasoff left Iowa State.

The first fully functioning electronic digital computer was Colossus, used by the Bletchley Park cryptanalysts from February 1944.

From very early in the war the Government Code and Cypher School (GC&CS) was successfully deciphering German radio communications encoded by means of the Enigma system, and by early 1942 about 39,000 intercepted messages were being decoded each month, thanks to electromechanical machines known as ‘bombes’. These were designed by Turing and Gordon Welchman (building on earlier work by Polish cryptanalysts).

During the second half of 1941, messages encoded by means of a totally different method began to be intercepted. This new cipher machine, code-named ‘Tunny’ by Bletchley Park, was broken in April 1942 and current traffic was read for the first time in July of that year. Based on binary teleprinter code, Tunny was used in preference to Morse-based Enigma for the encryption of high-level signals, for example messages from Hitler and members of the German High Command.

The need to decipher this vital intelligence as rapidly as possible led Max Newman to propose in November 1942 (shortly after his recruitment to GC&CS from Cambridge University) that key parts of the decryption process be automated, by means of high-speed electronic counting devices. The first machine designed and built to Newman's specification, known as the Heath Robinson, was relay-based with electronic circuits for counting. (The electronic counters were designed by C.E. Wynn-Williams, who had been using thyratron tubes in counting circuits at the Cavendish Laboratory, Cambridge, since 1932 [Wynn-Williams 1932].) Installed in June 1943, Heath Robinson was unreliable and slow, and its high-speed paper tapes were continually breaking, but it proved the worth of Newman's idea. Flowers recommended that an all-electronic machine be built instead, but he received no official encouragement from GC&CS. Working independently at the Post Office Research Station at Dollis Hill, Flowers quietly got on with constructing the world's first large-scale programmable electronic digital computer. Colossus I was delivered to Bletchley Park in January 1944.

By the end of the war there were ten Colossi working round the clock at Bletchley Park. From a cryptanalytic viewpoint, a major difference between the prototype Colossus I and the later machines was the addition of the so-called Special Attachment, following a key discovery by cryptanalysts Donald Michie and Jack Good. This broadened the function of Colossus from ‘wheel setting’ — i.e., determining the settings of the encoding wheels of the Tunny machine for a particular message, given the ‘patterns’ of the wheels — to ‘wheel breaking’, i.e., determining the wheel patterns themselves. The wheel patterns were eventually changed daily by the Germans on each of the numerous links between the German Army High Command and Army Group commanders in the field. By 1945 there were as many as 30 links in total. About ten of these were broken and read regularly.

Colossus I contained approximately 1600 vacuum tubes and each of the subsequent machines approximately 2400 vacuum tubes. Like the smaller ABC, Colossus lacked two important features of modern computers. First, it had no internally stored programs. To set it up for a new task, the operator had to alter the machine's physical wiring, using plugs and switches. Second, Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations.

F.H. Hinsley, official historian of GC&CS, has estimated that the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park, in which Colossus played a major role. Most of the Colossi were destroyed once hostilities ceased. Some of the electronic panels ended up at Newman's Computing Machine Laboratory in Manchester (see below), all trace of their original use having been removed. Two Colossi were retained by GC&CS (renamed GCHQ following the end of the war). The last Colossus is believed to have stopped running in 1960.

Those who knew of Colossus were prohibited by the Official Secrets Act from sharing their knowledge. Until the 1970s, few had any idea that electronic computation had been used successfully during the second world war. In 1970 and 1975, respectively, Good and Michie published notes giving the barest outlines of Colossus. By 1983, Flowers had received clearance from the British Government to publish a partial account of the hardware of Colossus I. Details of the later machines and of the Special Attachment, the uses to which the Colossi were put, and the cryptanalytic algorithms that they ran, have only recently been declassified. (For the full account of Colossus and the attack on Tunny see Copeland 2006.)

To those acquainted with the universal Turing machine of 1936, and the associated stored-program concept, Flowers' racks of digital electronic equipment were proof of the feasibility of using large numbers of vacuum tubes to implement a high-speed general-purpose stored-program computer. The war over, Newman lost no time in establishing the Royal Society Computing Machine Laboratory at Manchester University for precisely that purpose. A few months after his arrival at Manchester, Newman wrote as follows to the Princeton mathematician John von Neumann (February 1946):

I am … hoping to embark on a computing machine section here, having got very interested in electronic devices of this kind during the last two or three years. By about eighteen months ago I had decided to try my hand at starting up a machine unit when I got out. … I am of course in close touch with Turing.

Turing and Newman were thinking along similar lines. In 1945 Turing joined the National Physical Laboratory (NPL) in London, his brief to design and develop an electronic stored-program digital computer for scientific work. (Artificial Intelligence was not far from Turing's thoughts: he described himself as ‘building a brain’ and remarked in a letter that he was ‘more interested in the possibility of producing models of the action of the brain than in the practical applications to computing’.) John Womersley, Turing's immediate superior at NPL, christened Turing's proposed machine the Automatic Computing Engine, or ACE, in homage to Babbage's Difference Engine and Analytical Engine.

Turing's 1945 report ‘Proposed Electronic Calculator’ gave the first relatively complete specification of an electronic stored-program general-purpose digital computer. The report is reprinted in full in Copeland 2005.

The first electronic stored-program digital computer to be proposed in the U.S. was the EDVAC (see below). The ‘First Draft of a Report on the EDVAC’ (May 1945), composed by von Neumann, contained little engineering detail, in particular concerning electronic hardware (owing to restrictions in the U.S.). Turing's ‘Proposed Electronic Calculator’, on the other hand, supplied detailed circuit designs and specifications of hardware units, specimen programs in machine code, and even an estimate of the cost of building the machine (£11,200). ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralised structure.

Turing saw that speed and memory were the keys to computing. Turing's colleague at NPL, Jim Wilkinson, observed that Turing ‘was obsessed with the idea of speed on the machine’ [Copeland 2005, p. 2]. Turing's design had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer (enormous by the standards of his day). Had Turing's ACE been built as planned it would have been in a different league from the other early computers. However, progress on Turing's Automatic Computing Engine ran slowly, due to organisational difficulties at NPL, and in 1948 a ‘very fed up’ Turing (Robin Gandy's description, in interview with Copeland, 1995) left NPL for Newman's Computing Machine Laboratory at Manchester University. It was not until May 1950 that a small pilot model of the Automatic Computing Engine, built by Wilkinson, Edward Newman, Mike Woodger, and others, first executed a program. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.

Sales of DEUCE, the production version of the Pilot Model ACE, were buoyant — confounding the suggestion, made in 1946 by the Director of the NPL, Sir Charles Darwin, that ‘it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country’ [Copeland 2005, p. 4]. The fundamentals of Turing's ACE design were employed by Harry Huskey (at Wayne State University, Detroit) in the Bendix G15 computer (Huskey in interview with Copeland, 1998). The G15 was arguably the first personal computer; over 400 were sold worldwide. DEUCE and the G15 remained in use until about 1970. Another computer deriving from Turing's ACE design, the MOSAIC, played a role in Britain's air defences during the Cold War period; other derivatives include the Packard-Bell PB250 (1961). (More information about these early computers is given in [Copeland 2005].)

The earliest general-purpose stored-program electronic digital computer to work was built in Newman's Computing Machine Laboratory at Manchester University. The Manchester ‘Baby’, as it became known, was constructed by the engineers F.C. Williams and Tom Kilburn, and performed its first calculation on 21 June 1948. The tiny program, stored on the face of a cathode ray tube, was just seventeen instructions long. A much enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at Manchester University in February 1951; in all about ten were sold, in Britain, Canada, Holland and Italy.

The fundamental logico-mathematical contributions by Turing and Newman to the triumph at Manchester have been neglected, and the Manchester machine is nowadays remembered as the work of Williams and Kilburn. Indeed, Newman's role in the development of computers has never been sufficiently emphasised (due perhaps to his thoroughly self-effacing way of relating the relevant events).

It was Newman who, in a lecture in Cambridge in 1935, introduced Turing to the concept that led directly to the Turing machine: Newman defined a constructive process as one that a machine can carry out (Newman in interview with Evans, op. cit.). As a result of his knowledge of Turing's work, Newman became interested in the possibilities of computing machinery in, as he put it, ‘a rather theoretical way’. It was not until Newman joined GC&CS in 1942 that his interest in computing machinery suddenly became practical, with his realisation that the attack on Tunny could be mechanised. During the building of Colossus, Newman tried to interest Flowers in Turing's 1936 paper — birthplace of the stored-program concept — but Flowers did not make much of Turing's arcane notation. There is no doubt that by 1943, Newman had firmly in mind the idea of using electronic technology in order to construct a stored-program general-purpose digital computing machine.

In July of 1946 (the month in which the Royal Society approved Newman's application for funds to found the Computing Machine Laboratory), Freddie Williams, working at the Telecommunications Research Establishment, Malvern, began the series of experiments on cathode ray tube storage that was to lead to the Williams tube memory. Williams, until then a radar engineer, explains how it was that he came to be working on the problem of computer memory:

[O]nce [the German Armies] collapsed … nobody was going to care a toss about radar, and people like me … were going to be in the soup unless we found something else to do. And computers were in the air. Knowing absolutely nothing about them I latched onto the problem of storage and tackled that. (Quoted in Bennett 1976.)

Newman learned of Williams' work, and with the able help of Patrick Blackett, Langworthy Professor of Physics at Manchester and one of the most powerful figures in the University, was instrumental in the appointment of the 35-year-old Williams to the recently vacated Chair of Electro-Technics at Manchester. (Both were members of the appointing committee (Kilburn in interview with Copeland, 1997).) Williams immediately had Kilburn, his assistant at Malvern, seconded to Manchester. To take up the story in Williams' own words:

[N]either Tom Kilburn nor I knew the first thing about computers when we arrived in Manchester University. We'd had enough explained to us to understand what the problem of storage was and what we wanted to store, and that we'd achieved, so the point now had been reached when we'd got to find out about computers … Newman explained the whole business of how a computer works to us. (F.C. Williams in interview with Evans [1976])

Elsewhere Williams is explicit concerning Turing's role and gives something of the flavour of the explanation that he and Kilburn received:

Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing … knew a lot about computers and substantially nothing about electronics. They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation. (Williams [1975], p. 328)

It seems that Newman must have used much the same words with Williams and Kilburn as he did in an address to the Royal Society on 4th March 1948:

Professor Hartree … has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing … [T]he machines now being made in America and in this country … [are] in certain general respects … all similar. There is provision for storing numbers, say in the scale of 2, so that each number appears as a row of, say, forty 0's and 1's in certain places or "houses" in the machine. … Certain of these numbers, or "words" are read, one after another, as orders. In one possible type of machine an order consists of four numbers, for example 11, 13, 27, 4. The number 4 signifies "add", and when control shifts to this word the "houses" H11 and H13 will be connected to the adder as inputs, and H27 as output. The numbers stored in H11 and H13 pass through the adder, are added, and the sum is passed on to H27. The control then shifts to the next order. In most real machines the process just described would be done by three separate orders, the first bringing [H11] (=content of H11) to a central accumulator, the second adding [H13] into the accumulator, and the third sending the result to H27; thus only one address would be required in each order. … A machine with storage, with this automatic-telephone-exchange arrangement and with the necessary adders, subtractors and so on, is, in a sense, already a universal machine. (Newman [1948], pp. 271–272)

Following this explanation of Turing's three-address concept (source 1, source 2, destination, function) Newman went on to describe program storage (‘the orders shall be in a series of houses X1, X2, …’) and conditional branching. He then summed up:

From this highly simplified account it emerges that the essential internal parts of the machine are, first, a storage for numbers (which may also be orders). … Secondly, adders, multipliers, etc. Thirdly, an "automatic telephone exchange" for selecting "houses", connecting them to the arithmetic organ, and writing the answers in other prescribed houses. Finally, means of moving control at any stage to any chosen order, if a certain condition is satisfied, otherwise passing to the next order in the normal sequence. Besides these there must be ways of setting up the machine at the outset, and extracting the final answer in useable form. (Newman [1948], pp. 273–4)
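A toy Python rendering of the order format Newman describes may make this concrete: an order (source 1, source 2, destination, function) such as (11, 13, 27, 4) sends the contents of houses H11 and H13 through the adder and writes the sum into H27. The sample contents of the houses are invented for the purpose of illustration.

    # One order = (source house 1, source house 2, destination house, function).
    ADD = 4   # Newman's example: "The number 4 signifies 'add'"

    def run(orders, houses):
        for src1, src2, dest, func in orders:
            if func == ADD:
                houses[dest] = houses[src1] + houses[src2]   # H11 + H13 -> H27
        return houses

    houses = {11: 5, 13: 7, 27: 0}
    print(run([(11, 13, 27, ADD)], houses))   # {11: 5, 13: 7, 27: 12}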

In a letter written in 1972 Williams described in some detail what he and Kilburn were told by Newman:

About the middle of the year [1946] the possibility of an appointment at Manchester University arose and I had a talk with Professor Newman who was already interested in the possibility of developing computers and had acquired a grant from the Royal Society of £30,000 for this purpose. Since he understood computers and I understood electronics the possibilities of fruitful collaboration were obvious. I remember Newman giving us a few lectures in which he outlined the organisation of a computer in terms of numbers being identified by the address of the house in which they were placed and in terms of numbers being transferred from this address, one at a time, to an accumulator where each entering number was added to what was already there. At any time the number in the accumulator could be transferred back to an assigned address in the store and the accumulator cleared for further use. The transfers were to be effected by a stored program in which a list of instructions was obeyed sequentially. Ordered progress through the list could be interrupted by a test instruction which examined the sign of the number in the accumulator. Thereafter operation started from a new point in the list of instructions. This was the first information I received about the organisation of computers. … Our first computer was the simplest embodiment of these principles, with the sole difference that it used a subtracting rather than an adding accumulator. (Letter from Williams to Randell, 1972; in Randell [1972], p. 9)
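The one-address, accumulator-based organisation Williams recalls can likewise be sketched in a few lines of Python: numbers live in addressed houses, pass one at a time through an accumulator, and a test instruction redirects control according to the sign of the accumulator. The instruction names and the little program are invented; the real Manchester machine used a subtracting accumulator and binary words.

    # A toy one-address machine: each instruction names a single house (or a
    # jump target), and all arithmetic passes through the accumulator.
    def run(program, store):
        acc, pc = 0, 0
        while pc < len(program):
            op, arg = program[pc]
            pc += 1
            if op == "LOAD":
                acc = store[arg]
            elif op == "ADD":
                acc += store[arg]
            elif op == "STORE":
                store[arg] = acc
            elif op == "JUMP_IF_NEG" and acc < 0:
                pc = arg                       # the "test instruction"
            elif op == "HALT":
                break
        return store

    # Add the numbers in houses 0 and 1, leaving the sum in house 2.
    store = {0: 20, 1: 22, 2: 0}
    program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)]
    print(run(program, store))   # {0: 20, 1: 22, 2: 42}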

Turing's early input to the developments at Manchester, hinted at by Williams in his above-quoted reference to Turing, may have been via the lectures on computer design that Turing and Wilkinson gave in London during the period December 1946 to February 1947 (Turing and Wilkinson [1946–7]). The lectures were attended by representatives of various organisations planning to use or build an electronic computer. Kilburn was in the audience (Bowker and Giordano [1993]). (Kilburn usually said, when asked from where he obtained his basic knowledge of the computer, that he could not remember (letter from Brian Napper to Copeland, 2002); for example, in a 1992 interview he said: ‘Between early 1945 and early 1947, in that period, somehow or other I knew what a digital computer was … Where I got this knowledge from I've no idea’ (Bowker and Giordano [1993], p. 19).)

Whatever role Turing's lectures may have played in informing Kilburn, there is little doubt that credit for the Manchester computer — called the ‘Newman-Williams machine’ in a contemporary document (Huskey 1947) — belongs not only to Williams and Kilburn but also to Newman, and that the influence on Newman of Turing's 1936 paper was crucial, as was the influence of Flowers' Colossus.

The first working AI program, a draughts (checkers) player written by Christopher Strachey, ran on the Ferranti Mark I in the Manchester Computing Machine Laboratory. Strachey (at the time a teacher at Harrow School and an amateur programmer) wrote the program with Turing's encouragement and utilising the latter's recently completed Programmers' Handbook for the Ferranti. (Strachey later became Director of the Programming Research Group at Oxford University.) By the summer of 1952, the program could, Strachey reported, ‘play a complete game of draughts at a reasonable speed’. (Strachey's program formed the basis for Arthur Samuel's well-known checkers program.) The first chess-playing program, also, was written for the Manchester Ferranti, by Dietrich Prinz; the program first ran in November 1951. Designed for solving simple problems of the mate-in-two variety, the program would examine every possible move until a solution was found. Turing started to program his ‘Turochamp’ chess-player on the Ferranti Mark I, but never completed the task. Unlike Prinz's program, the Turochamp could play a complete game (when hand-simulated) and operated not by exhaustive search but under the guidance of heuristics.

The first fully functioning electronic digital computer to be built in the U.S. was ENIAC, constructed at the Moore School of Electrical Engineering, University of Pennsylvania, for the Army Ordnance Department, by J. Presper Eckert and John Mauchly. Completed in 1945, ENIAC was somewhat similar to the earlier Colossus, but considerably larger and more flexible (although far from general-purpose). The primary function for which ENIAC was designed was the calculation of tables used in aiming artillery. ENIAC was not a stored-program computer, and setting it up for a new job involved reconfiguring the machine by means of plugs and switches. For many years, ENIAC was believed to have been the first functioning electronic digital computer, Colossus being unknown to all but a few.

In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neumann [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.

Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.

The Los Alamos physicist Stanley Frankel, responsible with von Neumann and others for mechanising the large-scale calculations involved in the design of the atomic bomb, has described von Neumann's view of the importance of Turing's 1936 paper, in a letter:

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 … Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage … Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities. (Quoted in Randell [1972], p. 10)

Other notable early stored-program electronic digital computers were:

  • EDSAC, 1949, built at Cambridge University by Maurice Wilkes
  • BINAC, 1949, built by Eckert's and Mauchly's Electronic Control Co., Philadelphia (opinions differ over whether BINAC ever actually worked)
  • Whirlwind I, 1949, Digital Computer Laboratory, Massachusetts Institute of Technology, Jay Forrester
  • SEAC, 1950, US Bureau of Standards Eastern Division, Washington D.C., Samuel Alexander, Ralph Slutz
  • SWAC, 1950, US Bureau of Standards Western Division, Institute for Numerical Analysis, University of California at Los Angeles, Harry Huskey
  • UNIVAC, 1951, Eckert-Mauchly Computer Corporation, Philadelphia (the first computer to be available commercially in the U.S.)
  • the IAS computer, 1952, Institute for Advanced Study, Princeton University, Julian Bigelow, Arthur Burks, Herman Goldstine, von Neumann, and others (thanks to von Neumann's publishing the specifications of the IAS machine, it became the model for a group of computers known as the Princeton Class machines; the IAS computer was also a strong influence on the IBM 701)
  • IBM 701, 1952, International Business Machines' first mass-produced electronic stored-program computer.

The EDVAC and ACE proposals both advocated the use of mercury-filled tubes, called ‘delay lines’, for high-speed internal memory. This form of memory is known as acoustic memory. Delay lines had initially been developed for echo cancellation in radar; the idea of using them as memory devices originated with Eckert at the Moore School. Here is Turing's description:

It is proposed to build "delay line" units consisting of mercury … tubes about 5′ long and 1″ in diameter in contact with a quartz crystal at each end. The velocity of sound in … mercury … is such that the delay will be 1.024 ms. The information to be stored may be considered to be a sequence of 1024 ‘digits’ (0 or 1) … These digits will be represented by a corresponding sequence of pulses. The digit 0 … will be represented by the absence of a pulse at the appropriate time, the digit 1 … by its presence. This series of pulses is impressed on the end of the line by one piezo-crystal, it is transmitted down the line in the form of supersonic waves, and is reconverted into a varying voltage by the crystal at the far end. This voltage is amplified sufficiently to give an output of the order of 10 volts peak to peak and is used to gate a standard pulse generated by the clock. This pulse may be again fed into the line by means of the transmitting crystal, or we may feed in some altogether different signal. We also have the possibility of leading the gated pulse to some other part of the calculator, if we have need of that information at the time. Making use of the information does not of course preclude keeping it also. (Turing [1945], p. 375)

Mercury delay line memory was used in EDSAC, BINAC, SEAC, Pilot Model ACE, EDVAC, DEUCE, and full-scale ACE (1958). The chief advantage of the delay line as a memory medium was, as Turing put it, that delay lines were "already a going concern" (Turing [1947], p. 380). The fundamental disadvantages of the delay line were that random access is impossible and, moreover, the time taken for an instruction, or number, to emerge from a delay line depends on where in the line it happens to be.

In order to minimize waiting-time, Turing arranged for instructions to be stored not in consecutive positions in the delay line, but in relative positions selected by the programmer in such a way that each instruction would emerge at exactly the time it was required, in so far as this was possible. Each instruction contained a specification of the location of the next. This system subsequently became known as ‘optimum coding’. It was an integral feature of every version of the ACE design. Optimum coding made for difficult and untidy programming, but the advantage in terms of speed was considerable. Thanks to optimum coding, the Pilot Model ACE was able to do a floating point multiplication in 3 milliseconds (Wilkes's EDSAC required 4.5 milliseconds to perform a single fixed point multiplication).
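
The scheduling idea can be illustrated with a small, purely hypothetical helper (the slot arithmetic below illustrates the principle only, not the actual ACE or DEUCE coding rules): given how many clock periods the current instruction needs, place its successor at the memory position that will be emerging just as the current instruction finishes.

```python
def optimum_slot(free_slots, current_slot, exec_periods, line_length):
    """Pick a slot for the next instruction so that it emerges from the
    circulating memory just as the current one finishes (a simplified
    illustration of 'optimum coding', not the real ACE scheme)."""
    ideal = (current_slot + exec_periods) % line_length
    for extra_wait in range(line_length):
        slot = (ideal + extra_wait) % line_length
        if slot in free_slots:
            return slot, extra_wait   # extra_wait = clock periods wasted
    raise MemoryError("no free slot in the delay line")

# A 32-word line: the current instruction sits in slot 5 and takes 7 periods,
# so the ideal home for its successor is slot 12; if 12 is taken, slot 13 costs
# only one extra period, whereas consecutive placement (slot 6) would cost
# nearly a full circulation.
free = set(range(32)) - {5, 12}
print(optimum_slot(free, current_slot=5, exec_periods=7, line_length=32))  # (13, 1)
```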

In the Williams tube or electrostatic memory, previously mentioned, a two-dimensional rectangular array of binary digits was stored on the face of a commercially-available cathode ray tube. Access to data was immediate. Williams tube memories were employed in the Manchester series of machines, SWAC, the IAS computer, and the IBM 701, and a modified form of Williams tube in Whirlwind I (until replacement by magnetic core in 1953).

Drum memories, in which data was stored magnetically on the surface of a metal cylinder, were developed on both sides of the Atlantic. The initial idea appears to have been Eckert's. The drum provided reasonably large quantities of medium-speed memory and was used to supplement a high-speed acoustic or electrostatic memory. In 1949, the Manchester computer was successfully equipped with a drum memory; this was constructed by the Manchester engineers on the model of a drum developed by Andrew Booth at Birkbeck College, London.

The final major event in the early history of electronic computation was the development of magnetic core memory. Jay Forrester realised that the hysteresis properties of magnetic core (normally used in transformers) lent themselves to the implementation of a three-dimensional solid array of randomly accessible storage points. In 1949, at the Massachusetts Institute of Technology, he began to investigate this idea empirically. Forrester's early experiments with metallic core soon led him to develop the superior ferrite core memory. A computer similar to the Whirlwind I was then built at MIT as a test vehicle for a ferrite core memory; this Memory Test Computer was completed in 1953. (The Memory Test Computer was used in 1954 for the first simulations of neural networks, by Belmont Farley and Wesley Clark of MIT's Lincoln Laboratory; see Copeland and Proudfoot [1996].)
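
The attraction of core, as against delay lines, was random access: any bit can be selected directly by its coordinates rather than awaited. A rough sketch of the addressing idea follows (a generic model of a stack of core planes, one bit of each word per plane, not a description of any particular machine).

```python
class CoreStack:
    """Rough model of core memory as a 3-D array of one-bit cores:
    bit-planes x rows x columns, every location directly addressable."""

    def __init__(self, planes=8, rows=16, cols=16):
        self.planes = [[[0] * cols for _ in range(rows)] for _ in range(planes)]

    def write_word(self, row, col, bits):
        # one bit of the word is stored at the same (row, col) of each plane
        for plane, bit in zip(self.planes, bits):
            plane[row][col] = bit

    def read_word(self, row, col):
        # random access: no waiting for data to circulate past a read point
        return [plane[row][col] for plane in self.planes]

mem = CoreStack()
mem.write_word(3, 9, [1, 0, 1, 1, 0, 0, 1, 0])
print(mem.read_word(3, 9))   # -> [1, 0, 1, 1, 0, 0, 1, 0]
```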

Once the absolute reliability, relative cheapness, high capacity and permanent life of ferrite core memory became apparent, core soon replaced other forms of high-speed memory. The IBM 704 and 705 computers (announced in May and October 1954, respectively) brought core memory into wide use.

Works Cited

  • Babbage, C. (ed. by Campbell-Kelly, M.), 1994, Passages from the Life of a Philosopher, New Brunswick: Rutgers University Press.
  • Bennett, S., 1976, ‘F.C. Williams: His Contribution to the Development of Automatic Control’, National Archive for the History of Computing, University of Manchester, England. (This is a typescript based on interviews with Williams in 1976.)
  • Bowker, G., and Giordano, R., 1993, ‘Interview with Tom Kilburn’, Annals of the History of Computing, 15: 17–32.
  • Copeland, B.J. (ed.), 2004, The Essential Turing, Oxford: Oxford University Press.
  • Copeland, B.J. (ed.), 2005, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, Oxford: Oxford University Press.
  • Copeland, B.J., and others, 2006, Colossus: The Secrets of Bletchley Park's Codebreaking Computers, Oxford: Oxford University Press.
  • Copeland, B.J., and Proudfoot, D., 1996, ‘On Alan Turing's Anticipation of Connectionism’, Synthese, 108: 361–377.
  • Evans, C., 197?, interview with M.H.A. Newman, in ‘The Pioneers of Computing: An Oral History of Computing’, London: Science Museum.
  • Fifer, S., 1961, Analog Computation: Theory, Techniques, Applications, New York: McGraw-Hill.
  • Ford, H., 1919, ‘Mechanical Movement’, Official Gazette of the United States Patent Office, October 7, 1919: 48.
  • Goldstine, H., 1972, The Computer from Pascal to von Neumann, Princeton: Princeton University Press.
  • Huskey, H.D., 1947, ‘The State of the Art in Electronic Digital Computing in Britain and the United States’, in Copeland [2005].
  • Newman, M.H.A., 1948, ‘General Principles of the Design of All-Purpose Computing Machines’, Proceedings of the Royal Society of London, Series A, 195: 271–274.
  • Randell, B., 1972, ‘On Alan Turing and the Origins of Digital Computers’, in Meltzer, B., and Michie, D. (eds), Machine Intelligence 7, Edinburgh: Edinburgh University Press.
  • Smith, B.C., 1991, ‘The Owl and the Electric Encyclopaedia’, Artificial Intelligence, 47: 251–288.
  • Thomson, J., 1876, ‘On an Integrating Machine Having a New Kinematic Principle’, Proceedings of the Royal Society of London, 24: 262–265.
  • Turing, A.M., 1936, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, Proceedings of the London Mathematical Society, Series 2, 42 (1936–37): 230–265; reprinted in The Essential Turing (Copeland [2004]).
  • Turing, A.M., 1945, ‘Proposed Electronic Calculator’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • Turing, A.M., 1947, ‘Lecture on the Automatic Computing Engine’, in The Essential Turing (Copeland [2004]).
  • Turing, A.M., and Wilkinson, J.H., 1946–7, ‘The Turing-Wilkinson Lecture Series (1946–7)’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • von Neumann, J., 1945, ‘First Draft of a Report on the EDVAC’, in Stern, N., From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers, Bedford, Mass.: Digital Press (1981), pp. 181–246.
  • Williams, F.C., 1975, ‘Early Computers at Manchester University’, The Radio and Electronic Engineer, 45: 237–331.
  • Wynn-Williams, C.E., 1932, ‘A Thyratron "Scale of Two" Automatic Counter’, Proceedings of the Royal Society of London, Series A, 136: 312–324.

Further Reading

  • Copeland, B.J., 2004, ‘Colossus — Its Origins and Originators’, Annals of the History of Computing, 26: 38–45.
  • Metropolis, N., Howlett, J., and Rota, G.C. (eds), 1980, A History of Computing in the Twentieth Century, New York: Academic Press.
  • Randell, B. (ed.), 1982, The Origins of Digital Computers: Selected Papers, Berlin: Springer-Verlag.
  • Williams, M.R., 1997, A History of Computing Technology, Los Alamitos: IEEE Computer Society Press.

Bell Laboratories scientist George Stibitz uses relays for a demonstration adder


“Model K” Adder

Called the “Model K” Adder because he built it on his “Kitchen” table, this simple demonstration circuit provides proof of concept for applying Boolean logic to the design of computers, resulting in construction of the relay-based Model I Complex Calculator in 1939. That same year in Germany, engineer Konrad Zuse built his Z2 computer, also using telephone company relays.

Hewlett-Packard is founded


Hewlett and Packard in their garage workshop

David Packard and Bill Hewlett found their company in a Palo Alto, California garage. Their first product, the HP 200A Audio Oscillator, rapidly became a popular piece of test equipment for engineers. Walt Disney Pictures ordered eight of the 200B model to test recording equipment and speaker systems for the 12 specially equipped theatres that showed the movie “Fantasia” in 1940.

The Complex Number Calculator (CNC) is completed


Operator at Complex Number Calculator (CNC)

In 1939, Bell Telephone Laboratories completes this calculator, designed by scientist George Stibitz. In 1940, Stibitz demonstrated the CNC at an American Mathematical Society conference held at Dartmouth College. Stibitz stunned the group by performing calculations remotely on the CNC (located in New York City) using a Teletype terminal connected to New York over special telephone lines. This is likely the first example of remote access computing.

Konrad Zuse finishes the Z3 Computer


The Zuse Z3 Computer

The Z3, an early computer built by German engineer Konrad Zuse working in complete isolation from developments elsewhere, uses 2,300 relays, performs floating point binary arithmetic, and has a 22-bit word length. The Z3 was used for aerodynamic calculations but was destroyed in a bombing raid on Berlin in late 1943. Zuse later supervised a reconstruction of the Z3 in the 1960s, which is currently on display at the Deutsches Museum in Munich.

The first Bombe is completed


Bombe replica, Bletchley Park, UK

Built as an electro-mechanical means of decrypting Nazi ENIGMA-based military communications during World War II, the British Bombe is conceived by computer pioneer Alan Turing and Harold Keen of the British Tabulating Machine Company. Hundreds of Allied bombes were built in order to determine the daily rotor start positions of Enigma cipher machines, which in turn allowed the Allies to decrypt German messages. The basic idea for bombes came from Polish code-breaker Marian Rejewski's 1938 "Bomba."

The Atanasoff-Berry Computer (ABC) is completed


The Atanasoff-Berry Computer

After successfully demonstrating a proof-of-concept prototype in 1939, Professor John Vincent Atanasoff receives funds to build a full-scale machine at Iowa State College (now University). The machine was designed and built by Atanasoff and graduate student Clifford Berry between 1939 and 1942. The ABC was at the center of a patent dispute related to the invention of the computer, which was resolved in 1973 when it was shown that ENIAC co-designer John Mauchly had seen the ABC shortly after it became functional.

The legal result was a landmark: Atanasoff was declared the originator of several basic computer ideas, but the computer as a concept was declared un-patentable and thus freely open to all. A full-scale working replica of the ABC was completed in 1997, proving that the ABC machine functioned as Atanasoff had claimed. The replica is currently on display at the Computer History Museum.

Bell Labs Relay Interpolator is completed


George Stibitz circa 1940

The US Army asked Bell Laboratories to design a machine to assist in testing its M-9 gun director, a type of analog computer that aims large guns at their targets. Mathematician George Stibitz recommends using a relay-based calculator for the project. The result was the Relay Interpolator, later called the Bell Labs Model II. The Relay Interpolator used 440 relays, and since it was programmable by paper tape, was used for other applications following the war.

Curt Herzstark designs Curta calculator


Curta Model 1 calculator

Curt Herzstark was an Austrian engineer who worked in his family’s manufacturing business until he was arrested by the Nazis in 1943. While imprisoned at Buchenwald concentration camp for the rest of World War II, he refines his pre-war design of a calculator featuring a modified version of Leibniz’s “stepped drum” design. After the war, Herzstark’s Curta made history as the smallest all-mechanical, four-function calculator ever built.

First Colossus operational at Bletchley Park


The Colossus at work at Bletchley Park

Designed by British engineer Tommy Flowers, the Colossus is built to break the complex Lorenz ciphers used by the Nazis during World War II. A total of ten Colossi were delivered, each using as many as 2,500 vacuum tubes. A series of pulleys transported continuous rolls of punched paper tape containing possible solutions to a particular code. Colossus reduced the time to break Lorenz messages from weeks to hours. Most historians believe that the use of Colossus machines significantly shortened the war by providing evidence of enemy intentions and beliefs. The machine’s existence was not made public until the 1970s.

Harvard Mark 1 is completed


Conceived by Harvard physics professor Howard Aiken, and designed and built by IBM, the Harvard Mark 1 is a room-sized, relay-based calculator. The machine had a fifty-foot long camshaft running its entire length that synchronized its thousands of component parts, and it used 3,500 relays. The Mark 1 produced mathematical tables but was soon superseded by electronic stored-program computers.

John von Neumann writes First Draft of a Report on the EDVAC


John von Neumann

In a widely circulated paper, mathematician John von Neumann outlines the architecture of a stored-program computer, including electronic storage of programming information and data -- which eliminates the need for more clumsy methods of programming such as plugboards, punched cards and paper tape. Hungarian-born von Neumann demonstrated prodigious expertise in hydrodynamics, ballistics, meteorology, game theory, statistics, and the use of mechanical devices for computation. After the war, he concentrated on the development of Princeton's Institute for Advanced Study computer.

Moore School lectures take place


The Moore School Building at the University of Pennsylvania

An inspiring summer school on computing at the University of Pennsylvania's Moore School of Electrical Engineering stimulates construction of stored-program computers at universities and research institutions in the US, France, the UK, and Germany. Among the lecturers were early computer designers like John von Neumann, Howard Aiken, J. Presper Eckert and John Mauchly, as well as mathematicians including Derrick Lehmer, George Stibitz, and Douglas Hartree. Students included future computing pioneers such as Maurice Wilkes, Claude Shannon, David Rees, and Jay Forrester. This free, public set of lectures inspired the EDSAC, BINAC, and, later, IAS machine clones like the AVIDAC.

Project Whirlwind begins


Whirlwind installation at MIT

During World War II, the US Navy approaches the Massachusetts Institute of Technology (MIT) about building a flight simulator to train bomber crews. Under the leadership of MIT's Gordon Brown and Jay Forrester, the team first built a small analog simulator, but found it inaccurate and inflexible. News of the groundbreaking electronic ENIAC computer that same year inspired the group to change course and attempt a digital solution, whereby flight variables could be rapidly programmed in software. Completed in 1951, Whirlwind remains one of the most important computer projects in the history of computing. Foremost among its developments was Forrester’s perfection of magnetic core memory, which became the dominant form of high-speed random access memory for computers until the mid-1970s.

Public unveiling of ENIAC


Started in 1943, the ENIAC computing system was built by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering of the University of Pennsylvania. Because of its electronic, as opposed to electromechanical, technology, it is over 1,000 times faster than any previous computer. ENIAC used panel-to-panel wiring and switches for programming, occupied more than 1,000 square feet, used about 18,000 vacuum tubes and weighed 30 tons. It was believed that ENIAC had done more calculation over the ten years it was in operation than all of humanity had until that time.

First Computer Program to Run on a Computer


Kilburn (left) and Williams in front of 'Baby'

University of Manchester researchers Frederic Williams, Tom Kilburn, and Geoff Toothill develop the Small-Scale Experimental Machine (SSEM), better known as the Manchester "Baby." The Baby was built to test a new memory technology developed by Williams and Kilburn -- soon known as the Williams Tube -- which was the first high-speed electronic random access memory for computers. Their first program, consisting of seventeen instructions and written by Kilburn, ran on June 21st, 1948. This was the first program in history to run on a digital, electronic, stored-program computer.

SSEC goes on display


IBM Selective Sequence Electronic Calculator (SSEC)

The Selective Sequence Electronic Calculator (SSEC) project, led by IBM engineer Wallace Eckert, uses both relays and vacuum tubes to process scientific data at a rate of fifty 14-by-14-digit multiplications per second. Before its decommissioning in 1952, the SSEC produced the moon position tables used in early planning of the 1969 Apollo XII moon landing. These tables were later confirmed by using more modern computers for the actual flights. The SSEC was one of the last of the generation of 'super calculators' to be built using electromechanical technology.

CSIRAC runs first program


While many early digital computers were based on similar designs, such as the IAS and its copies, others were unique designs, like the CSIRAC. Built in Sydney, Australia by the Council for Scientific and Industrial Research for use in its Radiophysics Laboratory, CSIRAC was designed by British-born Trevor Pearcey, and used unusual 12-hole paper tape. It was transferred to the Department of Physics at the University of Melbourne in 1955 and remained in service until 1964.

EDSAC completed


The first practical stored-program computer to provide a regular computing service, EDSAC is built at Cambridge University using vacuum tubes and mercury delay lines for memory. The EDSAC project was led by Cambridge professor and director of the Cambridge Computation Laboratory, Maurice Wilkes. Wilkes' ideas grew out of the Moore School lectures he had attended three years earlier. One major advance in programming was Wilkes' use of a library of short programs, called “subroutines,” stored on punched paper tapes and used for performing common repetitive calculations within a larger program.

MADDIDA developed


MADDIDA (Magnetic Drum Digital Differential Analyzer) prototype

MADDIDA is a digital drum-based differential analyzer. This type of computer is useful in performing many of the mathematical equations scientists and engineers encounter in their work. It was originally created for a nuclear missile design project in 1949 by a team led by Fred Steele. It used 53 vacuum tubes and hundreds of germanium diodes, with a magnetic drum for memory. Tracks on the drum did the mathematical integration. MADDIDA was flown across the country for a demonstration to John von Neumann, who was impressed. Northrop, where the machine was developed, was initially reluctant to make MADDIDA a commercial product, but by the end of 1952, six had been sold.
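
The digital differential analyzer principle behind machines like MADDIDA can be sketched in a few lines: an integrator repeatedly adds the integrand into an accumulator and emits an overflow increment whenever the accumulator wraps, so the stream of increments approximates the integral. The code below is a generic illustration of that scheme, not a reconstruction of MADDIDA's drum tracks.

```python
def dda_integrate(y_samples, dx, scale=1 << 16):
    """Toy digital-differential-analyzer integrator: accumulate scaled y*dx
    increments and emit one 'dz' pulse each time the accumulator overflows."""
    acc, pulses = 0, 0
    for y in y_samples:
        acc += round(y * dx * scale)
        while acc >= scale:       # overflow -> emit an increment pulse
            acc -= scale
            pulses += 1
    return pulses + acc / scale   # whole pulses plus the leftover fraction

# Integrate y = x from 0 to 1 in 1000 steps; the true value is 0.5.
n = 1000
print(dda_integrate([i / n for i in range(n)], dx=1 / n))   # roughly 0.499
```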

Manchester Mark I completed


Manchester Mark I

Built by a team led by engineers Frederick Williams and Tom Kilburn, the Mark I serves as the prototype for Ferranti’s first computer – the Ferranti Mark 1. The Manchester Mark I used more than 1,300 vacuum tubes and occupied an area the size of a medium room. Its “Williams-Kilburn tube” memory system was later adopted by several other early computer systems around the world.

ERA 1101 introduced


One of the first commercially produced computers, the ERA 1101 counted the US Navy as its first customer. The 1101, designed by ERA but built by Remington-Rand, was intended for high-speed computing and stored 1 million bits on its magnetic drum, one of the earliest magnetic storage devices and a technology which ERA had done much to perfect in its own laboratories. Many of the 1101's basic architectural details were used again in later Remington-Rand computers until the 1960s.

NPL Pilot ACE completed


Based on ideas from Alan Turing, Britain's Pilot ACE computer is constructed at the National Physical Laboratory. "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus," Turing said at a symposium on large-scale digital calculating machinery in 1947 in Cambridge, Massachusetts. The design packed 800 vacuum tubes into a relatively compact 12 square feet.

Plans to build the Simon 1 relay logic machine are published


Simon featured on the November 1950 Scientific American cover

The hobbyist magazine Radio Electronics publishes Edmund Berkeley's design for the Simon 1 relay computer in a series of articles running from 1950 to 1951. The Simon 1 used relay logic and cost about $600 to build. In his book Giant Brains, Berkeley noted: “We shall now consider how we can design a very simple machine that will think. Let us call it Simon, because of its predecessor, Simple Simon... Simon is so simple and so small in fact that it could be built to fill up less space than a grocery-store box; about four cubic feet.”

SEAC and SWAC completed


The Standards Eastern Automatic Computer (SEAC) is among the first stored-program computers completed in the United States. It was built in Washington DC as a test-bed for evaluating components and systems as well as for setting computer standards. It was also one of the first computers to use all-diode logic, a technology more reliable than vacuum tubes. The world's first scanned image was made on SEAC by engineer Russell Kirsch in 1957.

The National Bureau of Standards (NBS) also built the Standards Western Automatic Computer (SWAC) at the Institute for Numerical Analysis on the UCLA campus. Rather than testing components like the SEAC, the SWAC was built using already-developed technology. SWAC was used to solve problems in numerical analysis, including developing climate models and discovering five previously unknown Mersenne prime numbers.
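
SEAC's all-diode logic refers to gates built from diodes and resistors, with vacuum tubes serving mainly to re-amplify signals between stages. Functionally such gates compute AND and OR, which a toy model can show; this is a logical sketch only, not a circuit-level description of SEAC.

```python
def diode_and(*inputs):
    """Diode-resistor AND: the output is held high through a resistor and is
    pulled low by any conducting input diode, so it is 1 only if all inputs are 1."""
    return 1 if all(inputs) else 0

def diode_or(*inputs):
    """Diode-resistor OR: the output follows the highest input through its
    diode, so it is 1 if any input is 1."""
    return 1 if any(inputs) else 0

print(diode_and(1, 1, 0), diode_or(1, 0, 0))   # -> 0 1
```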

Ferranti Mark I sold


Ferranti Mark 1

The title of “first commercially available general-purpose computer” probably goes to Britain’s Ferranti Mark I, the first of which was sold to Manchester University. The Mark 1 was a refinement of the experimental Manchester “Baby” and Manchester Mark 1 computers, also at Manchester University. A British government contract spurred its initial development, but a change in government led to a loss of funding, and the second and only other Mark I was sold at a major loss to the University of Toronto, where it was re-christened FERUT.

First Univac 1 delivered to US Census Bureau


Univac 1 installation

The Univac 1 is the first commercial computer to attract widespread public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as “the IBM Univac." Univac computers were used in many different applications but utilities, insurance companies and the US military were major customers. One biblical scholar even used a Univac 1 to compile a concordance to the King James version of the Bible. Created by Presper Eckert and John Mauchly -- designers of the earlier ENIAC computer -- the Univac 1 used 5,200 vacuum tubes and weighed 29,000 pounds. Remington Rand eventually sold 46 Univac 1s at more than $1 million each.

J. Lyons & Company introduce LEO-1


Modeled after the Cambridge University EDSAC computer, the LEO is built at the instigation of the president of Lyons Tea Co. to solve the problem of production scheduling and delivery of cakes to the hundreds of Lyons tea shops around England. After the success of the first LEO, Lyons went into business manufacturing computers to meet the growing need for data processing systems in business. The LEO was England's first commercial computer and was performing useful work before any other commercial computer system in the world.

IAS computer operational


MANIAC at Los Alamos

The Institute for Advanced Study (IAS) computer is a multi-year research project conducted under the overall supervision of world-famous mathematician John von Neumann. The notion of storing both data and instructions in memory became known as the ‘stored program concept’ to distinguish it from earlier methods of instructing a computer. The IAS computer was designed for scientific calculations and it performed essential work for the US atomic weapons program. Over the next few years, the basic design of the IAS machine was copied in at least 17 places and given similar-sounding names, for example, the MANIAC at Los Alamos Scientific Laboratory; the ILLIAC at the University of Illinois; the Johnniac at The Rand Corporation; and the SILLIAC in Australia.

Grimsdale and Webb build early transistorized computer


Manchester transistorized computer

Working under Tom Kilburn at England’s Manchester University, Richard Grimsdale and Douglas Webb demonstrate a prototype transistorized computer, the "Manchester TC", on November 16, 1953. The 48-bit machine used 92 point-contact transistors and 550 diodes.

IBM ships its Model 701 Electronic Data Processing Machine


Cuthbert Hurd (standing) and Thomas Watson, Sr. at IBM 701 console

During three years of production, IBM sells 19 701s to research laboratories, aircraft companies, and the federal government. Also known inside IBM as the “Defense Calculator," the 701 rented for $15,000 a month. Programmer Arthur Samuel used the 701 to write the first computer program designed to play checkers. The 701 introduction also marked the beginning of IBM’s entry into the large-scale computer market, a market it came to dominate in later decades.

RAND Corporation completes Johnniac computer


RAND Corporation’s Johnniac

The Johnniac computer is one of 17 computers that followed the basic design of Princeton's Institute for Advanced Study (IAS) computer. It was named after John von Neumann, a world-famous mathematician and computer pioneer of the day. Johnniac was used for scientific and engineering calculations. It was also repeatedly expanded and improved throughout its 13-year lifespan. Many innovative programs were created for Johnniac, including the time-sharing system JOSS that allowed many users to simultaneously access the machine.

IBM 650 magnetic drum calculator introduced


IBM establishes the 650 as its first mass-produced computer, with the company selling 450 in just one year. Spinning at 12,500 rpm, the 650's magnetic data-storage drum allowed much faster access to stored information than other drum-based machines. The Model 650 was also highly popular in universities, where a generation of students first learned programming.

English Electric DEUCE introduced


English Electric DEUCE

A commercial version of Alan Turing's Pilot ACE, called DEUCE -- the Digital Electronic Universal Computing Engine -- is used mostly for science and engineering problems and a few commercial applications. Over 30 were completed, including one delivered to Australia.

Direct keyboard input to computers


Joe Thompson at Whirlwind console, ca. 1951

At MIT, researchers begin experimenting with direct keyboard input to computers, a precursor to today's normal mode of operation. Typically, computer users of the time fed their programs into a computer using punched cards or paper tape. Doug Ross wrote a memo advocating direct access in February. Ross contended that a Flexowriter -- an electrically-controlled typewriter -- connected to an MIT computer could function as a keyboard input device due to its low cost and flexibility. An experiment conducted five months later on the MIT Whirlwind computer confirmed how useful and convenient a keyboard input device could be.

Librascope LGP-30 introduced


Physicist Stan Frankel, intrigued by small, general-purpose computers, developed the MINAC at Caltech. The Librascope division of defense contractor General Precision buys Frankel’s design, renaming it the LGP-30 in 1956. Used for science and engineering as well as simple data processing, the LGP-30 was a “bargain” at less than $50,000 and an early example of a ‘personal computer,’ that is, a computer made for a single user.

MIT researchers build the TX-0


TX-0 at MIT

The TX-0 (“Transistor eXperimental - 0”) is the first general-purpose programmable computer built with transistors. For easy replacement, designers placed each transistor circuit inside a "bottle," similar to a vacuum tube. Constructed at MIT's Lincoln Laboratory, the TX-0 moved to the MIT Research Laboratory of Electronics, where it hosted some early imaginative tests of programming, including writing a Western movie shown on television, 3-D tic-tac-toe, and a maze in which a mouse found martinis and became increasingly inebriated.

Digital Equipment Corporation (DEC) founded


The Maynard mill

DEC is founded initially to make electronic modules for test, measurement, prototyping and control markets. Its founders were Ken and Stan Olsen, and Harlan Anderson. Headquartered in Maynard, Massachusetts, Digital Equipment Corporation took over 8,680 square feet of leased space in a nineteenth century mill that once produced blankets and uniforms for soldiers who fought in the Civil War. General Georges Doriot and his pioneering venture capital firm, American Research and Development, invested $70,000 for 70% of DEC’s stock to launch the company in 1957. The mill is still in use today as an office park (Clock Tower Place).

RCA introduces its Model 501 transistorized computer


RCA 501 brochure cover

The 501 is built on a 'building block' concept which allows it to be highly flexible for many different uses and to control up to 63 tape drives simultaneously—very useful for large databases of information. For many business users, quick access to this huge storage capability outweighed its relatively slow processing speed. Customers included the US military as well as industry.

SAGE system goes online


SAGE Operator Station

The first large-scale computer communications network, SAGE connects 23 hardened computer sites in the US and Canada. Its task was to detect incoming Soviet bombers and direct interceptor aircraft to destroy them. Operators directed actions by touching a light gun to the SAGE airspace display. The air defense system used two AN/FSQ-7 computers, each of which used a full megawatt of power to drive its 55,000 vacuum tubes, 175,000 diodes and 13,000 transistors.

DEC PDP-1 introduced


Ed Fredkin at DEC PDP-1

The typical PDP-1 computer system, which sells for about $120,000, includes a cathode ray tube graphic display and paper tape input/output, needs no air conditioning, and requires only one operator -- all of which become standards for minicomputers. Its large scope intrigued early hackers at MIT, who wrote the first computerized video game, SpaceWar!, as well as programs to play music. More than 50 PDP-1s were sold.

NEAC 2203 goes online


NEAC 2203 transistorized computer

An early transistorized computer, the NEAC (Nippon Electric Automatic Computer) includes a CPU, console, paper tape reader and punch, printer and magnetic tape units. It was sold exclusively in Japan, but could process alphabetic and Japanese kana characters. Only about thirty NEACs were sold. It managed Japan's first on-line, real-time reservation system for Kinki Nippon Railways in 1960. The last one was decommissioned in 1979.

IBM 7030 (“Stretch”) completed


IBM Stretch

IBM's 7000 series of mainframe computers are the company's first to use transistors. At the top of the line was the Model 7030, also known as "Stretch." Nine of the computers, which featured dozens of advanced design innovations, were sold, mainly to national laboratories and major scientific users. A special version, known as HARVEST, was developed for the US National Security Agency (NSA). The knowledge and technologies developed for the Stretch project played a major role in the design, management, and manufacture of the later IBM System/360--the most successful computer family in IBM history.

IBM Introduces 1400 series


The 1401 mainframe, the first in the series, replaces earlier vacuum tube technology with smaller, more reliable transistors. Demand called for more than 12,000 of the 1401 computers, and the machine's success made a strong case for using general-purpose computers rather than specialized systems. By the mid-1960s, nearly half of all computers in the world were IBM 1401s.

Minuteman I missile guidance computer developed


Minuteman Guidance computer

Minuteman missiles use transistorized computers to continuously calculate their position in flight. The computer had to be rugged and fast, with advanced circuit design and reliable packaging able to withstand the forces of a missile launch. The military’s high standards for its transistors pushed manufacturers to improve quality control. When the Minuteman I was decommissioned, some universities received these computers for use by students.

Naval Tactical Data System introduced


Naval Tactical Data System (NTDS)

The US Navy Tactical Data System uses computers to integrate and display shipboard radar, sonar and communications data. This real-time information system began operating in the early 1960s. In October 1961, the Navy tested the NTDS on the USS Oriskany carrier and the USS King and USS Mahan frigates. After being successfully used for decades, NTDS was phased out in favor of the newer AEGIS system in the 1980s.

MIT LINC introduced


Wesley Clark with LINC

The LINC is an early and important example of a ‘personal computer,’ that is, a computer designed for only one user. It was designed by MIT Lincoln Laboratory engineer Wesley Clark. Under the auspices of a National Institutes of Health (NIH) grant, biomedical research faculty from around the United States came to a workshop at MIT to build their own LINCs, and then bring them back to their home institutions where they would be used. For research, Digital Equipment Corporation (DEC) supplied the components, and 50 original LINCs were made. The LINC was later commercialized by DEC and sold as the LINC-8.

The Atlas Computer debuts


Chilton Atlas installation

A joint project of England’s Manchester University, Ferranti Computers, and Plessey, Atlas comes online nine years after Manchester’s computer lab begins exploring transistor technology. Atlas was the fastest computer in the world at the time and introduced the concept of “virtual memory,” that is, using a disk or drum as an extension of main memory. System control was provided through the Atlas Supervisor, which some consider to be the first true operating system.
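
The idea behind virtual memory, keeping only some of a program's pages in fast store and fetching the rest from drum when they are first touched, can be sketched generically. The class below illustrates demand paging under those assumptions; it is not a model of Atlas's actual one-level store hardware.

```python
class DemandPagedMemory:
    """Minimal demand-paging sketch: a small fast store ('core') holds a few
    pages; the rest live on a slow 'drum' and are fetched when first touched."""

    def __init__(self, drum_pages, core_slots=4):
        self.drum = drum_pages        # page number -> page contents
        self.core = {}                # pages currently resident in fast store
        self.core_slots = core_slots
        self.faults = 0

    def read(self, page, offset):
        if page not in self.core:     # page fault: bring the page in from drum
            self.faults += 1
            if len(self.core) >= self.core_slots:
                self.core.pop(next(iter(self.core)))   # evict a resident page
            self.core[page] = self.drum[page]
        return self.core[page][offset]

drum = {p: [p * 100 + i for i in range(8)] for p in range(16)}
mem = DemandPagedMemory(drum)
print(mem.read(9, 3), mem.faults)     # -> 903 1 (first touch faults the page in)
```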

CDC 6600 supercomputer introduced


The Control Data Corporation (CDC) 6600 performs up to 3 million instructions per second — three times faster than its closest competitor, the IBM 7030 supercomputer. The 6600 retained the distinction of being the fastest computer in the world until surpassed by its successor, the CDC 7600, in 1968. Part of the speed came from the computer's design, which used 10 small computers, known as peripheral processing units, to offload the workload from the central processor.

Digital Equipment Corporation introduces the PDP-8


PDP-8 advertisement

The Canadian Chalk River Nuclear Lab needed a special device to monitor a reactor. Instead of designing a custom controller, two young engineers from Digital Equipment Corporation (DEC) -- Gordon Bell and Edson de Castro -- do something unusual: they develop a small, general purpose computer and program it to do the job. A later version of that machine became the PDP-8, the first commercially successful minicomputer. The PDP-8 sold for $18,000, one-fifth the price of a small IBM System/360 mainframe. Because of its speed, small size, and reasonable cost, the PDP-8 was sold by the thousands to manufacturing plants, small businesses, and scientific laboratories around the world.

IBM announces System/360


IBM 360 Model 40

System/360 is a major event in the history of computing. On April 7, IBM announced five models of System/360, spanning a 50-to-1 performance range. At the same press conference, IBM also announced 40 completely new peripherals for the new family. System/360 was aimed at both business and scientific customers and all models could run the same software, largely without modification. IBM’s initial investment of $5 billion was quickly returned as orders for the system climbed to 1,000 per month within two years. At the time IBM released the System/360, the company had just made the transition from discrete transistors to integrated circuits, and its major source of revenue began to move from punched card equipment to electronic computer systems.

SABRE comes on-line


Airline reservation agents working with SABRE

SABRE is a joint project between American Airlines and IBM. Operational by 1964, it was not the first computerized reservation system, but it was well publicized and became very influential. Running on dual IBM 7090 mainframe computer systems, SABRE was inspired by IBM’s earlier work on the SAGE air-defense system. Eventually, SABRE expanded, even making airline reservations available via on-line services such as CompuServe, Genie, and America Online.

Teletype introduced its ASR-33 Teletype


Student using ASR-33

At a cost to computer makers of roughly $700, the ASR-33 Teletype is originally designed as a low cost terminal for the Western Union communications network. Throughout the 1960s and ‘70s, the ASR-33 was a popular and inexpensive choice of input and output device for minicomputers and many of the first generation of microcomputers.

3C DDP-116 introduced


DDP-116 General Purpose Computer

Designed by engineer Gardner Hendrie for Computer Control Corporation (CCC), the DDP-116 is announced at the 1965 Spring Joint Computer Conference. It was the world's first commercial 16-bit minicomputer and 172 systems were sold. The basic computer cost $28,500.

Olivetti Programma 101 is released


Olivetti Programma 101

Announced the year previously at the New York World's Fair, the Programma 101 goes on sale. This printing programmable calculator was made from discrete transistors and an acoustic delay-line memory. The Programma 101 could do addition, subtraction, multiplication, and division, as well as calculate square roots. 40,000 were sold, including 10 to NASA for use on the Apollo space project.

HP introduces the HP 2116A


HP 2116A system

The 2116A is HP’s first computer. It was developed as a versatile instrument controller for HP's growing family of programmable test and measurement products. It interfaced with a wide number of standard laboratory instruments, allowing customers to computerize their instrument systems. The 2116A also marked HP's first use of integrated circuits in a commercial product.

ILLIAC IV project begins


A large parallel processing computer, the ILLIAC IV does not operate until 1972. It was eventually housed at NASA's Ames Research Center in Mountain View, California. The most ambitious massively parallel computer at the time, the ILLIAC IV was plagued with design and production problems. Once finally completed, it achieved a computational speed of 200 million instructions per second and 1 billion bits per second of I/O transfer via a unique combination of its parallel architecture and the overlapping or "pipelining" structure of its 64 processing elements.

RCA announces its Spectra series of computers


Image from RCA Spectra-70 brochure

The first large commercial computers to use integrated circuits, the Spectra series is promoted by RCA on the strength of the IC's advantage over IBM's custom SLT modules. Spectra systems were marketed on the basis of their compatibility with the IBM System/360 series of computers, since they implemented the IBM 360 instruction set and could run most IBM software with little or no modification.

Apollo Guidance Computer (AGC) makes its debut


DSKY interface for the Apollo Guidance Computer

Designed by scientists and engineers at MIT’s Instrumentation Laboratory, the Apollo Guidance Computer (AGC) is the culmination of years of work to reduce the size of the Apollo spacecraft computer from the size of seven refrigerators side-by-side to a compact unit weighing only 70 lbs. and taking up a volume of less than 1 cubic foot. The AGC’s first flight was on Apollo 7. A year later, it steered Apollo 11 to the lunar surface. Astronauts communicated with the computer by punching two-digit codes into the display and keyboard unit (DSKY). The AGC was one of the earliest uses of integrated circuits, and used core memory, as well as read-only magnetic rope memory. The astronauts were responsible for entering more than 10,000 commands into the AGC for each trip between Earth and the Moon.

Data General Corporation introduces the Nova Minicomputer


Edson deCastro with a Data General Nova

Started by a group of engineers that left Digital Equipment Corporation (DEC), Data General designs the Nova minicomputer. It had 32 KB of memory and sold for $8,000. Ed de Castro, its main designer and co-founder of Data General, had earlier led the team that created the DEC PDP-8. The Nova line of computers continued through the 1970s, and influenced later systems like the Xerox Alto and Apple 1.

Amdahl Corporation introduces the Amdahl 470


Gene Amdahl with 470V/6 model

Gene Amdahl, father of the IBM System/360, starts his own company, Amdahl Corporation, to compete with IBM in mainframe computer systems. The 470V/6 was the company’s first product and ran the same software as IBM System/370 computers but cost less and was smaller and faster.

First Kenbak-1 is sold


One of the earliest personal computers, the Kenbak-1 is advertised for $750 in Scientific American magazine. Designed by John V. Blankenbaker using standard medium- and small-scale integrated circuits, the Kenbak-1 relied on switches for input and lights for output from its 256-byte memory. In 1973, after selling only 40 machines, Kenbak Corporation closed its doors.

Hewlett-Packard introduces the HP-35


HP-35 handheld calculator

Initially intended for internal use by HP employees, the HP-35 grew out of a challenge co-founder Bill Hewlett issued to his engineers in 1971: fit all of the features of their desktop scientific calculator into a package small enough for his shirt pocket. They did. Marketed as “a fast, extremely accurate electronic slide rule” with a solid-state memory similar to that of a computer, the HP-35 distinguished itself from its competitors by its ability to perform a broad variety of logarithmic and trigonometric functions, to store more intermediate solutions for later use, and to accept and display entries in a form similar to standard scientific notation. The HP-35 helped HP become one of the most dominant companies in the handheld calculator market for more than two decades.

Intel introduces the first microprocessor


Advertisement for Intel's 4004

Computer History Museum

The first advertisement for a microprocessor, the Intel 4004, appears in Electronic News. Developed for Busicom, a Japanese calculator maker, the 4004 had 2250 transistors and could perform up to 90,000 operations per second in four-bit chunks. Federico Faggin led the design and Ted Hoff led the architecture.

Laser printer invented at Xerox PARC


Dover laser printer

Xerox PARC physicist Gary Starkweather realizes in 1967 that exposing a copy machine’s light-sensitive drum to a paper original isn’t the only way to create an image. A computer could “write” it with a laser instead. Xerox wasn’t interested. So in 1971, Starkweather transferred to Xerox Palo Alto Research Center (PARC), away from corporate oversight. Within a year, he had built the world’s first laser printer, launching a new era in computer printing, generating billions of dollars in revenue for Xerox. The laser printer was used with PARC’s Alto computer, and was commercialized as the Xerox 9700.

IBM SCAMP is developed


Dr. Paul Friedl with SCAMP prototype

Under the direction of engineer Dr. Paul Friedl, the Special Computer APL Machine Portable (SCAMP) personal computer prototype is developed at IBM's Los Gatos and Palo Alto, California laboratories. IBM’s first personal computer, the system was designed to run the APL programming language in a compact, briefcase-like enclosure which comprised a keyboard, CRT display, and cassette tape storage. Friedl used the SCAMP prototype to gain approval within IBM to promote and develop IBM’s 5100 family of computers, including the most successful, the 5150, also known as the IBM Personal Computer (PC), introduced in 1981. From concept to finished system, SCAMP took only six months to develop.

Micral is released


Based on the Intel 8008 microprocessor, the Micral is one of the earliest commercial, non-kit personal computers. Designer Thi Truong developed the computer while Philippe Kahn wrote the software. Truong, founder and president of the French company R2E, created the Micral as a replacement for minicomputers in situations that did not require high performance, such as process control and highway toll collection. Selling for $1,750, the Micral never penetrated the U.S. market. In 1979, Truong sold R2E to Bull.

The TV Typewriter plans are published


TV Typewriter

Designed by Don Lancaster, the TV Typewriter is an easy-to-build kit that can display alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of hobbyist magazine Radio Electronics. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A cassette tape interface provided supplementary storage for text. The TV Typewriter was used by many small television stations well into the 1990s.

Wang Laboratories releases the Wang 2200


Wang was a successful calculator manufacturer, then a successful word processor company. The 1973 Wang 2200 makes it a successful computer company, too. Wang sold the 2200 primarily through Value Added Resellers, who added special software to solve specific customer problems. The 2200 used a built-in CRT, cassette tape for storage, and ran the programming language BASIC. The PC era ended Wang’s success, and it filed for bankruptcy in 1992.

Scelbi advertises its 8H computer


The first commercially advertised US computer based on a microprocessor (the Intel 8008), the Scelbi has 4 KB of internal memory and a cassette tape interface, as well as Teletype and oscilloscope interfaces. Scelbi aimed the 8H, available both in kit form and fully assembled, at scientific, electronic, and biological applications. In 1975, Scelbi introduced the 8B version with 16 KB of memory for the business market. The company sold about 200 machines, losing $500 per unit.

The Mark-8 appears in the pages of Radio-Electronics


Mark-8 featured on Radio-Electronics July 1974 cover

The Mark-8 “Do-It-Yourself” kit is designed by graduate student John Titus and uses the Intel 8008 microprocessor. The kit was the cover story of hobbyist magazine Radio-Electronics in July 1974 – six months before the MITS Altair 8800 appeared in rival Popular Electronics magazine. Plans for the Mark-8 cost $5 and the blank circuit boards were available for $50.

Xerox PARC Alto introduced


The Alto is a groundbreaking computer with wide influence on the computer industry. It was based on a graphical user interface using windows, icons, and a mouse, and worked together with other Altos over a local area network. It could also share files and print out documents on an advanced Xerox laser printer. Applications were also highly innovative: a WYSIWYG word processor known as “Bravo,” a paint program, a graphics editor, and email, for example. Apple’s inspiration for the Lisa and Macintosh computers came from the Xerox Alto.

MITS Altair 8800 kit appears in Popular Electronics


Altair 8800

For its January issue, hobbyist magazine Popular Electronics runs a cover story of a new computer kit – the Altair 8800. Within weeks of its appearance, customers inundated its maker, MITS, with orders. Bill Gates and Paul Allen licensed their BASIC programming language interpreter to MITS as the main language for the Altair. MITS co-founder Ed Roberts invented the Altair 8800 — which sold for $297, or $395 with a case — and coined the term “personal computer”. The machine came with 256 bytes of memory (expandable to 64 KB) and an open 100-line bus structure that evolved into the “S-100” standard widely used in hobbyist and personal computers of this era. In 1977, MITS was sold to Pertec, which continued producing Altairs in 1978.

MOS 6502 is introduced


MOS 6502 ad from IEEE Computer, Sept. 1975

Chuck Peddle leads a small team of former Motorola employees to build a low-cost microprocessor. The MOS 6502 was introduced at a conference in San Francisco at a cost of $25, far less than comparable processors from Intel and Motorola, leading some attendees to believe that the company was perpetrating a hoax. The chip quickly became popular with designers of early personal computers like the Apple II and Commodore PET, as well as game consoles like the Nintendo Entertainment System. The 6502 and its progeny are still used today, usually in embedded applications.

Southwest Technical Products introduces the SWTPC 6800


Southwest Technical Products 6800

Southwest Technical Products is founded by Daniel Meyer as DEMCO in the 1960s to provide a source for kit versions of projects published in electronics hobbyist magazines. SWTPC introduces many computer kits based on the Motorola 6800, and later, the 6809. Of the dozens of different SWTP kits available, the 6800 proved the most popular.

Tandem Computers releases the Tandem-16


Dual-processor Tandem 16 system

Tailored for online transaction processing, the Tandem-16 is one of the first commercial fault-tolerant computers. The banking industry rushed to adopt the machine, built to run during repair or expansion. The Tandem-16 eventually led to the “Non-Stop” series of systems, which were used for early ATMs and to monitor stock trades.

VDM prototype built


The Video Display Module (VDM)

The Video Display Module (VDM) marks the first implementation of a memory-mapped alphanumeric video display for personal computers. Introduced at the Altair Convention in Albuquerque in March 1976, the visual display module enabled the use of personal computers for interactive games.

Cray-1 supercomputer introduced


Cray I 'Self-portrait'

The fastest machine of its day, the Cray-1 owes its speed partly to its shape, a "C," which reduces the length of wires and thus the time signals need to travel across them. High packaging density of integrated circuits and a novel Freon cooling system also contributed to its speed. Each Cray-1 took a full year to assemble and test and cost about $10 million. Typical applications included US national defense work, including the design and simulation of nuclear weapons, and weather forecasting.
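
The wire-length argument is easy to quantify with a back-of-the-envelope calculation. The figures below are illustrative rather than Cray specifications, apart from the roughly 12.5-nanosecond (80 MHz) clock period usually quoted for the Cray-1.

```python
# Why shorter wires matter at nanosecond clock periods (illustrative numbers).
c = 3.0e8                  # speed of light, m/s
v = 0.6 * c                # rough signal speed along a wire, as a fraction of c
clock_period_ns = 12.5     # an 80 MHz clock

for wire_m in (1.2, 0.6):  # e.g. a straight run vs. one halved by the "C" layout
    delay_ns = wire_m / v * 1e9
    print(f"{wire_m} m of wire: {delay_ns:.1f} ns "
          f"({delay_ns / clock_period_ns:.0%} of one clock period)")
```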

Intel 8080 and Zilog Z-80


Zilog Z-80 microprocessor

Image by Gennadiy Shvets

Intel and Zilog introduced new microprocessors. Five times faster than its predecessor, the 8008, the Intel 8080 could address four times as many bytes for a total of 64 kilobytes. The Zilog Z-80 could run any program written for the 8080 and included twice as many built-in machine instructions.
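
The "four times as many bytes" follows directly from address width: the 8008 used a 14-bit address (16 KB of reachable memory), while the 8080's 16-bit address reaches 64 KB. A two-line check of the arithmetic:

```python
# Address reach grows as a power of two with address width.
for chip, bits in (("Intel 8008", 14), ("Intel 8080", 16)):
    print(f"{chip}: {bits}-bit address -> {2 ** bits // 1024} KB")
# Intel 8008: 14-bit address -> 16 KB
# Intel 8080: 16-bit address -> 64 KB  (four times as many bytes)
```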

Steve Wozniak completes the Apple-1


Designed by Sunnyvale, California native Steve Wozniak, and marketed by his friend Steve Jobs, the Apple-1 is a single-board computer for hobbyists. With an order for 50 assembled systems from Mountain View, California computer store The Byte Shop in hand, the pair started a new company, naming it Apple Computer, Inc. In all, about 200 of the boards were sold before Apple announced the follow-on Apple II a year later as a ready-to-use computer for consumers, a model which sold in the millions for nearly two decades.

Apple II introduced


Sold complete with a main logic board, switching power supply, keyboard, case, manual, game paddles, and cassette tape containing the game Breakout, the Apple-II finds popularity far beyond the hobbyist community that had made up Apple's user base until then. When connected to a color television set, the Apple II produced brilliant color graphics for the time. Millions of Apple IIs were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to schools, giving a new generation their first access to personal computers.

Tandy Radio Shack introduces its TRS-80


Performing far better than the company's projection of 3,000 units for the first year, Tandy Radio Shack's first desktop computer — the TRS-80 — sells 10,000 units in the first month after its release. The TRS-80 was priced at $599.95 and included a Z80 microprocessor, video display, 4 KB of memory, a built-in BASIC programming language interpreter, cassette storage, and easy-to-understand manuals that assumed no prior knowledge on the part of the user. The TRS-80 proved popular with schools, as well as for home use. The TRS-80 line of computers later included color, portable, and handheld versions before being discontinued in the early 1990s.

The Commodore PET (Personal Electronic Transactor) introduced

Commodore PET

The first of several personal computers released in 1977, the PET comes fully assembled with either 4 or 8 KB of memory, a built-in cassette tape drive, and a membrane keyboard. The PET was popular with schools and for use as a home computer. It used a MOS Technology 6502 microprocessor running at 1 MHz. After the success of the PET, Commodore remained a major player in the personal computer market into the 1990s.

The DEC VAX introduced

DEC VAX 11/780

Beginning with the VAX-11/780, the Digital Equipment Corporation (DEC) VAX family of computers rivals much more expensive mainframe computers in performance and features the ability to address over 4 GB of virtual memory, hundreds of times the capacity of most minicomputers. Called a “complex instruction set computer,” VAX systems were backward compatible and so preserved the investment owners of previous DEC computers had in software. The success of the VAX family of computers transformed DEC into the second-largest computer company in the world, as VAX systems became the de facto standard computing system for industry, the sciences, engineering, and research.

Atari introduces its Model 400 and 800 computers

Early Atari 400/800 advertisement

Shortly after delivery of the Atari VCS game console, Atari designs two microcomputers with game capabilities: the Model 400 and Model 800. The 400 served primarily as a game console, while the 800 was more of a home computer. Both faced strong competition from the Apple II, Commodore PET, and TRS-80 computers. Atari's 8-bit computers were influential in the arts, especially in the emerging DemoScene culture of the 1980s and '90s.

Motorola introduces the 68000 microprocessor

Die shot of Motorola 68000

Image by Pauli Rautakorpi

The Motorola 68000 microprocessor exhibited a processing speed far greater than that of its contemporaries. This high-performance processor found its place in powerful workstations intended for graphics-intensive programs common in engineering.

Texas Instruments TI 99/4 is released

Texas Instruments TI 99/4 microcomputer

Based around the Texas Instruments TMS 9900 microprocessor running at 3 MHz, the TI 99/4 has one of the fastest CPUs available in a home computer. The TI 99/4 had a wide variety of expansion boards, with an especially popular speech synthesis system that could also be used with TI's Speak & Spell educational game. The TI 99/4 sold well and led to a series of TI follow-on machines.

Commodore introduces the VIC-20

Commodore VIC-20

Commodore releases the VIC-20 home computer as the successor to the Commodore PET personal computer. Intended to be a less expensive alternative to the PET, the VIC-20 was highly successful, becoming the first computer to sell more than a million units. Commodore even used Star Trek television star William Shatner in advertisements.

The Sinclair ZX80 introduced

Sinclair ZX80

This very small home computer is available in the UK as a kit for £79 or pre-assembled for £99. Inside was a Z80 microprocessor and a built-in BASIC language interpreter. Output was displayed on the user’s home TV screen through use of an adapter. About 50,000 were sold in Britain, primarily to hobbyists, and initially there was a long waiting list for the system.

The Computer Programme debuts on the BBC

Title card- BBC’s The Computer Programme

The British Broadcasting Corporation’s Computer Literacy Project hoped “to introduce interested adults to the world of computers.” Acorn produced a popular computer, the BBC Microcomputer System, so viewers at home could follow along on their own home computers as they watched the program. The machine was expandable, with ports for cassette storage, serial interface and rudimentary networking. A large amount of software was created for the “BBC Micro,” including educational, productivity, and game programs.

Apollo Computer unveils its first workstation, its DN100

Apollo DN100

The DN100 combines a Motorola 68000 microprocessor, a high-resolution display, and built-in networking - the three basic features of all workstations. Apollo and its main competitor, Sun Microsystems, optimized their machines to run the computer-intensive graphics programs common in engineering and scientific applications. Apollo was a leading innovator in the workstation field for more than a decade, and was acquired by Hewlett-Packard in 1989.

IBM introduces its Personal Computer (PC)

IBM's brand recognition, along with a massive marketing campaign, ignites the fast growth of the personal computer market with the announcement of its own personal computer (PC). The first IBM PC, formally known as the IBM Model 5150, was based on a 4.77 MHz Intel 8088 microprocessor and used Microsoft's MS-DOS operating system. The IBM PC revolutionized business computing by becoming the first PC to gain widespread adoption by industry. The IBM PC was widely copied (“cloned”) and led to the creation of a vast “ecosystem” of software, peripherals, and other commodities for use with the platform.

Osborne 1 introduced

Weighing 24 pounds and costing $1,795, the Osborne 1 is the first mass-produced portable computer. Its price was especially attractive as the computer included very useful productivity software worth about $1,500 alone. It featured a 5-inch display, 64 KB of memory, a modem, and two 5.25-inch floppy disk drives.

Commodore introduces the Commodore 64

Commodore 64 system

The C64, as it is better known, sells for $595, comes with 64 KB of RAM and features impressive graphics. Thousands of software titles were released over the lifespan of the C64 and by the time it was discontinued in 1993, it had sold more than 22 million units. It is recognized by the 2006 Guinness Book of World Records as the greatest selling single computer of all time.

Franklin releases Apple II “clones”

Franklin Ace 100 microcomputer

Created almost five years after the original Apple II, Franklin's Ace 1000 main logic board is nearly identical to that in the Apple II+ computer, and other models were later cloned as well. Franklin was able to undercut Apple's pricing even while offering some features not available on the original. Initially, Franklin won a court victory allowing them to continue cloning the machines, but in 1988, Apple won a copyright lawsuit against Franklin, forcing them to stop making Apple II “clones.”

Sun Microsystems is founded

Sun-1 workstation

When Xerox PARC loaned the Stanford Engineering Department an entire Alto Ethernet network with laser printer, graduate student Andy Bechtolsheim re-designed it into a prototype that he then attached to Stanford’s computer network. Sun Microsystems grows out of this prototype. The roots of the company’s name came from the acronym for Stanford University Network (SUN). The company was incorporated by three 26-year-old Stanford alumni: Bechtolsheim, Vinod Khosla and Scott McNealy. The trio soon attracted UC Berkeley UNIX guru Bill Joy, who led software development. Sun helped cement the model of a workstation having an Ethernet interface as well as high-resolution graphics and the UNIX operating system.

Apple introduces the Lisa computer

Lisa is the first commercial personal computer with a graphical user interface (GUI). It was thus an important milestone in computing, as the Apple Macintosh and Microsoft Windows would soon adopt the GUI as their user interface, making it the new paradigm for personal computing. The Lisa ran on a Motorola 68000 microprocessor and came equipped with 1 MB of RAM, a 12-inch black-and-white monitor, dual 5.25-inch floppy disk drives and a 5 MB “Profile” hard drive. Lisa itself, and especially its GUI, were inspired by earlier work at the Xerox Palo Alto Research Center.

Compaq Computer Corporation introduces the Compaq Portable

Compaq Portable

Advertised as the first 100% IBM PC-compatible computer, the Compaq Portable can run the same software as the IBM PC. With the success of the clone, Compaq recorded first-year sales of $111 million, the most ever by an American business in a single year. The success of the Portable inspired many other early IBM-compatible computers. Compaq licensed the MS-DOS operating system from Microsoft and legally reverse-engineered IBM’s BIOS software. Compaq's success launched a market for IBM-compatible computers that by 1996 had achieved an 83-percent share of the personal computer market.

Apple Computer launches the Macintosh

Apple Macintosh

Apple introduces the Macintosh with a television commercial during the 1984 Super Bowl, which plays on the theme of totalitarianism in George Orwell's book 1984. The ad featured the destruction of “Big Brother” – a veiled reference to IBM – through the power of personal computing found in a Macintosh. The Macintosh was the first successful mouse-driven computer with a graphical user interface and was based on the Motorola 68000 microprocessor. Its price was $2,500. Applications that came as part of the package included MacPaint, which made use of the mouse, and MacWrite, which demonstrated WYSIWYG (What You See Is What You Get) word processing.

IBM releases its PC Jr. and PC/AT

The PC Jr. is marketed as a home computer but is too expensive and limited in performance to compete with many of the other machines in that market. Its “chiclet” keyboard was also criticized for poor ergonomics. While the PC Jr. sold poorly, the PC/AT sold in the millions. It offered increased performance and storage capacity over the original IBM PC and sold for about $4,000. It also included more memory and accommodated high-density 1.2-megabyte 5 1/4-inch floppy disks.

PC's Limited is founded

PC’s Limited founder Michael Dell

In 1984, Michael Dell creates PC's Limited while still a student at the University of Texas at Austin. The dorm-room-headquartered company sold IBM PC-compatible computers built from stock components. Dell dropped out of school to focus on his business, and in 1985 the company produced the first computer of its own design, the Turbo PC, which sold for $795. By the early 1990s, Dell had become one of the leading computer retailers.

The Amiga 1000 is released

Music composition on the Amiga 1000

Commodore’s Amiga 1000 is announced with a major event at New York's Lincoln Center featuring celebrities like Andy Warhol and Debbie Harry of the musical group Blondie. The Amiga sold for $1,295 (without monitor) and had audio and video capabilities beyond those found in most other personal computers. It developed a very loyal following while add-on components allowed it to be upgraded easily. The inside of the Amiga case is engraved with the signatures of the Amiga designers, including Jay Miner as well as the paw print of his dog Mitchy.

Compaq introduces the Deskpro 386 system

Promotional shot of the Compaq Deskpro 386s

Compaq beats IBM to market when it announces the Deskpro 386, the first computer to use Intel's new 80386 chip, a 32-bit microprocessor with 275,000 transistors on each chip. At 4 million operations per second and 4 kilobytes of memory, the 80386 gave PCs as much speed and power as older mainframes and minicomputers.

The 386 chip brought with it the introduction of a 32-bit architecture, a significant improvement over the 16-bit architecture of previous microprocessors. It had two operating modes, one that mirrored the segmented memory of older x86 chips, allowing full backward compatibility, and one that took full advantage of its more advanced technology. The new chip made graphical operating environments for IBM PC and PC-compatible computers practical. The architecture that allowed Windows and IBM OS/2 has remained in subsequent chips.

IBM releases the first commercial RISC-based workstation

Reduced instruction set computers (RISC) grow out of the observation that the simplest 20 percent of a computer´s instruction set does 80 percent of the work. The IBM PC-RT had 1 MB of RAM, a 1.2-megabyte floppy disk drive, and a 40 MB hard drive. It performed 2 million instructions per second, but other RISC-based computers worked significantly faster.

The Connection Machine is unveiled

Connection Machine CM-1

Daniel Hillis of Thinking Machines Corporation moves artificial intelligence a step forward when he develops the controversial concept of massive parallelism in the Connection Machine CM-1. The machine used up to 65,536 one-bit processors and could complete several billion operations per second. Each processor had its own small memory linked with others through a flexible network that users altered by reprogramming rather than rewiring. The machine´s system of connections and switches let processors broadcast information and requests for help to other processors in a simulation of brain-like associative recall. Using this system, the machine could work faster than any other at the time on a problem that could be parceled out among the many processors.
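
As a loose modern analogy (this sketch is ours, not part of the timeline, and Python's multiprocessing looks nothing like the CM-1's data-parallel hardware or programming languages), the core idea of parceling one problem out among many processors and then combining the partial answers might look like this:

```python
# Illustrative sketch only: split one large job into pieces, let several
# worker processes handle the pieces in parallel, then combine the results.
from multiprocessing import Pool

def partial_sum(chunk):
    """Each worker independently processes its own slice of the data."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    pieces = [data[i::8] for i in range(8)]       # parcel the problem into 8 pieces
    with Pool(processes=8) as pool:
        partials = pool.map(partial_sum, pieces)  # pieces are processed in parallel
    print(sum(partials))                          # combine the partial answers
```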

Acorn Archimedes is released

Acorn Archimedes microcomputer

Acorn's ARM RISC microprocessor is first used in the company's Archimedes computer system. One of Britain's leading computer companies, Acorn continued the Archimedes line, which grew to nearly twenty different models, into the 1990s. Acorn spun off ARM as its own company to license microprocessor designs, which in turn has transformed mobile computing with ARM’s low power, high-performance processors and systems-on-chip (SoC).

IBM introduces its Personal System/2 (PS/2) machines

The PS/2 line is the first from IBM to include Intel's 80386 chip, and the company ships more than 1 million units by the end of the first year. IBM released a new operating system, OS/2, at the same time, allowing the use of a mouse with IBM PCs for the first time. Many credit the PS/2 for making the 3.5-inch floppy disk drive and video graphics array (VGA) standard for IBM computers. The system was IBM's response to losing control of the PC market with the rise of widespread copying of the original IBM PC design by “clone” makers.

Apple co-founder Steve Jobs unveils the NeXT Cube

Steve Jobs, forced out of Apple in 1985, founds a new company – NeXT. The computer he created, an all-black cube, was an important innovation. The NeXT had three Motorola microprocessors and 8 MB of RAM. Its base price was $6,500. Some of its other innovations were the inclusion of a magneto-optical (MO) disk drive, a digital signal processor and the NeXTSTEP programming environment (later released as OPENSTEP). This object-oriented multitasking operating system was groundbreaking in its ability to foster rapid development of software applications. OPENSTEP was used as one of the foundations for the new Mac OS operating system soon after NeXT was acquired by Apple in 1996.

Laser 128 is released

Laser 128 Apple II clone

VTech, founded in Hong Kong, had been a manufacturer of Pong-like games and educational toys when it introduced the Laser 128 computer. Instead of simply copying the basic input/output system (BIOS) of the Apple II as Franklin Computer had done, VTech reverse-engineered the system and sold the Laser 128 for US $479, a much lower price than the comparable Apple II. Apple sued to remove the Laser 128 from the market but was unsuccessful, and the Laser remained one of the very few Apple “clones” for sale.

Intel introduces the 80486 microprocessor

Intel 80486 promotional photo

Intel released the 80486 microprocessor and the i860 RISC/coprocessor chip, each of which contained more than 1 million transistors. The RISC microprocessor had a 32-bit integer arithmetic and logic unit (the part of the CPU that performs operations such as addition and subtraction), a 64-bit floating-point unit, and a clock rate of 33 MHz.

The 486 chips remained similar in structure to their predecessors, the 386 chips. What set the 486 apart was its optimized instruction set, with an on-chip unified instruction and data cache and an optional on-chip floating-point unit. Combined with an enhanced bus interface unit, the microprocessor doubled the performance of the 386 without increasing the clock rate.

Macintosh Portable is introduced

Macintosh Portable

Apple had initially included a handle in their Macintosh computers to encourage users to take their Macs on the go, but not until five years after the initial introduction did Apple introduce a true portable computer. The Macintosh Portable was heavy, weighing sixteen pounds, and expensive (US$6,500). Although the press praised its active matrix display, removable trackball, and high performance, sales were weaker than projected. The line was discontinued less than two years later.

Intel's Touchstone Delta supercomputer system comes online

Intel Touchstone Delta supercomputer

Reaching 32 gigaflops (32 billion floating point operations per second), Intel’s Touchstone Delta has 512 processors operating independently, arranged in a two-dimensional communications “mesh.” Caltech researchers used this supercomputer prototype for projects such as real-time processing of satellite images, and for simulating molecular models in AIDS research. It would serve as the model for several other significant multi-processor systems that would be among the fastest in the world.

Babbage's Difference Engine #2 is completed

The Difference Engine #2 at the Science Museum, London

Based on Charles Babbage's second design for a mechanical calculating engine, a team at the Science Museum in London sets out to prove that the design would have worked as planned. Led by curator Doron Swade, the team built Babbage’s machine in six years, using techniques that would have been available to Babbage at the time, proving that Babbage’s design was accurate and that it could have been built in his day.

PowerBook series of laptops is introduced

PowerBook 100 laptop computer

Apple's Macintosh Portable meets with little success in the marketplace and leads to a complete redesign of Apple's line of portable computers. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. The PowerBook 100 was the entry-level machine, while the PowerBook 140 was more powerful and had a larger memory. The PowerBook 170 was the high-end model, featuring an active matrix display, a faster processor, and a floating-point unit. The PowerBook line of computers was discontinued in 2006.

DEC announces Alpha chip architecture

DEC Alpha chip die-shot

Designed to replace the 32-bit VAX architecture, the Alpha is a 64-bit reduced instruction set computer (RISC) microprocessor. It was widely used in DEC's workstations and servers, as well as several supercomputers like the Chinese Sunway Blue Light system, and the Swiss Gigabooster. The Alpha processor designs were eventually acquired by Compaq, which, along with Intel, phased out the Alpha architecture in favor of the HP/Intel Itanium microprocessor.

Intel Paragon is operational

Intel Paragon system

Based on the Touchstone Delta computer Intel had built at Caltech, the Paragon is a parallel supercomputer that uses 2,048 (later increased to more than four thousand) Intel i860 processors. More than one hundred Paragons were installed over the lifetime of the system, each costing as much as five million dollars. The Paragon at Caltech was named the fastest supercomputer in the world in 1992. Paragon systems were used in many scientific areas, including atmospheric and oceanic flow studies, and energy research.

Apple ships the first Newton

The Apple Newton Personal Digital Assistant

Apple enters the handheld computer market with the Newton. Dubbed a “Personal Digital Assistant” by Apple President John Sculley in 1992, the Newton included many of the features that would define handheld computers in the following decades. The handwriting recognition software was much maligned for inaccuracy. The Newton line never performed as well as hoped and was discontinued in 1998.

Intel's Pentium microprocessor is released

HP Netserver LM, one of the first to use Intel's Pentium

The Pentium is the fifth generation of the ‘x86’ line of microprocessors from Intel, the basis for the IBM PC and its clones. The Pentium introduced several advances that made programs run faster such as the ability to execute several instructions at the same time and support for graphics and music.

RISC PC is released

Acorn RISC PC

Replacing their Archimedes computer, the RISC PC from UK's Acorn Computers uses the ARMv3 RISC microprocessor. Though it used a proprietary operating system, RISC OS, the RISC PC could run PC-compatible software using the Acorn PC Card. The RISC PC was used widely in UK broadcast television and in music production.

BeBox is released

BeBox computer

Be, founded by former Apple executive Jean-Louis Gassée and a number of former Apple, NeXT and Sun employees, releases their only product – the BeBox. Using dual PowerPC 603 CPUs, and featuring a large variety of peripheral ports, the first devices were used for software development. While it did not sell well, the operating system, Be OS, retained a loyal following even after Be stopped producing hardware in 1997 after less than 2,000 machines were produced.

IBM releases the ThinkPad 701C

IBM ThinkPad 701C

Officially known as the Track Write, the automatically expanding full-sized keyboard used by the ThinkPad 701 is designed by inventor John Karidis. The keyboard was composed of three roughly triangular interlocking pieces, which formed a full-sized keyboard when the laptop was opened -- resulting in a keyboard significantly wider than the case. This keyboard design was dubbed “the Butterfly.” The need for such a design was lessened as laptop screens grew wider.

Palm Pilot is introduced

Ed Colligan, Donna Dubinsky, and Jeff Hawkins

Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, originally created software for the Casio Zoomer personal data assistant. The first generation of Palm-produced devices, the Palm 1000 and 5000, are based around a Motorola microprocessor running at 16 MHz and use a special gestural input language called “Graffiti,” which is quick to learn and fast to use. The Palm could be connected to a PC or Mac using a serial port to synchronize – “sync” – both computer and Palm. The company called it a ‘connected organizer’ rather than a PDA to emphasize this ability.

Sony Vaio series is begun

Sony Vaio laptop

Sony had manufactured and sold computers in Japan, but the VAIO signals its entry into the global computer market. The first VAIO, a desktop computer, featured an additional 3D interface on top of the Windows 95 operating system as a way of attracting new users. The VAIO line of computers would be best known for its laptops, which were designed with communications and audio-video capabilities at the forefront, including innovative designs that incorporated TV and radio tuners, web cameras, and handwriting recognition. The line was discontinued in 2014.

ASCI Red is operational

ASCI Red supercomputers

The Accelerated Strategic Computing Initiative (ASCI) needed a supercomputer to help with the maintenance of the US nuclear arsenal following the ban on underground nuclear testing. The ASCI Red, based on the design of the Intel Paragon, was built by Intel and delivered to Sandia National Laboratories. Until the year 2000, it was the world's fastest supercomputer, able to achieve peak performance of 1.3 teraflops (about 1.3 trillion calculations per second).

Linux-based Supercomputing

Linux Supercomputer

The first supercomputer using the Linux operating system, consumer off-the-shelf parts, and a high-speed, low-latency interconnection network was developed by David A. Bader while at the University of New Mexico. From this successful prototype design, Bader led the development of “RoadRunner”, the first Linux supercomputer for open use by the national science and engineering community via the National Science Foundation's National Technology Grid. RoadRunner was put into production use in April 1999. Within a decade this design became the predominant architecture for all major supercomputers in the world.

The iMac, a range of all-in-one Macintosh desktop computers, is launched

iMac poster

Apple makes a splash with its Bondi Blue iMac, which sells for about $1,300. Customers got a machine with a 233-MHz G3 processor, 4GB hard drive, 32MB of RAM, a CD-ROM drive, and a 15" monitor. The machine was noted for its ease-of-use and included a 'manual' that contained only a few pictures and less than 20 words. As Apple’s first new product under the leadership of a returning Steve Jobs, many consider this the most significant step in Apple's return from near-bankruptcy in the middle 1990s.

First camera phone introduced

Sharp-built J-Phone J-SH04

Japan's SoftBank introduces the first camera phone, the J-Phone J-SH04, a Sharp-manufactured digital phone with an integrated camera. The camera had a maximum resolution of 0.11 megapixels, the display supported 256 colors, and photos could be shared wirelessly. The J-Phone line would quickly expand, releasing a flip-phone version just a month later. Cameras would become a significant part of most phones within a year, and several countries have even passed laws regulating their use.

Earth Simulator is world's fastest supercomputer

Earth Simulator Supercomputer

Developed by the Japanese government to create global climate models, the Earth Simulator is a massively parallel, vector-based system that costs nearly 60 billion yen (roughly $600 million at the time). A consortium of aerospace, energy, and marine science agencies undertook the project, and the system was built by NEC around their SX-6 architecture. To protect it from earthquakes, the building housing it was built using a seismic isolation system that used rubber supports. The Earth Simulator was listed as the fastest supercomputer in the world from 2002 to 2004.

Handspring Treo is released

Colligan, Dubinsky, Hawkins (left to right)

Leaving Palm Inc., Ed Colligan, Donna Dubinsky, and Jeff Hawkins found Handspring. After retiring their initial Visor series of PDAs, Handspring introduced the Treo line of smartphones, designed with built-in keyboards, cameras, and the Palm operating system. The Treo sold well, and the line continued until Handspring was purchased by Palm in 2003.

PowerMac G5 is released

PowerMac G5 tower computer

With a distinctive anodized aluminum case, and hailed as the first true 64-bit personal computer, the Apple G5 is the most powerful Macintosh ever released to that point. While larger than the previous G4 towers, the G5 had comparatively limited space for expansion. Virginia Tech used more than a thousand PowerMac G5s to create the System X cluster supercomputer, rated #3 in November of that year on the world’s TOP500 fastest computers.

Arduino is introduced

Arduino starter kit

Harkening back to the hobbyist era of personal computing in the 1970s, Arduino begins as a project of the Interaction Design Institute, Ivrea, Italy. Each credit card-sized Arduino board consisted of an inexpensive microcontroller and signal connectors which made Arduinos ideal for use in any application connecting to or monitoring the outside world. The Arduino used a Java-based integrated development environment and users could access a library of programs, called “Wiring,” that allowed for simplified programming. Arduino soon became the main computer platform of the worldwide “Maker” movement.

Lenovo acquires IBM's PC business

IBM and Lenovo logos

Nearly a quarter century after IBM launched their PC in 1981, they had become merely another player in a crowded marketplace. Lenovo, China's largest manufacturer of PCs, purchased IBM's personal computer business in 2005, largely to gain access to IBM's ThinkPad line of computers and sales force. Lenovo became the largest manufacturer of PCs in the world with the acquisition, later also acquiring IBM's server line of computers.

NASA Ames Research Center supercomputer Columbia

Columbia Supercomputer system made up of SGI Altix

Named in honor of the space shuttle which broke up on re-entry, the Columbia supercomputer is an important part of NASA's return to manned spaceflight after the 2003 disaster. Columbia was used in space vehicle analysis, including studying the Columbia disaster, but also in astrophysics, weather and ocean modeling. At its introduction, it was listed as the second fastest supercomputer in the world and this single system increased NASA's supercomputing capacity 10-fold. The system was kept at NASA Ames Research Center until 2013, when it was removed to make way for two new supercomputers.

One Laptop Per Child initiative begins

OLPC XO laptop computer

At the 2006 World Economic Forum in Davos, Switzerland, the United Nations Development Program (UNDP) announces it will create a program to deliver technology and resources to targeted schools in the least developed countries. The project became the One Laptop per Child Consortium (OLPC) founded by Nicholas Negroponte, the founder of MIT's Media Lab. The first offering to the public required the buyer to purchase one to be given to a child in the developing world as a condition of acquiring a machine for themselves. By 2011, over 2.4 million laptops had been shipped.

The Amazon Kindle is released

Amazon Kindle

Many companies have attempted to release electronic reading systems dating back to the early 1990s. Online retailer Amazon released the Kindle, one of the first to gain a large following among consumers. The first Kindle featured wireless access to content via Amazon.com, along with an SD card slot allowing increased storage. The first release proved so popular there was a long delay in delivering systems on release. Follow-on versions of the Kindle added further audio-video capabilities.

The Apple iPhone is released

Apple iPhone

Apple launches the iPhone - a combination of web browser, music player and cell phone - which could download new functionality in the form of "apps" (applications) from the online Apple store. The touchscreen-enabled smartphone also had built-in GPS navigation, a high-definition camera, texting, calendar, voice dictation, and weather reports.

The MacBook Air is released

Steve Jobs introducing MacBook Air

Apple introduces its first ultraportable notebook – a light, thin laptop with a high-capacity battery. The Air incorporated many of the technologies that had been associated with Apple's MacBook line of laptops, including an integrated camera and Wi-Fi capabilities. To reduce its size, the traditional hard drive was replaced with a solid-state disk, making the Air the first mass-market computer to do so.

IBM's Roadrunner supercomputer is completed

Computer-enhanced image of IBM’s Roadrunner

The Roadrunner is the first computer to reach a sustained performance of 1 petaflop (one thousand trillion floating point operations per second). It used two different microprocessors: an IBM POWER XCell L8i and AMD Opteron. It was used to model the decay of the US nuclear arsenal, analyze financial data, and render 3D medical images in real-time. An offshoot of the POWER XCell8i chip was used as the main processor in the Sony PlayStation 3 game console.

Jaguar Supercomputer at Oak Ridge upgraded

Originally a Cray XT3 system, the Jaguar is a massively parallel supercomputer at Oak Ridge National Laboratory, a US science and energy research facility. The system cost more than $100 million to create and ran a variation of the Linux operating system with up to 10 petabytes of storage. The Jaguar was used to study climate science, seismology, and astrophysics applications. It was the fastest computer in the world from November 2009 to June 2010.

Apple Retina Display

Introduction of the iPhone 4 with retina display

Since the release of the Macintosh in 1984, Apple has placed emphasis on high-resolution graphics and display technologies. In 2012, Apple introduced the Retina display for the MacBook Pro laptop and iPad tablet. With a pixel density of up to 400 pixels per inch (PPI), Retina displays approached the limit of pixel visibility to the human eye. The display also used In-Plane Switching (IPS) technology, which allowed for a wider viewing angle and improved color accuracy. The Retina display became standard on most of the iPad, iPhone, MacBook, and Apple Watch product lines.

China's Tianhe supercomputers are operational

Tianhe-1A Supercomputer

With a peak speed of over a petaflop (one thousand trillion calculations per second), the Tianhe 1 (translation: Milky Way 1) is developed by the Chinese National University of Defense Technology using Intel Xeon processors combined with AMD graphic processing units (GPUs). The upgraded and faster Tianhe-1A used Intel Xeon CPUs as well, but switched to nVidia's Tesla GPUs and added more than 2,000 Fei-Tang (SPARC-based) processors. The machines were used by the Chinese Academy of Sciences to run massive solar energy simulations, as well as some of the most complex molecular studies ever undertaken.

The Apple iPad is released

Steve Jobs introducing the iPad

The iPad combines many of the popular capabilities of the iPhone, such as built-in high-definition camera, access to the iTunes Store, and audio-video capabilities, but with a nine-inch screen and without the phone. Apps, games, and accessories helped spur the popularity of the iPad and led to its adoption in thousands of different applications from movie making, creating art, making music, inventory control and point-of-sale systems, to name but a few.

IBM Sequoia is delivered to Lawrence Livermore Labs

Built by IBM using their Blue Gene/Q supercomputer architecture, the Sequoia system is the world's fastest supercomputer in 2012. Despite using 98,304 PowerPC chips, Sequoia's relatively low power usage made it unusually efficient. Scientific and defense applications included studies of human electrophysiology, nuclear weapon simulation, human genome mapping, and global climate change.

Nest Learning Thermostat is Introduced

Nest Learning Thermostat

The Nest Learning Thermostat is an early product made for the emerging “Internet of Things,” which envisages a world in which common everyday devices have network connectivity and can exchange information or be controlled. The Nest allowed for remote access to a user’s home’s thermostat by using a smartphone or tablet and could also send monthly power consumption reports to help save on energy bills. The Nest would remember what temperature users preferred by ‘training’ itself to monitor daily use patterns for a few days then adopting that pattern as its new way of controlling home temperature.

Raspberry Pi, a credit-card-size single board computer, is released as a tool to promote science education

Raspberry Pi computer

Conceived in the UK by the Raspberry Pi Foundation, this credit card-sized computer features ease of use and simplicity, making it highly popular with students and hobbyists. In October 2013, the one millionth Raspberry Pi was shipped. Only one month later, another one million Raspberry Pis were delivered. The Pi weighed only 45 grams and initially sold for just US $25 to $35.

University of Michigan Micro Mote is Completed

The University of Michigan Micro Mote (M3) is the smallest computer in the world at the time of its completion. Three types of the M3 were available – two types that measured either temperature or pressure and one that could take images. The motes were powered by a tiny battery and could gain light energy through a photocell, which was enough to feed the infinitesimally small amount of energy a mote consumes (1 picowatt). Motes are also known as “smart dust,” since the intention is that their tiny size and low cost make them inexpensive enough to “sprinkle” across the real world as sensors. An ecologist, for example, could sprinkle thousands of motes from the air onto a field and measure soil and air temperature, moisture, and sunlight, giving them accurate real-time data about the environment.

Apple Watch

Apple Store’s display of newly introduced Apple Watches

Building a computer into the watch form factor has been attempted many times, but the release of the Apple Watch leads to a new level of excitement. Incorporating a version of Apple's iOS operating system, as well as sensors for environmental and health monitoring, the Apple Watch was designed to be incorporated into the Apple environment with compatibility with iPhones and MacBooks. Almost a million units were ordered on the day of release. The Watch was received with great enthusiasm, but critics took issue with the somewhat limited battery life and high price.


A model of a Babbage-style Difference Engine at the Computer History Museum. Photo by Cory Doctorow.

A brief history of computers

by Chris Woodford. Last updated: January 19, 2023.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same.

Read on to learn more about the history of computers—or take a look at our article on how computers work.

Photo: A model of one of the world's first computers (the Difference Engine invented by Charles Babbage) at the Computer History Museum in Mountain View, California, USA. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Cogs and Calculators

It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French scientist and philosopher Blaise Pascal (1623–1662) invented the first practical mechanical calculator, the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs (gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for 300 years. The Leibniz machine could do much more than Pascal's: as well as adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature was the first memory store or "register."

Apart from developing one of the world's earliest mechanical calculators, Leibniz is remembered for another important contribution to computing: he was the man who invented binary code, a way of representing any decimal number using only the two digits zero and one. Although Leibniz made no use of binary in his own calculator, it set others thinking. In 1854, a little over a century after Leibniz had died, Englishman George Boole (1815–1864) used the idea to invent a new branch of mathematics called Boolean algebra. [1] In modern computers, binary code and Boolean algebra allow computers to make simple decisions by comparing long strings of zeros and ones. But, in the 19th century, these ideas were still far ahead of their time. It would take another 50–100 years for mathematicians and computer scientists to figure out how to use them (find out more in our articles about calculators and logic gates ).
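
To make those two ideas concrete, here is a minimal illustrative sketch in Python (ours, not part of the original article): first a decimal number written out in binary, then Boolean operations combining values that are only ever 0 or 1 to reach a simple decision.

```python
# Binary representation: any decimal number can be written with just 0s and 1s.
def to_binary(n: int) -> str:
    """Return the base-2 digits of a non-negative integer."""
    return bin(n)[2:]           # bin(13) gives '0b1101'; strip the '0b' prefix

print(to_binary(13))            # -> '1101'
print(to_binary(999))           # -> '1111100111'

# Boolean algebra: AND, OR and NOT acting on single binary digits.
a, b = 1, 0
print(a & b)                    # AND -> 0
print(a | b)                    # OR  -> 1
print(int(not a))               # NOT -> 0

# The kind of "simple decision" Boole's algebra makes possible:
door_locked, alarm_armed = True, False
if door_locked and not alarm_armed:
    print("Door is locked but the alarm is not armed.")
```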

Artwork: Pascaline: Two details of Blaise Pascal's 17th-century calculator. Left: The "user interface": the part where you dial in numbers you want to calculate. Right: The internal gear mechanism. Picture courtesy of US Library of Congress.

Engines of Calculation

Neither the abacus, nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers. A calculator is a device that makes it quicker and easier for people to do sums—but it needs a human operator. A computer, on the other hand, is a machine that can operate automatically, without any human help, by following a series of stored instructions called a program (a kind of mathematical recipe). Calculators evolved into computers when people devised ways of making entirely automatic, programmable calculators.
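
The distinction can be sketched in a few lines of code (our illustration, not from the article): the "program" below is simply data held in memory, a list of instructions that the machine works through by itself, with no human deciding each step.

```python
# A toy stored-program machine (illustrative only). The program is itself data:
# a list of instructions that the machine executes one after another.
program = [
    ("LOAD", 10),    # put 10 in the accumulator
    ("ADD", 32),     # add 32
    ("SUB", 2),      # subtract 2
    ("PRINT", None), # output the result
]

def run(program):
    accumulator = 0
    for op, arg in program:      # the machine follows the recipe automatically
        if op == "LOAD":
            accumulator = arg
        elif op == "ADD":
            accumulator += arg
        elif op == "SUB":
            accumulator -= arg
        elif op == "PRINT":
            print(accumulator)
    return accumulator

run(program)   # prints 40 with no human working the individual steps
```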

Photo: Punched cards: Herman Hollerith perfected the way of using punched cards and paper tape to store information and feed it into a machine. Here's a drawing from his 1889 patent Art of Compiling Statistics (US Patent#395,782), showing how a strip of paper (yellow) is punched with different patterns of holes (orange) that correspond to statistics gathered about people in the US census. Picture courtesy of US Patent and Trademark Office.

The first person to attempt this was a rather obsessive, notoriously grumpy English mathematician named Charles Babbage (1791–1871). Many regard Babbage as the "father of the computer" because his machines had an input (a way of feeding in numbers), a memory (something to store these numbers while complex calculations were taking place), a processor (the number-cruncher that carried out the calculations), and an output (a printing mechanism)—the same basic components shared by all modern computers. During his lifetime, Babbage never completed a single one of the hugely ambitious machines that he tried to build. That was no surprise. Each of his programmable "engines" was designed to use tens of thousands of precision-made gears. It was like a pocket watch scaled up to the size of a steam engine , a Pascal or Leibniz machine magnified a thousand-fold in dimensions, ambition, and complexity. For a time, the British government financed Babbage—to the tune of £17,000, then an enormous sum. But when Babbage pressed the government for more money to build an even more advanced machine, they lost patience and pulled out. Babbage was more fortunate in receiving help from Augusta Ada Byron (1815–1852), Countess of Lovelace, daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to refine Babbage's ideas for making his machine programmable—and this is why she is still, sometimes, referred to as the world's first computer programmer. [2] Little of Babbage's work survived after his death. But when, by chance, his notebooks were rediscovered in the 1930s, computer scientists finally appreciated the brilliance of his ideas. Unfortunately, by then, most of these ideas had already been reinvented by others.

Artwork: Charles Babbage (1791–1871). Picture from The Illustrated London News, 1871, courtesy of US Library of Congress.

Babbage had intended that his machine would take the drudgery out of repetitive calculations. Originally, he imagined it would be used by the army to compile the tables that helped their gunners to fire cannons more accurately. Toward the end of the 19th century, other inventors were more successful in their effort to construct "engines" of calculation. American statistician Herman Hollerith (1860–1929) built one of the world's first practical calculating machines, which he called a tabulator, to help compile census data. Then, as now, a census was taken each decade but, by the 1880s, the population of the United States had grown so much through immigration that a full-scale analysis of the data by hand was taking seven and a half years. The statisticians soon figured out that, if trends continued, they would run out of time to compile one census before the next one fell due. Fortunately, Hollerith's tabulator was an amazing success: it tallied the entire census in only six weeks and completed the full analysis in just two and a half years. Soon afterward, Hollerith realized his machine had other applications, so he set up the Tabulating Machine Company in 1896 to manufacture it commercially. A few years later, it changed its name to the Computing-Tabulating-Recording (C-T-R) company and then, in 1924, acquired its present name: International Business Machines (IBM).

Photo: Keeping count: Herman Hollerith's late-19th-century census machine (blue, left) could process 12 separate bits of statistical data each minute. Its compact 1940 replacement (red, right), invented by Eugene M. La Boiteaux of the Census Bureau, could work almost five times faster. Photo by Harris & Ewing courtesy of US Library of Congress.

Bush and the bomb

Photo: Dr Vannevar Bush (1890–1974). Picture by Harris & Ewing, courtesy of US Library of Congress.

The history of computing remembers colorful characters like Babbage, but others who played important—if supporting—roles are less well known. At the time when C-T-R was becoming IBM, the world's most powerful calculators were being developed by US government scientist Vannevar Bush (1890–1974). In 1925, Bush made the first of a series of unwieldy contraptions with equally cumbersome names: the New Recording Product Integraph Multiplier. Later, he built a machine called the Differential Analyzer, which used gears, belts, levers, and shafts to represent numbers and carry out calculations in a very physical way, like a gigantic mechanical slide rule. Bush's ultimate calculator was an improved machine named the Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200 miles) of wire and 150 electric motors . Machines like these were known as analog calculators—analog because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather than as digits. Although they could carry out incredibly complex calculations, it took several days of wheel cranking and belt turning before the results finally emerged.

Impressive machines like the Differential Analyzer were only one of several outstanding contributions Bush made to 20th-century technology. Another came as the teacher of Claude Shannon (1916–2001), a brilliant mathematician who figured out how electrical circuits could be linked together to process binary code with Boolean algebra (a way of comparing binary numbers using logic) and thus make simple decisions. During World War II, President Franklin D. Roosevelt appointed Bush chairman first of the US National Defense Research Committee and then director of the Office of Scientific Research and Development (OSRD). In this capacity, he was in charge of the Manhattan Project, the secret $2-billion initiative that led to the creation of the atomic bomb. One of Bush's final wartime contributions was to sketch out, in 1945, an idea for a memory-storing and sharing device called Memex that would later inspire Tim Berners-Lee to invent the World Wide Web . [3] Few outside the world of computing remember Vannevar Bush today—but what a legacy! As a father of the digital computer, an overseer of the atom bomb, and an inspiration for the Web, Bush played a pivotal role in three of the 20th-century's most far-reaching technologies.

Photo: "A gigantic mechanical slide rule": A differential analyzer pictured in 1938. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

Turing—tested

The first modern computers.

The World War II years were a crucial period in the history of computing, when powerful gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary computer, in his parents' living room. [4] The following year, American physicist John Atanasoff (1903–1995) and his assistant, electrical engineer Clifford Berry (1918–1963), built a more elaborate binary machine that they named the Atanasoff Berry Computer (ABC). It was a great advance—1000 times more accurate than Bush's Differential Analyzer. These were the first machines that used electrical switches to store numbers: when a switch was "off", it stored the number zero; flipped over to its other, "on", position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits (although binary is much less efficient in this respect than decimal, since it takes up to ten binary digits to store a three-digit decimal number). These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.
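
As a rough illustration (our sketch, not from the article), a bank of on/off switches can be modelled as a list of booleans and the number it stores read off by weighting each switch by a power of two; the example also shows why a three-digit decimal number such as 999 needs ten binary switches.

```python
# Model a row of electrical switches: True = "on" = 1, False = "off" = 0.
switches = [True, True, True, True, True, False, False, True, True, True]

def value_of(switches):
    """Read the number stored in the switches, most significant switch first."""
    total = 0
    for s in switches:
        total = total * 2 + (1 if s else 0)
    return total

print(value_of(switches))   # -> 999: ten switches hold a three-digit decimal number
print(len(bin(999)) - 2)    # -> 10 binary digits, versus 3 decimal digits
```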

The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built by mathematician Howard Aiken (1900–1973). Sponsored by IBM, it was variously known as the Harvard Mark I or the IBM Automatic Sequence Controlled Calculator (ASCC). A giant of a machine, stretching 15m (50ft) in length, it was like a huge mechanical calculator built into a wall. It must have sounded impressive, because it stored and processed numbers using "clickety-clack" electromagnetic relays (electrically operated magnets that automatically switched lines in telephone exchanges)—no fewer than 3304 of them. Impressive they may have been, but relays suffered from several problems: they were large (that's why the Harvard Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and they were slow (it took time for a relay to flip from "off" to "on" or from 0 to 1).

Photo: An analog computer being used in military research in 1949. Picture courtesy of NASA on the Commons (where you can download a larger version).

Most of the machines developed around this time were intended for military purposes. Like Babbage's never-built mechanical engines, they were designed to calculate artillery firing tables and chew through the other complex chores that were then the lot of military mathematicians. During World War II, the military co-opted thousands of the best scientific minds: recognizing that science would win the war, Vannevar Bush's Office of Scientific Research and Development employed 10,000 scientists from the United States alone. Things were very different in Germany. When Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need—and turned him down.

On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians based at Bletchley Park near London, England (including Alan Turing) built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve). Vacuum tubes, each one about as big as a person's thumb (earlier ones were very much bigger) and glowing red hot like a tiny electric light bulb, had been invented in 1906 by Lee de Forest (1873–1961), who named his device the Audion. This breakthrough earned de Forest his nickname as "the father of radio" because their first major use was in radio receivers, where they amplified weak incoming signals so people could hear them more clearly. [5] In computers such as the ABC and Colossus, vacuum tubes found an alternative use as faster and more compact switches.

Just like the codes it was trying to crack, Colossus was top-secret and its existence wasn't confirmed until after the war ended. As far as most people were concerned, vacuum tubes were pioneered by a more visible computer that appeared in 1946: the Electronic Numerical Integrator And Calculator (ENIAC). The ENIAC's inventors, two scientists from the University of Pennsylvania, John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), were originally inspired by Bush's Differential Analyzer; years later Eckert recalled that ENIAC was the "descendant of Dr Bush's machine." But the machine they constructed was far more ambitious. It contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer. Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.

Photo: Sir Maurice Wilkes (left), his collaborator William Renwick, and the early EDSAC-1 electronic computer they built in Cambridge, pictured around 1947/8. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

ENIAC was just the beginning. Its two inventors formed the Eckert Mauchly Computer Corporation in the late 1940s. Working with a brilliant Hungarian mathematician, John von Neumann (1903–1957), who was based at Princeton University, they then designed a better machine called EDVAC (Electronic Discrete Variable Automatic Computer). In a key piece of work, von Neumann helped to define how the machine stored and processed its programs, laying the foundations for how all modern computers operate. [6] After EDVAC, Eckert and Mauchly developed UNIVAC 1 (UNIVersal Automatic Computer) in 1951. They were helped in this task by a young, largely unknown American mathematician and Naval reserve named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I. Like Herman Hollerith's tabulator over 50 years before, UNIVAC 1 was used for processing data from the US census. It was then manufactured for other users—and became the world's first large-scale commercial computer.

Machines like Colossus, the ENIAC, and the Harvard Mark I compete for significance and recognition in the minds of computer historians. Which one was truly the first great modern computer? All of them and none: these—and several other important machines—evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. Among those other machines were pioneering computers put together by English academics, notably the Manchester/Ferranti Mark I, built at Manchester University by Frederic Williams (1911–1977) and Thomas Kilburn (1921–2001), and the EDSAC (Electronic Delay Storage Automatic Calculator), built by Maurice Wilkes (1913–2010) at Cambridge University. [7]

Photo: Control panel of the UNIVAC 1, the world's first large-scale commercial computer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

The microelectronic revolution

Vacuum tubes were a considerable advance on relay switches, but machines like the ENIAC were notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug." Popular legend has it that this word entered the vocabulary of computer programmers in the 1940s when moths, attracted by the glowing lights of vacuum tubes, flew inside machines like the ENIAC, caused a short circuit, and brought work to a juddering halt. But there were other problems with vacuum tubes too. They consumed enormous amounts of power: the ENIAC used about 2000 times as much electricity as a modern laptop. And they took up huge amounts of space. Military needs were driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now become a real problem. ABC had used 300 vacuum tubes, Colossus had 2000, and the ENIAC had 18,000. The ENIAC's designers had boasted that its calculating speed was "at least 500 times as great as that of any other existing computing machine." But developing computers that were an order of magnitude more powerful still would have needed hundreds of thousands or even millions of vacuum tubes—which would have been far too costly, unwieldy, and unreliable. So a new technology was urgently required.

The solution appeared in 1947 thanks to three physicists working at Bell Telephone Laboratories (Bell Labs). John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so the electrical signals that carried phone calls could be amplified more easily and carried further. Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube. When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor. Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since.

Like vacuum tubes, transistors could be used as amplifiers or as switches. But they had several major advantages. They were a fraction of the size of vacuum tubes (typically about as big as a pea), consumed far less power, and were much more reliable. The transistor was one of the most important breakthroughs in the history of computing and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics. By that time, however, the three men had already gone their separate ways. John Bardeen had begun pioneering research into superconductivity, which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs.

William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further. His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (1929–2023). It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1957, eight of them—including Noyce and Moore—left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road. Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since. [8]

It was in Fairchild's California building that the next breakthrough occurred—although, somewhat curiously, it also happened at exactly the same time in the Dallas laboratories of Texas Instruments. In Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand wired to connect all these components together. That process was laborious, costly, and error prone. Wouldn't it be better, Kilby reflected, if many transistors could be made in a single package? This prompted him to invent the "monolithic" integrated circuit (IC) , a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor. Kilby's invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea at Fairchild in California. Noyce went one better, however: he found a way to include the connections between components in an integrated circuit, thus automating the entire process.

Photo: An integrated circuit from the 1980s. This is an EPROM chip (effectively a forerunner of flash memory , which you could only erase with a blast of ultraviolet light).

Mainframes, minis, and micros

Photo: An IBM 704 mainframe pictured at NASA in 1958. Designed by Gene Amdahl, this scientific number cruncher was the successor to the 701 and helped pave the way to arguably the most important IBM computer of all time, the System/360, which Amdahl also designed. Photo courtesy of NASA .

Photo: The control panel of DEC's classic 1965 PDP-8 minicomputer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Integrated circuits, as much as transistors, helped to shrink computers during the 1960s. In 1943, IBM boss Thomas Watson had reputedly quipped: "I think there is a world market for about five computers." Just two decades later, the company and its competitors had installed around 25,000 large computer systems across the United States. As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then very large-scale integration (VLSI), in which the same chip could contain thousands of components.

The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. In 1968, Robert Noyce and Gordon Moore had left Fairchild to establish a new company of their own. With integration very much in their minds, they called it Integrated Electronics, or Intel for short. Originally they had planned to make memory chips, but when the company landed an order to make chips for a range of pocket calculators, history headed in a different direction. A couple of their engineers, Federico Faggin (1941–) and Marcian Edward (Ted) Hoff (1937–), realized that instead of making a range of specialist chips for a range of calculators, they could make a universal chip that could be programmed to work in them all. Thus was born the general-purpose, single-chip computer, or microprocessor—and that brought about the next phase of the computer revolution.

Personal computers

By 1974, Intel had launched a popular microprocessor known as the 8080 and computer hobbyists were soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts. With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian electronics wizard named Steve Wozniak (1950–) to develop a computer of his own. "Woz" is often described as the hacker's "hacker"—a technically brilliant and highly creative engineer who pushed the boundaries of computing largely for his own amusement. In the mid-1970s, he was working at Hewlett-Packard in California, and spending his free time tinkering away as a member of the Homebrew Computer Club in the Bay Area.

After seeing the Altair, Woz used a 6502 microprocessor (made by an Intel rival, MOS Technology) to build a better home computer of his own: the Apple I. When he showed off his machine to his colleagues at the club, they all wanted one too. One of his friends, Steve Jobs (1955–2011), persuaded Woz that they should go into business making the machine. Woz agreed so, famously, they set up Apple Computer Corporation in a garage belonging to Jobs' parents. After selling 175 of the Apple I for the devilish price of $666.66, Woz built a much better machine called the Apple ][ (pronounced "Apple Two"). While the Altair 8800 looked like something out of a science lab, and the Apple I was little more than a bare circuit board, the Apple ][ took its inspiration from such things as Sony televisions and stereos: it had a neat and friendly looking cream plastic case. Launched in April 1977, it was the world's first easy-to-use home "microcomputer." Soon home users, schools, and small businesses were buying the machine in their tens of thousands—at $1,298 a time. Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of the machines, quickly accelerating out of Jobs' garage to become one of the world's biggest companies. Dozens of other microcomputers were launched around this time, including the TRS-80 from Radio Shack (Tandy in the UK) and the Commodore PET. [9]

Apple's success selling to businesses came as a great shock to IBM and the other big companies that dominated the computer industry. It didn't take a VisiCalc spreadsheet to figure out that, if the trend continued, upstarts like Apple would undermine IBM's immensely lucrative business market selling "Big Blue" computers. In 1980, IBM finally realized it had to do something and launched a highly streamlined project to save its business. One year later, it released the IBM Personal Computer (PC), based on an Intel 8088 microprocessor, which rapidly reversed the company's fortunes and stole the market back from Apple.

The PC was successful essentially for one reason. All the dozens of microcomputers that had been launched in the 1970s—including the Apple ][—were incompatible. All used different hardware and worked in different ways. Most were programmed using a simple, English-like language called BASIC, but each one used its own flavor of BASIC, which was tied closely to the machine's hardware design. As a result, programs written for one machine would generally not run on another one without a great deal of conversion. Companies that wrote software professionally typically wrote it just for one machine and, consequently, there was no software industry to speak of.

In 1976, Gary Kildall (1942–1994), a teacher and computer scientist, had figured out a solution to this problem. Kildall wrote an operating system (a computer's fundamental control software) called CP/M that acted as an intermediary between the user's programs and the machine's hardware. With a stroke of genius, Kildall realized that all he had to do was rewrite CP/M so it worked on each different machine. Then all those machines could run identical user programs—without any modification at all—inside CP/M. That would make all the different microcomputers compatible at a stroke. By the early 1980s, Kildall had become a multimillionaire through the success of his invention: the first personal computer operating system. Naturally, when IBM was developing its personal computer, it approached him hoping to put CP/M on its own machine. Legend has it that Kildall was out flying his personal plane when IBM called, so missed out on one of the world's greatest deals. But the truth seems to have been that IBM wanted to buy CP/M outright for just $200,000, while Kildall recognized his product was worth millions more and refused to sell. Instead, IBM turned to a young programmer named Bill Gates (1955–). His then tiny company, Microsoft, rapidly put together an operating system called DOS, based on a product called QDOS (Quick and Dirty Operating System), which they acquired from Seattle Computer Products. Some believe Microsoft and IBM cheated Kildall out of his place in computer history; Kildall himself accused them of copying his ideas. Others think Gates was simply the shrewder businessman. Either way, the IBM PC, powered by Microsoft's operating system, was a runaway success.
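
The portability trick at the heart of CP/M is easier to see in miniature: applications call one common interface, and only the thin machine-specific layer underneath is rewritten for each computer. The sketch below illustrates the idea in Python; the class and method names are invented for this example and are not CP/M's real BIOS calls.

```python
# The idea behind CP/M-style portability: programs talk to a common OS
# interface, and only the machine-specific layer changes per computer.
# All names here are invented for illustration (not CP/M's real API).

class MachineA:
    def write_char(self, c):
        print(c, end="")           # pretend this drives machine A's screen

class MachineB:
    def write_char(self, c):
        print(c.upper(), end="")   # machine B's hardware behaves differently

class SimpleOS:
    """One operating-system interface, many hardware back ends."""
    def __init__(self, hardware):
        self.hw = hardware

    def print_string(self, text):
        for c in text:
            self.hw.write_char(c)

def application(os):
    # The application only ever calls the OS, so the same code runs
    # unmodified on any machine the OS has been adapted to.
    os.print_string("hello\n")

application(SimpleOS(MachineA()))   # prints: hello
application(SimpleOS(MachineB()))   # prints: HELLO
```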

Yet IBM's victory was short-lived. Cannily, Bill Gates had sold IBM the rights to one flavor of DOS (PC-DOS) and retained the rights to a very similar version (MS-DOS) for his own use. When other computer manufacturers, notably Compaq and Dell, started making IBM-compatible (or "cloned") hardware, they too came to Gates for the software. IBM charged a premium for machines that carried its badge, but consumers soon realized that PCs were commodities: they contained almost identical components—an Intel microprocessor, for example—no matter whose name they had on the case. As IBM lost market share, the ultimate victors were Microsoft and Intel, who were soon supplying the software and hardware for almost every PC on the planet. Apple, IBM, and Kildall made a great deal of money—but all failed to capitalize decisively on their early success. [10]

Photo: Personal computers threatened companies making large "mainframes" like this one. Picture courtesy of NASA on the Commons (where you can download a larger version).

The user revolution

Fortunately for Apple, it had another great idea. One of the Apple II's strongest suits was its sheer "user-friendliness." For Steve Jobs, developing truly easy-to-use computers became a personal mission in the early 1980s. What truly inspired him was a visit to PARC (Palo Alto Research Center), a cutting-edge computer laboratory then run as a division of the Xerox Corporation. Xerox had started developing computers in the early 1970s, believing they would make paper (and the highly lucrative photocopiers Xerox made) obsolete. One of PARC's research projects was an advanced $40,000 computer called the Xerox Alto. Unlike most microcomputers launched in the 1970s, which were programmed by typing in text commands, the Alto had a desktop-like screen with little picture icons that could be moved around with a mouse: it was the very first graphical user interface (GUI, pronounced "gooey")—an idea conceived by Alan Kay (1940–) and now used in virtually every modern computer. The Alto borrowed some of its ideas, including the mouse , from 1960s computer pioneer Douglas Engelbart (1925–2013).

Photo: During the 1980s, computers started to converge on the same basic "look and feel," largely inspired by the work of pioneers like Alan Kay and Douglas Engelbart. Photographs in the Carol M. Highsmith Archive, courtesy of US Library of Congress , Prints and Photographs Division.

Back at Apple, Jobs launched his own version of the Alto project to develop an easy-to-use computer called PITS (Person In The Street). This machine became the Apple Lisa, launched in January 1983—the first widely available computer with a GUI desktop. With a retail price of $10,000, over three times the cost of an IBM PC, the Lisa was a commercial flop. But it paved the way for a better, cheaper machine called the Macintosh that Jobs unveiled a year later, in January 1984. With its memorable launch ad for the Macintosh inspired by George Orwell's novel 1984 , and directed by Ridley Scott (director of the dystopic movie Blade Runner ), Apple took a swipe at IBM's monopoly, criticizing what it portrayed as the firm's domineering—even totalitarian—approach: Big Blue was really Big Brother. Apple's ad promised a very different vision: "On January 24, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984'." The Macintosh was a critical success and helped to invent the new field of desktop publishing in the mid-1980s, yet it never came close to challenging IBM's position.

Ironically, Jobs' easy-to-use machine also helped Microsoft to dislodge IBM as the world's leading force in computing. When Bill Gates saw how the Macintosh worked, with its easy-to-use picture-icon desktop, he launched Windows, an upgraded version of his MS-DOS software. Apple saw this as blatant plagiarism and filed a $5.5 billion copyright lawsuit in 1988. Four years later, the case collapsed with Microsoft effectively securing the right to use the Macintosh "look and feel" in all present and future versions of Windows. Microsoft's Windows 95 system, launched three years later, had an easy-to-use, Macintosh-like desktop and MS-DOS running behind the scenes.

Photo: The IBM Blue Gene/P supercomputer at Argonne National Laboratory: one of the world's most powerful computers. Picture courtesy of Argonne National Laboratory published on Wikimedia Commons in 2009 under a Creative Commons Licence .

From nets to the Internet

Standardized PCs running standardized software brought a big benefit for businesses: computers could be linked together into networks to share information. At Xerox PARC in 1973, electrical engineer Bob Metcalfe (1946–) developed a new way of linking computers "through the ether" (empty space) that he called Ethernet. A few years later, Metcalfe left Xerox to form his own company, 3Com, to help companies realize "Metcalfe's Law": computers become useful the more closely connected they are to other people's computers. As more and more companies explored the power of local area networks (LANs), so, as the 1980s progressed, it became clear that there were great benefits to be gained by connecting computers over even greater distances—into so-called wide area networks (WANs).
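
Metcalfe's Law is often stated more precisely: the value of a network grows roughly with the square of the number of machines (or people) connected to it, because n participants can form n(n-1)/2 distinct links. A quick, purely illustrative check:

```python
# Metcalfe's Law, roughly: n connected users can form n*(n-1)/2 distinct
# pairwise links, so a network's potential value grows close to n squared.

def potential_links(n):
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:5d} users -> {potential_links(n):7d} possible connections")
# 2 -> 1, 10 -> 45, 100 -> 4950, 1000 -> 499500
```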

Photo: Computers aren't what they used to be: they're much less noticeable because they're much more seamlessly integrated into everyday life. Some are "embedded" into household gadgets like coffee makers or televisions . Others travel round in our pockets in our smartphones—essentially pocket computers that we can program simply by downloading "apps" (applications).

Today, the best known WAN is the Internet—a global network of individual computers and LANs that links up hundreds of millions of people. The history of the Internet is another story, but it began in the late 1960s when, with funding from the US Department of Defense, four American universities connected their computer systems to form the first nodes of a network called ARPANET (Advanced Research Projects Agency Network). In the mid-1980s, the US National Science Foundation (NSF) launched its own WAN called NSFNET. The convergence of all these networks produced what we now call the Internet later in the 1980s. Shortly afterward, the power of networking gave British computer programmer Tim Berners-Lee (1955–) his big idea: to combine the power of computer networks with the information-sharing idea Vannevar Bush had proposed in 1945. Thus was born the World Wide Web—an easy way of sharing information over a computer network, which made possible the modern age of cloud computing (where anyone can access vast computing power over the Internet without having to worry about where or how their data is processed). It's Tim Berners-Lee's invention that brings you this potted history of computing today!

And now where?

If you liked this article, find out more on this site:

  • Supercomputers : How do the world's most powerful computers work?

Other websites

There are lots of websites covering computer history. Here are just a few favorites worth exploring!

  • The Computer History Museum : The website of the world's biggest computer museum in California.
  • The Computing Age : A BBC special report into computing past, present, and future.
  • Charles Babbage at the London Science Museum : Lots of information about Babbage and his extraordinary engines. [Archived via the Wayback Machine]
  • IBM History : Many fascinating online exhibits, as well as inside information about the part IBM inventors have played in wider computer history.
  • Wikipedia History of Computing Hardware : covers similar ground to this page.
  • Computer history images : A small but interesting selection of photos.
  • Transistorized! : The history of the invention of the transistor from PBS.
  • Intel Museum : The story of Intel's contributions to computing from the 1970s onward.

There are some superb computer history videos on YouTube and elsewhere; here are three good ones to start you off:

  • The Difference Engine : A great introduction to Babbage's Difference Engine from Doron Swade, one of the world's leading Babbage experts.
  • The ENIAC : A short Movietone news clip about the completion of the world's first programmable electronic computer.
  • A tour of the Computer History Museum : Dag Spicer gives us a tour of the world's most famous computer museum, in California.


Text copyright © Chris Woodford 2006, 2023. All rights reserved.


The history of computing is both evolution and revolution


Justin Zobel, Head, Department of Computing & Information Systems, The University of Melbourne


This month marks the 60th anniversary of the first computer in an Australian university. The University of Melbourne took possession of the machine from CSIRO and on June 14, 1956, the recommissioned CSIRAC was formally switched on. Six decades on, our series Computing turns 60 looks at how things have changed.

It is a truism that computing continues to change our world. It shapes how objects are designed, what information we receive, how and where we work, and who we meet and do business with. And computing changes our understanding of the world around us and the universe beyond.

For example, while computers were initially used in weather forecasting as no more than an efficient way to assemble observations and do calculations, today our understanding of weather is almost entirely mediated by computational models.

Another example is biology. Where once research was done entirely in the lab (or in the wild) and then captured in a model, it often now begins in a predictive model, which then determines what might be explored in the real world.

The transformation that is due to computation is often described as digital disruption . But an aspect of this transformation that can easily be overlooked is that computing has been disrupting itself.

Evolution and revolution

Each wave of new computational technology has tended to lead to new kinds of systems, new ways of creating tools, new forms of data, and so on, which have often overturned their predecessors. What has seemed to be evolution is, in some ways, a series of revolutions.

But the development of computing technologies is more than a chain of innovation – a process that’s been a hallmark of the physical technologies that shape our world.

For example, there is a chain of inspiration from waterwheel, to steam engine, to internal combustion engine. Underlying this is a process of enablement. The industry of steam engine construction yielded the skills, materials and tools used in construction of the first internal combustion engines.

In computing, something richer is happening where new technologies emerge, not only by replacing predecessors, but also by enveloping them. Computing is creating platforms on which it reinvents itself, reaching up to the next platform.

Getting connected

Arguably, the most dramatic of these innovations is the web. During the 1970s and 1980s, there were independent advances in the availability of cheap, fast computing, of affordable disk storage and of networking.


Compute and storage were taken up in personal computers, which at that stage were standalone, used almost entirely for gaming and word processing. At the same time, networking technologies became pervasive in university computer science departments, where they enabled, for the first time, the collaborative development of software.

This was the emergence of a culture of open-source development, in which widely spread communities not only used common operating systems, programming languages and tools, but collaboratively contributed to them.

As networks spread, tools developed in one place could be rapidly promoted, shared and deployed elsewhere. This dramatically changed the notion of software ownership, of how software was designed and created, and of who controlled the environments we use.

The networks themselves became more uniform and interlinked, creating the global internet, a digital traffic infrastructure. Increases in computing power meant there was spare capacity for providing services remotely.

The falling cost of disk meant that system administrators could set aside storage to host repositories that could be accessed globally. The internet was thus used not just for email and chat forums (known then as news groups) but, increasingly, as an exchange mechanism for data and code.

This was in strong contrast to the systems used in business at that time, which were customised, isolated, and rigid.

With hindsight, the confluence of networking, compute and storage at the start of the 1990s, coupled with the open-source culture of sharing, seems almost miraculous. An environment ready for something remarkable, but without even a hint of what that thing might be.

The ‘superhighway’

It was to enhance this environment that Al Gore, then a US senator campaigning for the vice presidency, proposed in 1992 the "information superhighway", before any major commercial or social uses of the internet had appeared.


Meanwhile, in 1990, researchers at CERN, including Tim Berners-Lee , created a system for storing documents and publishing them to the internet, which they called the world wide web .

As knowledge of this system spread on the internet (transmitted by the new model of open-source software systems), people began using it via increasingly sophisticated browsers. They also began to write documents specifically for online publication – that is, web pages.

As web pages became interactive and resources moved online, the web became a platform that has transformed society. But it also transformed computing.

With the emergence of the web came the decline of the importance of the standalone computer, dependent on local storage.

We all connect

The value of these systems is due to another confluence: the arrival on the web of vast numbers of users. For example, without behaviours to learn from, search engines would not work well, so human actions have become part of the system.

There are (contentious) narratives of ever-improving technology, but also an entirely unarguable narrative of computing itself being transformed by becoming so deeply embedded in our daily lives.

This is, in many ways, the essence of big data. Computing is being fed by human data streams: traffic data, airline trips, banking transactions, social media and so on.

The challenges of the discipline have been dramatically changed by this data, and also by the fact that the products of the data (such as traffic control and targeted marketing) have immediate impacts on people.

Software that runs robustly on a single computer is very different from software that interacts rapidly and continuously with the human world, giving rise to the need for new kinds of technologies and experts, in ways not even remotely anticipated by the researchers who created the technologies that led to this transformation.

Decisions that were once made by hand-coded algorithms are now made entirely by learning from data. Whole fields of study may become obsolete.

The discipline does indeed disrupt itself. And as the next wave of technology arrives (immersive environments? digital implants? aware homes?), it will happen again.



Modern (1940s-present)

66 History of Computers

Chandler Little and Ben Greene

Introduction

Modern technology began to evolve rapidly once electricity came into common everyday use. One of the biggest inventions of the 20th century was the computer, and it has gone through many changes and improvements since its creation. The last two decades have seen more advancement in computing than any comparable period before them. Computers have touched almost every level of learning in our lives and look set to keep doing so for decades to come. They have become a focal point of everyday life and will remain one for the foreseeable future. One important company that has shaped the computer industry is Apple, Inc., founded in the final quarter of the 20th century. Because Apple is one of the primary computer providers in American society, a history of the company is included in this chapter to show how one business has propelled the growth of computing.

The Evolution of Computers

Computers have come a long way since their creation. The idea is often traced to 1822, when Charles Babbage designed his mechanical Difference Engine; the first electronic computers of the 1940s, built from thousands of vacuum tubes, filled whole rooms and weighed tons, far larger than computers today. Computers have vastly decreased in size since the invention of the transistor in 1947, which revolutionized computing by replacing bulky vacuum tubes with smaller components that made computers more compact and also more reliable. This led to an era of rapid technological advancement and to the development of integrated circuits, microprocessors, and eventually the smaller, lighter personal computers that have become indispensable in modern society. For example, most laptops today weigh in the range of two to eight pounds. A picture of one of the first computers can be seen in Figure 1. There has also been enormous progress in data storage. The very first hard drive, created in 1956, had a capacity of 5MB and weighed in at 550 pounds.

Today hard drives weigh anywhere from a couple of ounces to a couple of pounds. As files have become more complex, the need for storage has increased drastically; some games now take up to 100GB. To put that difference in perspective, 5MB is only 0.005GB, so a single 100GB game needs twenty thousand times the capacity of that first hard drive. Today's hard drives reach sizes of 10TB and larger (a TB is 1,000GB). The evolution of the hard drive can be seen in Figure 2. As the world of computers keeps progressing, the general trend is to make machines smaller while delivering a generational step up in capability with each new design. These improvements shorten the daily tasks of many users, such as teachers, researchers, and doctors, making their work quicker and easier to accomplish. New software is also constantly being developed, and as a result we are staying connected to others through social media, messaging platforms, and other means of communication. The downside of this growing dependence on computers is that technological failures or damage can create major setbacks on any given day.
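
Those storage figures are easier to compare when they are put on one scale. The short snippet below does the arithmetic using decimal units (1GB = 1,000MB, 1TB = 1,000GB); the specific sizes are just the ones quoted above.

```python
# Putting the storage sizes quoted above onto one scale
# (decimal units: 1 GB = 1000 MB, 1 TB = 1000 GB).

MB = 1
GB = 1000 * MB
TB = 1000 * GB

first_hard_drive = 5 * MB      # the 1956 hard drive: 5 MB
modern_game      = 100 * GB    # a large modern game install
modern_drive     = 10 * TB     # a large modern hard drive

print(first_hard_drive / GB)              # 0.005 (GB)
print(modern_game // first_hard_drive)    # 20000: one game = 20,000 such drives
print(modern_drive // modern_game)        # 100: room for 100 such games
```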

Relation to STS

The development of computers as a prevalent form of technology has had a profound impact on society as a whole in the United States. Computers are now ubiquitous, playing a crucial role in various aspects of everyday life. They are utilized in most classrooms across the country, facilitating learning and enhancing educational experiences for students of all ages. In the workplace, computers have revolutionized business operations, streamlining processes, increasing efficiency, and enabling remote work capabilities. Additionally, computers have become indispensable tools in households across America. According to the U.S. Census Bureau’s American Community Survey, a staggering 92% of households reported having at least one type of computer (2021). This statistic underscores the widespread integration of computers into the fabric of American life. The impact of computers extends beyond mere accessibility. They have transformed communication, allowing people to connect instantaneously through email, social media, and video conferencing platforms. Additionally, computers have revolutionized entertainment, providing access to a vast array of digital content, from streaming services to video games.

Overall, the pervasive presence of computers underscores their monumental impact on Americans’ lives, shaping how we learn, work, communicate, and entertain ourselves. As technology continues to evolve, the influence of computers on society is expected to grow even further, driving transformative changes across various domains.

Missing Voices in Computer History

The evolution of computers has happened at a fast rate, and when that happens, people's contributions get left out. The demographic most often left out of computing history is women. Grace Hopper is one of the most influential people in the history of computing, but her work is rarely presented in the classroom. In the 1950s, Hopper was a senior mathematician on the team building UNIVAC (the UNIVersal Automatic Computer), where she created the very first compiler (Cassel, 2016). This was a massive accomplishment because it established that programs need not be tied to a specific computer: a program written in a higher-level language could be translated to run on any machine. That single idea was one of the main driving forces behind computing becoming as robust and powerful as it is today. Grace Hopper's work should be discussed not only in engineering courses but in general classes as well. Students need to hear that a woman was a driving force behind the evolution of computing. Highlighting her story may also encourage more women to join the field; today only about 25% of jobs in the computing sector are held by women (Cassel, 2016). With a more diverse workforce in computing, we can expect new ideas and features that were never thought of before.

During the evolution of computers, many people's contributions to their development and algorithms have been left out. With the ongoing push toward gender equality, the disparity between the credit given to women and to men should shrink to a negligible amount in the years ahead. As computers continue to evolve, the field of STS will need to evolve with them to keep up with changes in technology; otherwise, some significant developments in the computer sector will be neglected. Virtual reality (VR) is a notable example, with its high entry price and the motion sickness that can come with it.

History of Apple, Inc.

In American society today, two primary operating systems dominate: Windows and MacOS. MacOS is the operating system for Apple computers, which were estimated to cover about 16.1% of the U.S. personal computer market in the fourth quarter of 2023 according to a study from Gartner (2024). The company Apple Inc. was founded on April 1, 1976, by Steve Jobs and Steve Wozniak, who wanted to make computers more user-friendly and accessible to individuals. Their vision was to revolutionize the computer industry. They started by building the Apple I in Jobs' garage, leading to the introduction of the Apple II, which featured color graphics and propelled the company's growth. However, internal conflicts and the departure of key figures like Jobs and Wozniak led to a period of struggle in the 1980s and early 1990s. Jobs returned to Apple in 1997 and initiated transformative changes, including an alliance with Microsoft and the launch of groundbreaking products like the iBook and iPod. Apple continued to expand its product line, with the introduction of the iPhone in 2007 marking a new era of success for Apple, propelling it to become the second most valuable company in the world. Apple has been able to maintain a strong position in the technology market for this entire period by continuously improving its core product line, the Macintosh, and by adapting to new technological changes.

Figure 3: The Macintosh 128K, one of Apple's early computers.

Apple’s Macintosh computers have changed quite a lot throughout the company’s history. The Macintosh 128k (figure 3) was the very first Apple computer, released on January 24, 1984. It had a 9-inch black and white display with 128KB of RAM (computer memory) and operated on MacOS 1.0. The next important release was in April 1995 with several variations of the Macintosh Performa which had 500MB to 1 GB of memory. Interestingly, the multiple models of this computer ended up competing with each other and were discontinued. This led to the iMac G3 in August 1998 which sported a futuristic design with multiple color options for the back of the computer as well as USB ports, 4GB of memory, and built-in speakers. iMac G3 was the beginning of MacOS 8. In 2007, the iMac went through a major redesign with melded glass and aluminum as the material and a widescreen display. The newer Mac’s continue to be built slimmer, with faster processors, better displays, and more storage (Mingis and Moreau, 2021).

Missing Voices within Apple

In a male-dominated field, it is easy for women's impacts to be drowned out of technological histories. Within Apple's business specifically, several women made a large difference to the company's progress. Susan Kare, for example, designed Apple's first icons, such as the stopwatch and paintbrush, which helped Apple establish the Mac. Another woman with a large contribution to Apple was Joanna Hoffman. She was the fifth person to join the Macintosh team in 1980 and "wrote the first draft of the User Interface Guidelines for the Mac and figured out how to pitch the computer at the education markets" in the early days of Apple's existence (Evans, 2016).

Throughout this chapter, the importance of computers as a catalyst for advancement in our society is evident. Computers have clearly evolved in many ways from their inception to the present day, and people from many different backgrounds have played important parts in that evolution.

How has the advancement in technology improved your life?

A brief history of computers – unipi.it . (n.d.). Retrieved November 7, 2022, from https://digitaltools.labcd.unipi.it/wp-content/uploads/2021/05/A-brief-history-of-computers.pdf

Cassel, L. (December 15, 2016). "Op-Ed: 25 Years After Computing Pioneer Grace Hopper's Death, We Still Have Work to Do". USNEWS.com. Accessed via Nexis Uni database from Clemson University.

Evans, J. (2016, March 8). 10 Women Who Made Apple Great. Computerworld. https://www.computerworld.com/article/3041874/10-women-who-made-apple-great.html

Gartner. (2024, January 10). Gartner Says Worldwide PC Shipments Increased 0.3% in Fourth Quarter of 2023 but Declined 14.8% for the Year. Gartner. https://www.gartner.com/en/newsroom/press-releases/01-10-2024-gartner-says-worldwide-pc-shipments-increased-zero-point-three-percent-in-fourth-quarter-of-2023-but-declined-fourteen-point-eight-percent-for-the-year#:~:text=HP%20maintained%20the%20top%20spot,share%20(see%20Table%202).&text=HP%20Inc.,-4%2C665

Kleiman, K. & Saklayen, N. (2018, April 19). These 6 pioneering women helped create modern computers. ideas.ted.com. Retrieved September 26, 2021, from https://ideas.ted.com/how-i-discovered-six-pioneering-women-who-helped-create-modern-computers-and-why-we-should-never-forget-them/.

Mingis, K. and Moreau, S. (2021, April 28). The Evolution of the Macintosh – and the Imac. Computerworld. https://www.computerworld.com/article/1617841/evolution-of-macintosh-and-imac.html

Richardson, A. (2023, April). The Founding of Apple Computer, Inc. Library of Congress. https://guides.loc.gov/this-month-in-business-history/april/apple-computer-founded

Thompson, C. (2019, June 1). The gendered history of human computers. Smithsonian.com. Retrieved September 26, 2021, from https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/ .

Women in Computing and Women in Engineering honored for promoting girls in STEM.  (May 26, 2017 Friday). US Official News. Accessed via Nexis Uni database from Clemson University.

Zimmermann, K. A. (2017, September 7). History of computers: A brief timeline. LiveScience. Retrieved September 26, 2021, from https://www.livescience.com/20718-computer-history.html .

“Gene Amdahl’s first computer.” by Erik Pitti is licensed under CC BY 2.0

“First hard drives” by gabrielsaldana is licensed under CC BY 2.0

 Sailko. (2017). Neo Preistoria Exhibition (Milan 2016). Wikipedia. https://en.wikipedia.org/wiki/Macintosh_128K#/media/File:Computer_macintosh_128k,_1984_(all_about_Apple_onlus).jpg

AI ACKNOWLEDGMENT

I acknowledge the use of ChatGPT to generate additional content for this chapter.

Prompts and uses:

I entered the following prompt: Summarize the history of Apple in 7 sentences based on that prompt [the Library of Congress article].

Use: I modified the output to add more information from the article that I found relevant. I also adjusted the wording to make it fit the style of the rest of the chapter.

I entered the following prompt: The development of computers as a new, prevalent form of technology has majorly impacted society as a whole in the United States, as they are involved in several aspects of everyday life. Computers are used in some form in most classrooms in the country, in most workplaces, and in most households. Specifically, 92% of households in the U.S. Census Bureau’s American Community Survey reported having at least one type of computer. This is a simple statistic that shows the monumental impact of computers in Americans’ lives.

Use: I used the output to expand upon this paragraph; after entering the prompt, ChatGPT added several sentences and reworded some of the previously written content. I then removed some of the added information from ChatGPT and made the output more concise.

I entered the following prompt: Give me 3 more sentences to add to the following prompt. I am trying to talk about the history of computers and how they were invented. The prompt begins now- Computers have come a long way from their creation. This first computer was created in 1822 by Charles Babbage. This computer was created with a series of vacuum tubes and weighed a total of 700 pounds, which is much larger than the computers we see today. For example, most laptops weigh in a range of two to eight pounds.

After getting this output, I wrote this prompt: Give me two more sentences to add to that.

Use: I used the 5 sentences of the output to select the information that I wanted and add it to the content that was already in the book in order to provide more detail to the reduction in sizes of computers over time.

To the extent possible under law, Chandler Little and Ben Greene have waived all copyright and related or neighboring rights to Science Technology and Society a Student Led Exploration , except where otherwise noted.


Invention of the PC

By: History.com Editors

Updated: March 28, 2023 | Original: May 11, 2011


Today’s personal computers are drastically different from the massive, hulking machines that emerged out of World War II—and the difference isn’t only in their size. By the 1970s, technology had evolved to the point that individuals—mostly hobbyists and electronics buffs—could purchase unassembled PCs or “microcomputers” and program them for fun, but these early PCs could not perform many of the useful tasks that today’s computers can. Users could do mathematical calculations and play simple games, but most of the machines’ appeal lay in their novelty. Today, hundreds of companies sell personal computers, accessories and sophisticated software and games, and PCs are used for a wide range of functions from basic word processing to editing photos to managing budgets. At home and at work, we use our PCs to do almost everything. It is nearly impossible to imagine modern life without them.

Invention of the PC: The Computer Age

The earliest electronic computers were not "personal" in any way: They were enormous and hugely expensive, and they required a team of engineers and other specialists to keep them running. One of the first and most famous of these, the Electronic Numerical Integrator and Computer (ENIAC), was built at the University of Pennsylvania to do ballistics calculations for the U.S. military during World War II. ENIAC cost $500,000, weighed 30 tons and took up nearly 2,000 square feet of floor space. On the outside, ENIAC was covered in a tangle of cables, hundreds of blinking lights and nearly 6,000 mechanical switches that its operators used to tell it what to do. On the inside, almost 18,000 vacuum tubes carried electrical signals from one part of the machine to another.

Did you know? Time magazine named the personal computer its 1982 "Machine of the Year."

Invention of the PC: Postwar Innovations

ENIAC and other early computers proved to many universities and corporations that the machines were worth the tremendous investment of money, space and manpower they demanded. (For example, ENIAC could solve in 30 seconds a missile-trajectory problem that could take a team of human “computers” 12 hours to complete.) At the same time, new technologies were making it possible to build computers that were smaller and more streamlined. In 1948, Bell Labs introduced the transistor, an electronic device that carried and amplified electrical current but was much smaller than the cumbersome vacuum tube. Ten years later, scientists at Texas Instruments and Fairchild Semiconductor came up with the integrated circuit, an invention that incorporated all of the computer’s electrical parts–transistors, capacitors, resistors and diodes–into a single silicon chip.

But one of the most significant inventions that paved the way for the PC revolution was the microprocessor. Before microprocessors were invented, computers needed a separate integrated-circuit chip for each one of their functions. (This was one reason the machines were still so large.) Microprocessors were the size of a thumbnail, and they could do things the integrated-circuit chips could not: They could run the computer’s programs, remember information and manage data all by themselves.

The first microprocessor on the market was developed in 1971 by an engineer at Intel named Ted Hoff. (Intel was located in California’s Santa Clara Valley, a place nicknamed “Silicon Valley” because of all the high-tech companies clustered around the Stanford Industrial Park there.) Intel’s first microprocessor, a 1/16-by-1/8-inch chip called the 4004, had the same computing power as the massive ENIAC.

The Invention of the PC

These innovations made it cheaper and easier to manufacture computers than ever before. As a result, the small, relatively inexpensive “microcomputer”–soon known as the “personal computer”–was born. In 1974, for instance, a company called Micro Instrumentation and Telemetry Systems (MITS) introduced a mail-order build-it-yourself computer kit called the Altair. Compared to earlier microcomputers, the Altair was a huge success: Thousands of people bought the $400 kit. However, it really did not do much. It had no keyboard and no screen, and its output was just a bank of flashing lights. Users input data by flipping toggle switches.

In 1975, MITS hired Bill Gates, then a Harvard student, and his friend Paul G. Allen to adapt the BASIC programming language for the Altair. The software made the computer easier to use, and it was a hit. In April 1975 the two young programmers took the money they made from "Altair BASIC" and formed a company of their own—Microsoft—that soon became an empire.

The year after Gates and Allen started Microsoft, two members of the Homebrew Computer Club in Silicon Valley named Steve Jobs and Stephen Wozniak built a homemade computer that would likewise change the world. This computer, called the Apple I, was more sophisticated than the Altair: It had more memory, a cheaper microprocessor, and it could drive a video display. In April 1977, Jobs and Wozniak introduced the Apple II, which had a keyboard and a color screen. Also, users could store their data on an external cassette tape. (Apple soon swapped those tapes for floppy disks.) To make the Apple II as useful as possible, the company encouraged programmers to create "applications" for it. For example, a spreadsheet program called VisiCalc made Apple a practical tool for all kinds of people (and businesses)–not just hobbyists.

The PC Revolution

The PC revolution had begun. Soon companies like Xerox, Tandy, Commodore and IBM entered the market, and computers became ubiquitous in offices and eventually homes. Innovations like the “Graphical User Interface,” which allows users to select icons on the computer screen instead of writing complicated commands, and the computer mouse made PCs even more convenient and user-friendly. Today, laptops, smartphones and tablet computers allow us to have a PC with us wherever we go.


History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.

History of computers: the Apple I computer, 1976.


The history of computers goes back over 200 years. At first theorized by mathematicians and entrepreneurs, during the 19th century mechanical calculating machines were designed and built to solve the increasingly complex number-crunching challenges. The advancement of technology enabled ever more-complex computers by the early 20th century, and computers became larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called "notes," turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.

Babbage's Analytical Engine
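
Lovelace's note on the Analytical Engine walks through the computation of Bernoulli numbers step by step. A compact modern rendering of that calculation, using a standard recurrence rather than her exact table of operations, looks like this:

```python
# Bernoulli numbers via the standard recurrence
#   sum_{k=0}^{m} C(m+1, k) * B_k = 0   for m >= 1, with B_0 = 1.
# A modern sketch of the kind of calculation Lovelace described; it is
# not a transcription of her actual table of operations.
from fractions import Fraction
from math import comb

def bernoulli(n):
    B = [Fraction(1)]                                    # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))                           # solve for B_m
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```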

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, " Georg Scheutz and the First Printing Calculator " (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations, and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 
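
A Turing machine itself is a very small idea: a tape of symbols, a read/write head, and a table of rules saying what to write, where to move, and which state to enter next. The toy simulator below (a sketch, with an invented rule table) runs a machine that simply flips every bit on its tape, but the same loop can run any rule table at all, which is the point of Turing's result.

```python
# A tiny Turing machine simulator. The rule table maps
# (state, symbol) -> (symbol to write, head movement, next state).
# This particular machine just flips every bit on the tape, then halts.

rules = {
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", " "): (" ",  0, "halt"),   # blank cell: nothing left to do
}

def run(tape, state="scan", head=0):
    cells = list(tape) + [" "]          # add a blank cell at the end
    while state != "halt":
        write, move, state = rules[(state, cells[head])]
        cells[head] = write
        head += move
    return "".join(cells).strip()

print(run("101100"))   # -> 010011
```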

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory; the machine is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers," O'Regan wrote. In November 1949, scientists with the Council for Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer, called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
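
As a point of comparison with today's tools, the results of those first EDSAC runs, a table of squares and a list of small primes, take only a few lines in a modern language. The snippet below is a present-day re-creation for illustration, not EDSAC code.

```python
# Modern re-creation (illustrative only) of the kind of output EDSAC first produced.
squares = [(n, n * n) for n in range(1, 11)]

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

primes = [n for n in range(2, 50) if is_prime(n)]

print(squares)  # [(1, 1), (2, 4), ..., (10, 100)]
print(primes)   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
```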

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference in San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect," includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute. This marks the evolution of the computer from a specialized machine for academics to a technology more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also developed.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases the Magnavox Odyssey, the world's first home game console, in September 1972, according to the Computer Museum of America. Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn of Atari release Pong, the world's first commercially successful video game.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil the Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire; it includes color graphics and an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book "Track Changes: A Literary History of Word Processing" (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History (NMAH), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: In response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research (CERN), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hypertext Markup Language (HTML), the building blocks of the Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially. This investment ends an ongoing court case in which Apple accuses Microsoft of copying its operating system.

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X and then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each carrying "10" in its title, and the first nine iterations are nicknamed after big cats, with the first codenamed "Cheetah," TechRadar reported.

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challengers to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum.

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system lets users pin applications to the taskbar and clear the desktop by shaking one window, and adds easy-to-access jump lists, improved taskbar previews and more, TechRadar reported.

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: The first Chromebooks, which run on Google's Chrome OS, are released.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google becomes the first to demonstrate quantum supremacy, creating a quantum computer that can outperform the most powerful classical computer, albeit on a very specific problem with no practical real-world application. The team describes the computer, dubbed "Sycamore," in a paper published that year in the journal Nature. Achieving quantum advantage, in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer, is still a ways off.

2022: Frontier, the first exascale supercomputer and the world's fastest, goes online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. The machine ushers in the era of exascale computing, which refers to systems that can perform more than one exaFLOP, a measure of computing performance. Frontier is currently the only machine capable of reaching that level of performance, and it is being used as a tool to aid scientific discovery.
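
For scale, the prefix "exa" denotes 10 to the 18th power, so the exaFLOPS figures quoted for Frontier unpack as follows (a unit conversion added here for clarity, not a claim from the cited sources):

```latex
% FLOPS = floating-point operations per second
\[
1\ \text{exaFLOPS} = 10^{18}\ \text{FLOP/s},
\qquad
1.102\ \text{exaFLOPS} \approx 1.102 \times 10^{18}\ \text{FLOP/s}.
\]
```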

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K. Powered by steam with a hand crank, the machine calculated a series of values and printed the results in a table.

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum-tube-based machines. The second, between the 1950s and the 1960s, incorporated transistor-based computing. In the 1960s and 1970s, the third generation gave rise to integrated-circuit-based computing. We are now between the fourth and fifth generations, which are microprocessor-based and AI-based computing, respectively.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago. Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been many through the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credited this app with propelling the Apple II to the success it became, according to co-creator Dan Bricklin.

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • " A Brief History of Computing " by Gerard O'Regan (Springer, 2021)


How Computers Affect Our Lives Essay

How Computers Affect Our Lives: Introduction

Computers are a common feature of everyday life. They are vital to people who run businesses, industries and other organizations, and today almost every activity people engage in makes use of a computer. Take, for instance, the transport sector: vehicles, trains, airplanes and even the traffic lights on our roads are controlled by computers.

In hospitals, most of the equipment is run by computers. Space exploration was made possible only with the advent of computer technology. In the job market, many positions require computer skills because the work itself involves computers.

In short, these machines have become so important and so deeply embedded in human life that society would now find it very hard to function without them. This article discusses the influence of computers on everyday life.

One can only guess what would happen if the world had no computers. Many cures developed with the help of computer technology would not exist, meaning that many people would have died from diseases that are now treatable. In the entertainment industry, many movies and even songs would not exist either, because most of the graphics and animations we see are possible only with the help of a computer (Saimo 1).

In the field of medicine, pharmacies would find it hard to determine the type of medication to give to their many patients. Computers have also played a role in the development of democracy: today votes are counted using computers, which has greatly reduced the incidence of vote rigging and, consequently, the conflicts that would otherwise arise from it.

And as we have already seen, no one would have known much about space, because space exploration became possible only with the help of computer technology. However, the use of computers has generated public debate, with some people supporting their use and others criticizing it (Saimo 1).

History of computers

To better understand how computers influence people's lives, we have to start with their history, from their invention to the present day. Early computers did not involve technologies as complex as those used today; nor did they employ the monitors or chips that are now common.

Early computers were not as small as those used today, and they were commonly used to work out complex mathematical calculations that were tedious to do manually. This is why the first machines were called calculators by some and computers by others: they were used for making calculations.

Blaise Pascal is credited with the first digital machine that could add and subtract. Many later calculators and computers borrowed from his ideas, and as time went by, new needs led to modifications that brought about newer and more efficient computers (Edwards 4).

Positive Effects of Computers on Human Life

The influence of computers on human life became widely felt during World War II, when computers were used to calculate and track troop movements and to plan military attacks (Edwards 4). It is therefore clear that computers and their influence on humanity have a long history.

Their invention involved hard work, dedication and determination, and in the end it paid off. The world was, and is still being, changed by computers. People have been able to look ahead and plan because of computers. Life today has been made easier with their help; some people may disagree with this, but I am sure many will agree with me.

Those who disagree say that computers have taken away the role of human beings, which is not entirely wrong, but we must also acknowledge that what initially seemed impossible became possible because of computers (Turkle 22).

As mentioned in the introduction, computers are useful in running the affairs of many companies today. Companies use a great deal of data that can only be stored securely with the help of computers, and this data is then used in computer-run operations. Without computers, companies would find it difficult to store the thousands of records that are created daily.

Take, for instance, a customer checking his or her balance, or one who simply wants information on past transactions. Without computers, it would take a long time to go through all the transactions to find a particular one.

The invention of computers made this easier; bank employees today give customers their balances, transaction information, and other services just by tapping the computer keyboard. This would not be possible without computers (Saimo 1).

In personal life

Today, individuals can store all their information, be it personal or business-related, on a computer. Better still, they can make frequent updates and modifications to that information. The same information can easily be retrieved whenever it is needed, sent via email or printed out.

All this has been made possible by computers. Life is easier and more enjoyable: individuals can now comfortably entertain themselves at home by watching TV with their families, or work from the comfort of their homes, thanks to computer technology.

Computers feature in people's everyday lives. Today one can use a computer without even being aware of it: people use their credit cards when buying items from stores, a practice so common that few realize the transaction is processed through computer technology.

It is the computer that processes the customer information fed to it through the credit card, records the transaction and then settles the bill by subtracting the amount from the card. Getting cash has also been made easier and faster: an individual simply walks to an ATM to withdraw the cash he or she requires. ATMs operate using computer technology (Saimo 1).

I mentioned the use of credit cards as one of the practical benefits of computers. Today, individuals do not need to physically visit stores to buy items. All one needs is an internet connection and a computer, and one can pay for items using a credit card.

These items can then be delivered to the doorstep. The era of queuing in crowded stores to buy goods, or wasting time in line waiting to buy tickets, is over. Today, travelers can buy tickets and make travel arrangements via the internet at any time, thanks to computer technology (Saimo 1).

In communication

Through the computer, humanity now has its most effective means of communication. The internet has made the world a global village. Today people carry phones, which are basically small computers, while others carry laptops; all of these have made the internet the most effective and affordable medium for contacting friends, family and business partners from anywhere in the world.

Businesses are using computer technology to keep records and track their accounts and the flow of money (Lee 1). In the area of entertainment, computers have not been left behind either.

Action and science fiction movies use computers to incorporate visual effects that make them look real. Computer games, a common form of entertainment especially for teenagers, have been made more engaging through advanced computer technology (Frisicaro et al. 1).

In Education

The education sector has also been greatly influenced by computer technology. Much of today's schoolwork is done with the aid of a computer. When students are given assignments, they can search for solutions on the internet using Google, and the assignments can then be neatly presented thanks to software made specifically for such purposes.

Today most high schools require students to type their work before presenting it for marking, which is made possible through computers. Teachers have also found computer technology very useful, as they can use it to track student performance and to give out instructions.

Computers have also made online learning possible. Today teachers and students do not need to be physically present in class for teaching and learning to happen. Online teaching allows students to attend class from any place, at any time, without inconvenience (Computers 1).

In the medical sector

Another crucial sector that computers have greatly influenced, and continue to influence, is health care. As mentioned in the introduction, hospitals and pharmacies use computers in serving people.

Computers are used in pharmacies to help pharmacists determine what type and amount of medication patients should receive. In many hospitals, patient data and health progress are recorded using computers, and the status and placement of equipment are recorded and tracked the same way.

Research done by scientists, doctors and many others in the search for cures for diseases and medical complications is facilitated by computer technology. Many diseases once known to be dangerous, such as malaria, are now treatable thanks in part to computer-assisted interventions (Parkin 615).

Computers replacing man

Many opponents of computer technology argue against the use of computers on the grounds that computers are replacing humans in carrying out activities that are naturally human.

However, it should be noted that some situations call for extraordinary interventions. In many industries, machines have replaced human labor, and the use of machines is usually much cheaper than human labor.

In addition, machines give consistent results in terms of quality. There are other instances where the skill needed to perform a task is beyond an ordinary person, as in surgeries where human intervention alone is not sufficient. Computer-operated machines have made such complex surgeries successful.

There are also cases where the tasks to be performed are too dangerous for a human being, such as disasters in which people are trapped underground during mining. It is usually dangerous to send people into such situations, and even where people are used, the rescue is often delayed.

Computer-operated robotic machines have helped in such situations and people have been saved. It is also not possible to send people on every space exploration, but machines such as robots have been used effectively to explore beyond our world (Gupta 1).

Negative Computer Influences

Despite all the good things that computers have done for humans, their opponents raise vital points that should not be ignored. There are many things computers do that leave people wondering whether they are really helping society, or whether they are being used to deprive humans of their God-given ability to function according to societal ethics.

Take, for instance, the workplace and even the home: computers have permeated every activity an individual does, thereby compromising personal privacy. Computers have been used to expose people to unauthorized access to personal information, some of which, if exposed, can negatively affect someone's life.

Today the world cares so little about ethics that it is very difficult to clearly differentiate between what is and is not authentic or trustworthy. Computers have taken over every aspect of human life, from household chores to practices carried out in the social sphere.

This has seen people lose their human element to machines. Industries and organizations have replaced human labor with cheaper and more effective machine labor, which means that people have lost jobs because of advances in computer technology. Children using computers grow up with difficulty differentiating between reality and fiction (Subrahmanyam et al. 139).

People depend on computers to do tasks: students generate solutions to assignments using computers, while teachers use computers to mark them. Doctors in hospitals depend on machines to make diagnoses, perform surgeries and determine types of medication (Daley 56).

In the entertainment industry, computer technology has been used to modify sound so that people think a singer is truly great, when in fact it is simply the computer. This has taken away the real function of the musician in the music sector.

In today's world of technology, we live as a worried lot. Hacking is very common, and statistics confirm that huge amounts of money are lost every year to it. Therefore, as much as people pride themselves on being computer literate, they are deeply worried that they may be the next victims of practices such as hacking (Bynum 1).

Conflict with religious beliefs

There is also the problem of trying to imitate God. It is believed that within 20 years, humans will create another form of life, a man-made being. This will not only affect how humans are viewed in terms of intelligence, but will also break the long-held view that God is the sole provider of life.

Computers have made it possible to create artificial intelligence, in which machines are given the capacity to behave and act like humans. Viewed from a religious point of view, this creates conflicts in human belief.

It has long been held that man was created in the image of God. Creating a machine in the image of man will distort the way people conceive of God, and using artificial methods to create new forms of life with human-like intelligence will make man equate himself to God.

This carries the risk of changing beliefs that humankind has held for millennia. If this happens, computer technology itself, through mass media, will help spread these ideas and convince people to change their beliefs and conceptions of God (Krasnogor 1).

Conclusion: How Computers Influence Our Lives

We have seen that computers have influenced, and will continue to influence, our lives. The advent of the computer has changed humanity as much as it has changed the world we live in.

It is true that many things that once seemed impossible have been made possible with computer technology. Computer-aided medical technologies have led to discoveries in medicine, which have in turn saved many lives. Communication is now easy and fast. The world has been transformed into a virtual village.

Computers have made education accessible to all. In the entertainment sector, people are more satisfied. Crime surveillance is better and more effective. However, we should beware of trying to imitate God. As much as computers have positively influenced our lives, the technology is a live bomb waiting to explode.

We should tread carefully so as not to be overwhelmed by its sophistication (Computers 1). Many technologies have grown so intense that they surpassed their productive limits and destroyed themselves in the process. This seems like one such technology.

Works Cited

Bynum, Terrell. Computer and Information Ethics. Plato, 2008. Web.

Computers. Institutional Impacts. Virtual Communities in a Capitalist World, n.d. Web.

Daley, Bill. Computers Are Your Future: Introductory. New York: Prentice, 2007. Print.

Edwards, Paul. From "Impact" to Social Process. Computers in Society and Culture, 1994. Web.

Frisicaro et al. So What's the Problem? The Impact of Computers, 2011. Web.

Gupta, Satyandra. We, Robot: What Real-Life Machines Can and Can't Do. Science News, 2011. Web.

Krasnogor, Ren. Advances in Artificial Life: Impacts on Human Life. n.d. Web.

Lee, Konsbruck. Impacts of Information Technology on Society in the New Century. Zurich. Web.

Parkin, Andrew. Computers in Clinical Practice: Applying Experience from Child Psychiatry. 2004. Web.

Saimo. The Impact of Computer Technology on Human Life. Impact of Computer, 2010. Web.

Subrahmanyam et al. The Impact of Home Computer Use on Children's Activities and Development. Princeton, 2004. Web.

Turkle, Sherry. The Second Self: Computers and the Human Spirit, 2005. Web.
