Though Silicon Valley may be the heart of the commercialisation of all things digital, it is the British who can proudly boast of having invented the computer. Indeed, so proud are the British of the work of the eccentric British mathematician and inventor Charles Babbage that the Science Museum in London has since built the machines he conceived, and the Royal College of Surgeons has preserved his brain - the brain that invented the computer.
In Babbage's time, mathematical tables, such as logarithmic and trigonometric functions, were generated by teams of mathematicians. Babbage (1791-1871) became frustrated with the many mistakes in these tables, which were produced for navigation, engineering, banking and insurance, and dreamed of removing the human element from these calculations. In 1822 he proposed the first computer, a machine he called the Difference Engine - it was the size of a house, could store a program, was powered by steam and could even print results.
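The trick that let a machine of gears do this work is worth pausing on: the Difference Engine mechanised the method of finite differences. For any polynomial, the n-th differences between successive values are constant, so once the first column of differences is set up, every further table entry can be generated by additions alone - no multiplication needed, just the repeated carrying of one column into another. A minimal sketch in Python (the example polynomial and step size are illustrative assumptions, not figures from Babbage's designs):

    # Tabulate a polynomial using additions only, the principle the
    # Difference Engine mechanised.  The polynomial below is an
    # illustrative example, not one of Babbage's actual tables.

    def leading_differences(f, start, step, degree):
        """Reduce the first degree+1 values of f to a column of
        leading differences; for a degree-n polynomial the n-th
        difference is constant."""
        values = [f(start + i * step) for i in range(degree + 1)]
        column = []
        while values:
            column.append(values[0])
            values = [b - a for a, b in zip(values, values[1:])]
        return column  # [f(x0), delta f(x0), delta^2 f(x0), ...]

    def tabulate(column, count):
        """Generate `count` table entries with additions alone: each
        step adds every difference into the one above it, which is
        what the engine's stacked columns of figure wheels did."""
        col = list(column)
        out = []
        for _ in range(count):
            out.append(col[0])
            for i in range(len(col) - 1):
                col[i] += col[i + 1]
        return out

    poly = lambda x: 2 * x**2 + 3 * x + 5        # degree-2 example
    col = leading_differences(poly, 0, 1, 2)     # -> [5, 5, 4]
    print(tabulate(col, 8))   # [5, 10, 19, 32, 49, 70, 95, 124]

Each pass of the inner loop corresponds to one turn of the engine's crank, rippling the constant difference up through the columns to yield the next table value.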
He worked on the Difference Engine for a decade before the project opened his mind to wider possibilities, and he began work on the first general-purpose computer, which he called the Analytical Engine.
Babbage's equally celebrated assistant was Augusta Ada King, Countess of Lovelace (1815-1852) and daughter of the English poet Lord Byron. Ada was perhaps the only person who understood Babbage's work at a detailed level.
As his assistant, she performed many of the functions of her recognisable modern-day counterparts - she sought and received funding and grants from the British Government on several occasions (VC raising), she communicated the benefits of the machines to the public (PR) and she created instruction routines for the machine (computer programmer). Such was her influence that the programming language Ada was named in her honour.

Babbage's steam-powered Analytical Engine was primitive by today's standards, though it outlined the basic elements of a modern general-purpose computer and was a breakthrough concept.
Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions, and a "store" - a memory holding 1,000 numbers of up to 50 decimal digits each. It also contained a "mill" with a control unit that allowed instructions to be processed in any sequence.
Babbage borrowed the idea of encoding machine instructions on punched cards from the Jacquard loom.

Babbage's reputation as a visionary and engineer was vindicated when several of the machines he designed, notably the second Difference Engine and its 2.5 tonne printer, were built by the London Science Museum to commemorate the 200th anniversary of his birth in 1991.
The machines had not been built in his own time, mainly for lack of funds, which fuelled the belief that Babbage was a crackpot. It was subsequently shown that the critical tolerances required by his machines exceeded the metallurgy and technology available at the time.

Built from his original plans, not only did they work, they worked exceptionally well.
The Difference Engine was accurate to 31 decimal places, and when the team of engineers finished building a replica of what would have been the world's first computer printer, they were astounded at its complexity and feature set.
Designed to automatically print the computational tables Babbage had dreamed of producing, the 2.5 tonne dedicated printing press could be set to print different numbers of columns, with adjustments for the height between lines and the space between columns.
There are some wonderful Babbage resources on the internet.
http://www.gizmag.com/go/1288/
What is a Computer?
The machine which is sitting in front of you.
The machine which can draw graphics, set up your modem, decipher your PGP, do typography, refresh your screen, monitor your keyboard, manage the performance of all these in synchrony... and do all of these through a single principle: reading programs placed in its storage.
But the meaning of the word has changed over time. In the 1930s and 1940s "a computer" still meant a person doing calculations, so to indicate a machine doing calculations you would say "automatic computer". In the 1960s people still talked about the digital computer as opposed to the analog computer.
But nowadays, I think it is better to reserve the word "computer" for the type of machine which has swept everything else away in its path: the computer on which you are reading this page, the digital computer with "internally stored modifiable program".
So I wouldn't call Charles Babbage's 1840s Analytical Engine the design for a computer. It didn't incorporate the vital idea which is now exploited by the computer in the modern sense, the idea of storing programs in the same form as data and intermediate working. His machine was designed to store programs on cards, while the working was to be done by mechanical cogs and wheels.
There were other differences — he did not have electronics or even electricity, and he still thought in base-10 arithmetic. But more fundamental is the rigid separation of instructions and data in Babbage's thought.
A hundred years later, in the early 1940s, electromagnetic relays could be used instead of gearwheels. But no-one had advanced on Babbage's principle. Builders of large calculators might put the program on a roll of punched paper rather than cards, but the idea was the same: you built machinery to do arithmetic, and then you arranged for instructions coded in some other form, stored somewhere else, to make the machinery work.
To see how different this is from a computer, think of what happens when you want a new piece of software. You can download it from a remote source, and it is transmitted by the same means as email or any other form of data. You may apply an UnStuffIt or GZip program to it when it arrives, and this means operating on the program you have ordered. For filing, encoding, transmitting, copying, a program is no different from any other kind of data — it is just a sequence of electronic on-or-off states which lives on hard disk or RAM along with everything else.
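A toy sketch can make the contrast concrete. In the little machine below, instructions and data sit in one flat memory of numbers, so the running program can patch one of its own instructions in order to step through an array - exactly the kind of thing a program fixed on cards or tape could not do to itself. The three-cell instruction format is invented for illustration and corresponds to no historical machine:

    # A toy stored-program machine (an invented design, purely for
    # illustration).  Memory is one flat list of integers; each
    # instruction occupies three consecutive cells: opcode, A, B.

    HALT, ADD, JUMP_IF_NEG = 0, 1, 2

    def run(mem):
        pc = 0
        while True:
            op, a, b = mem[pc], mem[pc + 1], mem[pc + 2]
            if op == HALT:
                return mem
            if op == ADD:                    # mem[a] += mem[b]
                mem[a] += mem[b]
                pc += 3
            elif op == JUMP_IF_NEG:          # if mem[a] < 0: jump to b
                pc = b if mem[a] < 0 else pc + 3

    mem = [0] * 204
    mem[0:15] = [ADD, 100, 200,        # acc += array element
                 ADD, 2, 102,          # the program edits itself: bump
                                       # cell 2, the operand of the
                                       # instruction above
                 ADD, 101, 103,        # counter += 1
                 JUMP_IF_NEG, 101, 0,  # loop while counter < 0
                 HALT, 0, 0]
    mem[100:104] = [0, -4, 1, 1]       # accumulator, counter, constants
    mem[200:204] = [3, 1, 4, 1]        # the data to be summed

    print(run(mem)[100])               # -> 9

The point is not the particular instruction set but that nothing distinguishes program cells from data cells: the same ADD that updates the accumulator also rewrites an operand of another instruction.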
The people who built big electromechanical calculators in the 1930s and 1940s didn't think of anything like this. I would call their machines near-computers, or pre-computers: they lacked the essential idea.
More on near-computers, war and peace
ENIAC
Even when they turned to electronics, builders of calculators still thought of programs as something quite different from numbers, and stored them in quite a different, inflexible, way. So the ENIAC, started in 1943, was a massive electronic calculating machine, but I would not call it a computer in the modern sense, though some people do. Its procedure for taking a square root, for example, was incredibly inefficient.
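The method, as usually described, rested on the schoolbook identity 1 + 3 + 5 + ... + (2n-1) = n squared: a root was extracted essentially by repeated subtraction of successive odd numbers. A rough sketch of the idea in Python (a simplification for illustration, not the actual hardware procedure):

    # Integer square root by repeated subtraction of odd numbers,
    # using the identity 1 + 3 + 5 + ... + (2n-1) = n*n.  A much
    # simplified sketch of the idea, not ENIAC's hardware procedure.

    def isqrt_by_odds(x):
        n, odd = 0, 1
        while x >= odd:       # keep subtracting the next odd number
            x -= odd
            odd += 2
            n += 1
        return n              # floor of the square root

    print(isqrt_by_odds(144))   # -> 12
    print(isqrt_by_odds(200))   # -> 14

Since this takes on the order of the square root of x subtractions per result, it is easy to see why the procedure looks incredibly inefficient beside even simple iterative methods.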
Colossus
The Colossus was also started in 1943 at Bletchley Park, heart of the British attack on German ciphers.
I wouldn't call it a computer either, though some people do: it was a machine specifically for breaking the "Fish" ciphers, although by 1945 the programming had become very sophisticated and flexible.
But the Colossus was crucial in showing Alan Turing the speed and reliability of electronics. It was also ahead of American technology, which only had the comparable ENIAC fully working in 1946, by which time its design was obsolete. (And the Colossus played a part in defeating Nazi Germany by reading Hitler's messages, whilst the ENIAC did nothing in the war effort.)
1996 saw the fiftieth anniversary of the ENIAC. The University of Pennsylvania and the Smithsonian made a great deal of it as the "birth of the Information Age"; Vice-President Gore and other dignitaries were involved. Good for them. At the Bletchley Park Museum, the reconstruction of the Colossus depended on the individual efforts of its curator, Tony Sale. Americans and Brits do things differently. Some things haven't changed in fifty years.
Zuse's machines
Konrad Zuse, in Germany, quite independently designed mechanical and electromechanical calculators, before and during the war. He didn't use electronics. He still had a program on a paper tape: his machines were still developments of Babbage-like ideas. But he did see the importance of programming and can be credited with a kind of programming language, Plankalkül. It is not difficult to believe that with greater support, he would have gone far ahead with the theory and practice of computing.
Like Turing, Zuse was an isolated innovator. But while Turing was taken by the British government into the heart of the Allied war effort, the German government declined Zuse's offer to help with code-breaking machines.
The parallel between Turing and Zuse is explored by Thomas Goldstrasz and Henrik Pantle.
Their work is influenced by the question: was the computer the offspring of war? They conclude that the war hindered Zuse and in no way helped.
In contrast, there can be no question that Alan Turing's war experience was what made it possible for him to turn his logical ideas into practical electronic machinery. This is a great irony of history, and it forms the central part of his story. He was the most civilian of people, an anti-war protester in 1933.
He was very different in character from John von Neumann, who relished association with American military power. But von Neumann was on the winning side in the Second World War, whilst Turing was on the side that scraped through, proud but almost bankrupt.
The Internally Stored Modifiable Program
The breakthrough came from two sources in 1945:
Alan Turing, on the basis