Question:
Who invented the computer? What websites can help?
*Kristy* <3
2008-02-19 07:30:53 UTC
person who writes the most gets 10 points!!!!
Three answers:
BarkersBeauty
2008-02-19 08:16:24 UTC
Though Silicon Valley may be the heart of the commercialisation of all things digital, it is the British who can proudly boast of having invented the computer. Indeed, so proud are the British of the work done by the eccentric British mathematician and inventor Charles Babbage that the Science Museum in London has subsequently built the machines he conceived, and the Royal College of Surgeons has preserved his brain - the brain that invented the computer.



In Babbage's time, mathematical tables, such as logarithmic and trigonometric functions, were generated by teams of mathematicians. Babbage (1791-1871) became frustrated with the many mistakes in these tables, which were produced for navigation, engineering, banking and insurance, and dreamed of removing the human element from these calculations. He proposed the first computer, a machine he called the Difference Engine, in 1822 - it was to be the size of a house, powered by steam, and could even print its results.
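To make concrete what the engine mechanised: a polynomial's table can be generated by additions alone using the method of finite differences, with no multiplication at all. Here is a purely illustrative sketch in Python (modern code, obviously not Babbage's notation), using x^2 + x + 41, a polynomial Babbage reportedly used in demonstrations:

def difference_table(initial_values, n_terms):
    """Tabulate a polynomial by repeated addition alone,
    as the Difference Engine did with gears."""
    # Build successive finite differences from the seed values.
    rows = [list(initial_values)]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    # One register per difference order: [f, delta f, delta^2 f, ...]
    state = [row[0] for row in rows]
    table = []
    for _ in range(n_terms):
        table.append(state[0])
        # One "turn of the crank": each register absorbs the one below it.
        for i in range(len(state) - 1):
            state[i] += state[i + 1]
    return table

# f(x) = x**2 + x + 41 at x = 0, 1, 2 seeds the whole table:
print(difference_table([41, 43, 47], 8))
# -> [41, 43, 47, 53, 61, 71, 83, 97]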



He worked on the Difference Engine for a decade before the work opened his mind to wider possibilities, and he began work on the first general-purpose computer, which he called the Analytical Engine.



Babbage's equally celebrated assistant was Augusta Ada King, Countess of Lovelace (1815-1852), daughter of the English poet Lord Byron. Ada was perhaps the only person who understood Babbage's work at a detailed level.



As his assistant, she performed many of the functions of recognisable modern-day counterparts: she sought and received funding and grants from the British Government on several occasions (VC raising), she communicated the benefits of the machines to the public (PR), and she created instruction routines for the machine (computer programming). Such was her influence that the programming language Ada was named in her honour.

Babbage's steam-powered Analytical Engine was primitive by today's standards, though it outlined the basic elements of a modern general-purpose computer and was a breakthrough concept.



Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions, and a "store" with memory for 1,000 numbers of up to 50 decimal digits each. It also contained a "mill" with a control unit that allowed processing instructions in any sequence.
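As a toy illustration of that layout (a Python sketch, not a faithful simulation; the operation names are invented for the example), note that the operation cards form a fixed external sequence while the store holds only numbers. The program is not kept in the store, a distinction that matters later on this page:

store = [0] * 10              # the "store": columns of figures (numbers only)

cards = [                     # operation "cards": a fixed external sequence
    ("SET", 0, 7),            # store[0] = 7
    ("SET", 1, 5),            # store[1] = 5
    ("ADD", 2, 0, 1),         # store[2] = store[0] + store[1]
    ("MUL", 3, 2, 2),         # store[3] = store[2] * store[2]
    ("PRINT", 3),
]

for card in cards:            # the "mill" and control unit step through the cards
    if card[0] == "SET":
        store[card[1]] = card[2]
    elif card[0] == "ADD":
        store[card[1]] = store[card[2]] + store[card[3]]
    elif card[0] == "MUL":
        store[card[1]] = store[card[2]] * store[card[3]]
    elif card[0] == "PRINT":
        print(store[card[1]])  # prints 144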



Babbage copied the idea of punched cards to encode machine instructions from the Jacquard loom.

Babbage's reputation as a visionary and engineer was vindicated when several of the machines he designed, notably the second Difference Engine and its 2.5-tonne printer, were built by the London Science Museum to commemorate the 200th anniversary of his birth in 1991.



The machines had not been built in Babbage's lifetime, mainly due to lack of funds, which fuelled the belief that Babbage was a crackpot; it was long held that the critical tolerances required by his machines exceeded the metallurgy and technology available at the time. Built from his original plans, not only did they work, they worked exceptionally well.



The Difference Engine was accurate to 31 decimal places, and when the team of engineers finished building a replica of what would have been the world's first computer printer, they were astounded at its complexity and feature set.



Designed to print automatically the computational tables Babbage had dreamed of producing, the 2.5-tonne dedicated printing press could be set to print different numbers of columns, with adjustments for the height between lines and the spacing between columns.



There are some wonderful Babbage resources on the internet.

http://www.gizmag.com/go/1288/



What is a Computer?

The machine which is sitting in front of you.

The machine which can draw graphics, set up your modem, decipher your PGP, do typography, refresh your screen, monitor your keyboard, manage the performance of all these in synchrony... and do all of these through a single principle: reading programs placed in its storage.



But the meaning of the word has changed over time. In the 1930s and 1940s "a computer" still meant a person doing calculations, so to indicate a machine doing calculations you would say "automatic computer". In the 1960s people still talked about the digital computer as opposed to the analog computer.



But nowadays, I think it is better to reserve the word "computer" for the type of machine which has swept everything else away in its path: the computer on which you are reading this page, the digital computer with "internally stored modifiable program".



So I wouldn't call Charles Babbage's 1840s Analytical Engine the design for a computer. It didn't incorporate the vital idea which is now exploited by the computer in the modern sense, the idea of storing programs in the same form as data and intermediate working. His machine was designed to store programs on cards, while the working was to be done by mechanical cogs and wheels.



There were other differences — he did not have electronics or even electricity, and he still thought in base-10 arithmetic. But more fundamental is the rigid separation of instructions and data in Babbage's thought.



A hundred years later, in the early 1940s, electromagnetic relays could be used instead of gearwheels. But no-one had advanced on Babbage's principle. Builders of large calculators might put the program on a roll of punched paper rather than cards, but the idea was the same: you built machinery to do arithmetic, and then you arranged for instructions coded in some other form, stored somewhere else, to make the machinery work.



To see how different this is from a computer, think of what happens when you want a new piece of software. You can download it from a remote source, and it is transmitted by the same means as email or any other form of data. You may apply an UnStuffIt or GZip program to it when it arrives, and this means operating on the program you have ordered. For filing, encoding, transmitting, copying, a program is no different from any other kind of data — it is just a sequence of electronic on-or-off states which lives on hard disk or RAM along with everything else.
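As a toy sketch of what "internally stored modifiable program" means (Python, illustrative only, not modelled on any historical machine): instructions and data sit in a single memory, so a running program can overwrite its own instructions exactly as it would overwrite a number.

memory = [
    ("LOAD", 8),                  # 0: acc = memory[8]
    ("ADD", 9),                   # 1: acc += memory[9]
    ("STORE", 10),                # 2: memory[10] = acc
    ("POKE", 5, ("PRINT", 10)),   # 3: overwrite instruction 5: code treated as data
    ("JUMP", 5),                  # 4: jump to instruction 5
    ("HALT",),                    # 5: replaced at run time by ("PRINT", 10)
    None, None,                   # 6-7: unused
    20, 22, 0,                    # 8-10: plain numbers in the SAME memory
]

acc, pc = 0, 0
while True:
    inst = memory[pc]
    pc += 1
    if inst[0] == "LOAD":
        acc = memory[inst[1]]
    elif inst[0] == "ADD":
        acc += memory[inst[1]]
    elif inst[0] == "STORE":
        memory[inst[1]] = acc
    elif inst[0] == "POKE":       # writing an instruction, like any other datum
        memory[inst[1]] = inst[2]
    elif inst[0] == "JUMP":
        pc = inst[1]
    elif inst[0] == "PRINT":
        print(memory[inst[1]])    # prints 42
        break
    elif inst[0] == "HALT":
        break

The POKE step is the point: on Babbage's engine, or on a tape-driven calculator, there is simply no operation that can rewrite the program, because the program does not live in the store at all.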



The people who built big electromechanical calculators in the 1930s and 1940s didn't think of anything like this. I would call their machines near-computers, or pre-computers: they lacked the essential idea.



More on near-computers, war and peace

ENIAC

Even when they turned to electronics, builders of calculators still thought of programs as something quite different from numbers, and stored them in quite a different, inflexible, way. So the ENIAC, started in 1943, was a massive electronic calculating machine, but I would not call it a computer in the modern sense, though some people do. Surviving accounts show how it took a square root — incredibly inefficiently.
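The scheme usually described for square roots in that era exploits the identity 1 + 3 + 5 + ... + (2k-1) = k*k, so a root can be extracted by nothing but repeated subtraction. A simplified, illustrative Python sketch (ENIAC's actual decimal procedure was more elaborate):

def repeated_subtraction_isqrt(n):
    """Integer square root via successive odd numbers:
    subtract 1, 3, 5, ... until n is exhausted."""
    odd, root = 1, 0
    while n >= odd:
        n -= odd        # one subtraction per step: hence "incredibly inefficiently"
        odd += 2
        root += 1
    return root

print(repeated_subtraction_isqrt(144))   # 12
print(repeated_subtraction_isqrt(150))   # 12 (floor of the true root)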



Colossus

The Colossus was also started in 1943, at Bletchley Park, heart of the British attack on German ciphers.

I wouldn't call it a computer either, though some people do: it was a machine specifically for breaking the "Fish" ciphers, although by 1945 the programming had become very sophisticated and flexible.



But the Colossus was crucial in showing Alan Turing the speed and reliability of electronics. It was also ahead of American technology, which only had the comparable ENIAC fully working in 1946, by which time its design was obsolete. (And the Colossus played a part in defeating Nazi Germany by reading Hitler's messages, whilst the ENIAC did nothing in the war effort.)



1996 saw the fiftieth anniversary of the ENIAC. The University of Pennsylvania and the Smithsonian made a great deal of it as the "birth of the Information Age". Vice-President Gore and other dignitaries were involved. Good for them. At Bletchley Park Museum, the reconstruction of the Colossus had to come from the individual efforts of its curator, Tony Sale. Americans and Brits do things differently. Some things haven't changed in fifty years.



Zuse's machines

Konrad Zuse, in Germany, quite independently designed mechanical and electromechanical calculators, before and during the war. He didn't use electronics. He still had a program on a paper tape: his machines were still developments of Babbage-like ideas. But he did see the importance of programming and can be credited with a kind of programming language, Plankalkül. It is not difficult to believe that with greater support, he would have gone far ahead with the theory and practice of computing.

Like Turing, Zuse was an isolated innovator. But while Turing was taken by the British government into the heart of the Allied war effort, the German government declined Zuse's offer to help with code-breaking machines.



The parallel between Turing and Zuse is explored by Thomas Goldstrasz and Henrik Pantle.



Their work is influenced by the question: was the computer the offspring of war? They conclude that the war hindered Zuse and in no way helped.



In contrast, there can be no question that Alan Turing's war experience was what made it possible for him to turn his logical ideas into practical electronic machinery. This is a great irony of history which forms the central part of his story. He was the most civilian of people, an anti-war protester in 1933.



He was very different in character from John von Neumann, who relished association with American military power. But von Neumann was on the winning side in the Second World War, whilst Turing was on the side that scraped through, proud but almost bankrupt.



The Internally Stored Modifiable Program

The breakthrough came from two sources in 1945:

Alan Turing, on the basis of his own 1936 concept of the universal machine, and John von Neumann, in his report on the design of the EDVAC.
Chrissy B
2008-02-19 11:50:01 UTC
Inventor of the Computer

Many say the first computer is the "difference engine". The first of these devices was conceived in 1786 by J. H. Müller. It was never built.



Difference engines were forgotten and then rediscovered in 1822 by Charles Babbage. His machine used the decimal number system and was powered by cranking a handle. The British government initially financed the project but later cut off support. Babbage went on to design his much more general Analytical Engine, but later returned and produced an improved design, his "Difference Engine No. 2", between 1847 and 1849.



Others point out that such claims cover only the first ELECTRONIC computer. The earliest computing device known is the Antikythera Mechanism, a mechanical device that computed the positions of the sun, moon and planets among the signs of the zodiac for any given date, past or future. It was discovered in an ancient shipwreck in the Mediterranean Sea and dates to roughly the second century BC. The designer/builder is not known, but because of its similarity to other mechanical devices attributed to Archimedes, it may derive from his work.



Still others will say the abacus was the first computer. Invented in China sometime between 2600 BC and 300 BC, it was used by merchants and clerks there.



Here is still more input:





If you mean the electronic computer, it was a man called Alan Turing from Cambridge, UK, who was drafted in to the secret base at Bletchley Park, where they worked at cracking the WWII Enigma codes that the Germans used every day. The Germans later changed their Enigma machines to a four-rotor design. Because of the secrecy of what went on at Bletchley Park, the computer made there from thousands of valves was kept top secret until recently. The computer, named Colossus, was smashed to pieces at the end of the war. The buildings have now been restored as a tourist centre.



The first computer, or "modern computer", was invented during World War II by a German engineer, Konrad Zuse, who completed the Z3 in 1941. More info: "I can add some authenticity to this answer. My grandfather was a rocket scientist on Wernher von Braun's team during WWII. He was the technician who actually built the computer described above. It was an analog computer designed to simulate the guidance system for the rockets. It was built in secret because the higher-ups had not given their permission for this project."



After doing some research to answer a question for a scholarship I was applying for, I found that Babbage failed to build a complete machine. The most widely accepted reason for this failure is that Victorian mechanical engineering was not sufficiently developed to produce parts with sufficient precision.



It was Konrad Zuse. He invented the Z1, Z2, Z3, Z4 and others. The Z3 was the first fully functional program-controlled electromechanical digital computer in the world, completed in 1941. Charles Babbage just designed a mechanical computing machine.



"Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention.



The first electronic computer was invented by John Vincent Atanasoff. He named it the Atanasoff-Berry Computer, or the ABC.



Now, if we're talking technical knowledge and actual precursors to the PC: IBM may have accidentally spread the design around when it allowed cloning of the PC architecture. But IBM was not the first.



These are all pre-IBM machines: the MITS Altair 8800, the Apple II, the TRS-80, the Atari 800 and the Commodore 64.



Purists who claim that the Altair was not the first will say it was "Simon" by Berkeley Enterprises, 1950, costing $300.



The first completely electronic computer was developed in England in 1943. It was known as Colossus. It took up 1,000 sq ft, weighed 30 tons (60,000 pounds), and consumed 150 kilowatts, enough power to light up a small town.



The first computer was developed by Charles Babbage. His machines were called the Difference Engine and the Analytical Engine. The programmer for this computer was Ada Lovelace, the first programmer.



English mathematician Charles Babbage (1791–1871) designed a mechanical computing machine called the "analytical engine." It is considered the forerunner of the digital computer, a programmable electronic device that stores, retrieves, and processes data.

While attending Cambridge University in 1812, Babbage conceived of the idea of a machine that could calculate data more rapidly than existing computing methods, and without human error. The Industrial Revolution (a period of technological development, c. 1750–c. 1850) had been underway for more than half a century, and the world was becoming increasingly complex. Human errors in mathematical tables posed serious problems for many rapidly growing industries. After graduating from Cambridge, Babbage returned to the idea of developing a device to facilitate computation. Beginning work in 1834, he spent the rest of his life and much of his fortune trying to build his machines.

The world's computer industries now make billions out of manufacturing better and better versions of Turing's universal machine. But Alan Turing himself never made a point of saying he was first with the idea. And his earnings were always modest.


Charles Babbage, 1791-1871

Charles Babbage (Wikipedia)

Virtual Museum of Computing

The completion of the Difference Engine in the Science Museum, London

Home page maintained by his biographer A. Hyman.





anonymous
2016-05-22 04:05:32 UTC
I believe IBM did


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.