History of computers – information technology / unit 1

Let us take a look at the history of the computers we know today. The very first calculating device was the ten fingers of a person's hands. This, in fact, is why we still count in tens and multiples of tens today. Then the abacus was invented, a bead frame in which the beads are moved from left to right. People went on using some form of abacus well into the 16th century, and it is still used in some parts of the world because it can be understood without knowing how to read.

During the 17th and 18th centuries many people tried to find easy ways of calculating. J. Napier, a Scotsman, devised a mechanical way of multiplying and dividing, which is the basis of how the modern slide rule works. Henry Briggs used Napier's ideas to produce the logarithm tables that all mathematicians use today. Calculus, another branch of mathematics, was invented independently by Sir Isaac Newton, an Englishman, and Leibniz, a German mathematician.
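As a quick illustration of the principle behind both Napier's tables and the slide rule: logarithms turn multiplication into addition, so a hard multiplication can be done by looking up two values and adding them. (The numbers below are base-10 logarithms, rounded to three decimal places.)

    log(a × b) = log a + log b
    e.g. log 2 + log 8 ≈ 0.301 + 0.903 = 1.204 ≈ log 16, and indeed 2 × 8 = 16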

The first real calculating machine appeared in 1820 as a result of several people's experiments. This type of machine, which saves a great deal of time and reduces the possibility of making mistakes, depends on a series of ten-toothed gear wheels. In 1830 Charles Babbage, an Englishman, designed a machine that was called "The Analytical Engine". This machine, which Babbage showed at the Paris Exhibition in 1855, was an attempt to cut out the human being altogether, except for providing the machine with the necessary facts about the problem to be solved. He never finished this work, but many of his ideas were the basis for building today's computers.

In 1930, the first analog computer was invented by an American named Vannevar Bush. This device was used in World War II to help aim guns. Mark I, the name given to the first digital computer, was completed in 1944. The men responsible for this invention were Professor Howard Aiken and some people from IBM. This was the first machine that could figure out long lists of mathematical problems, all at a very fast rate. In 1946 two engineers at the University of Pennsylvania built the first digital computer using parts called vacuum tubes. They named their new invention ENIAC. Another important advance came in 1947, when John von Neumann developed the idea of keeping the computer's instructions inside the computer's memory.

The first generation of computers, which used vacuum tubes, came out in 1950. Univac I is an example of these computers, which could perform thousands of calculations per second. In 1960, the second generation of computers was developed, and these could work ten times faster than their predecessors. The reason for this extra speed was the use of transistors instead of vacuum tubes. Second-generation computers were smaller, faster and more dependable than first-generation computers. Third-generation computers appeared on the market in 1965. These computers could do a million calculations a second, 1000 times as many as first-generation computers. Unlike second-generation computers, they are controlled by tiny integrated circuits and are consequently smaller and more dependable. Fourth-generation computers have now arrived, and the integrated circuits being developed have been greatly reduced in size. This is due to microminiaturization, which means that the circuits are much smaller than before; as many as 1000 tiny circuits now fit onto a single chip.

