Birth of computers
History & Birth of Computers
As I promised, here is an article on the history of computers. In my previous article on computers, I mentioned the various computers present in today's modern world.
Surprisingly, things were very different 20 years ago if you wanted to use a computer. Computers these days are far more advanced, and operating systems are much better at what they do best: multi-tasking.
Well, computers were actually around long before the 20th century. The story began in the 17th century and progressed through a number of generations up to the fifth generation, where we are today.
There are six generations altogether, and I will outline them below:
- Generation 0 – Mechanical Calculating Machines
- Generation 1 – Vacuum Tube Computers
- Generation 2 – Transistors
- Generation 3 – Integrated Circuits
- Generation 4 – Very Large Scale Integrated Circuits (VLSI)
- Generation 5 – The Future
Generation 0 (zero)
Generation zero spanned roughly 1600 to 1936 and consisted entirely of mechanical machines that could perform various calculating functions.
This generation began in the early 17th century, when John Napier mechanized multiplication through his work on logarithms.
The calculating clock, invented around 1623 by Wilhelm Schickard, marked the beginning of mechanical calculators.
In 1642, Blaise Pascal invented the 'Pascaline', an adding machine.
The major breakthrough came from Gottfried Wilhelm von Leibniz, who invented the 'Stepped Reckoner', a more sophisticated machine that could add, subtract, multiply, divide and extract square roots. Leibniz was also the man who came up with differential calculus, and he was the first to study binary arithmetic at any length.
Then in 1804, Joseph-Marie Jacquard invented the Jacquard loom, which used punched cards to control the weaving pattern.
Now, let me remind you that in the '70s, and well into the '80s, punched cards were still very common in computing, especially for programming.
Charles Babbage is probably the best-known inventor of the 1800s, developing a number of interesting machines:
- The Difference Engine
- The Analytical Engine
The Difference Engine, designed in 1822, was used to calculate tables of polynomial values using the method of 'finite differences'.
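The finite-differences idea is easy to sketch in modern code. Below is a minimal Python illustration (the polynomial and the function name are my own choices, not anything from Babbage): once the first few values and their differences are known, every further table entry needs only additions, which is exactly what made the engine mechanically feasible.

```python
# For a polynomial of degree n, the n-th differences of successive values
# are constant, so a whole table can be extended by additions alone.
# Example polynomial (hypothetical): p(x) = x^2 + x + 1.

def difference_table(values, depth):
    """Return the first entry of each level of differences."""
    rows = [values]
    for _ in range(depth):
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return [row[0] for row in rows]

p = lambda x: x * x + x + 1
seed = [p(x) for x in range(3)]            # [1, 3, 7]
first, d1, d2 = difference_table(seed, 2)  # 1, 2, 2 (second difference is constant)

# Extend the table using additions only, just as the engine would:
table = [first]
for _ in range(5):
    table.append(table[-1] + d1)
    d1 += d2

print(table)  # [1, 3, 7, 13, 21, 31] == [p(x) for x in range(6)]
```

Note that no multiplication ever happens once the seed values are in place; that is the whole trick.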
The Analytical Engine, the more famous of the two machines, was designed in 1833. This engine contained, in essence, the core of today's modern computers.
This engine had three main components:
- The mill – an arithmetic processing unit
- The store – a type of memory (like today's cache memory or main memory)
- Input & output (I/O) devices – input was via punched cards
I may not have mentioned this before, but the primary functions of a computer are to fetch, decode and execute instructions.
In this respect, the Analytical Engine already resembled a modern-day computer.
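The fetch-decode-execute cycle mentioned above can be sketched as a tiny loop. The instruction set, opcodes and register names below are invented purely for illustration and do not correspond to any real machine:

```python
# A toy processor: a program counter fetches each instruction,
# a chain of branches decodes it, and the branch body executes it.

memory = [
    ("LOAD", 7),   # put 7 in the accumulator
    ("ADD", 5),    # add 5 to it
    ("HALT", 0),
]

acc = 0   # accumulator register
pc = 0    # program counter

while True:
    op, arg = memory[pc]   # fetch the instruction the PC points at
    pc += 1
    if op == "LOAD":       # decode, then execute
        acc = arg
    elif op == "ADD":
        acc += arg
    elif op == "HALT":
        break

print(acc)  # 12
```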
To all the women out there, it is a great honour to let you know that the first programmer was a lady, and her name was Ada Lovelace. So, a round of applause to all women for a great achievement in history, back in the 1800s.
In the 1890s, Herman Hollerith invented the tabulating machine, which used punched cards as its main mechanism to analyse United States census data.
An interesting milestone here is that Hollerith founded the Tabulating Machine Company in 1896; this company later merged with others, and the combined firm was renamed IBM in 1924.
Generation 1 (One)
The first generation of computers was based on vacuum tubes.
In 1936, Alan Turing described the Turing machine, which worked on the principle of moving from one state to another according to a set of rules, depending on the symbol it read from a tape. The machine could also write a symbol onto the tape or erase one from it.
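That idea can be sketched in a few lines of Python. The states, symbols and rules below are a made-up example (a machine that flips every bit on its tape), not anything from Turing's paper:

```python
# Rules map (state, symbol) -> (symbol to write, head movement, next state).
rules = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", "_"): ("_",  0, "halt"),   # blank cell: stop
}

tape = list("1011") + ["_"]
head, state = 0, "flip"

while state != "halt":
    symbol = tape[head]
    write, move, state = rules[(state, symbol)]  # look up the matching rule
    tape[head] = write                           # write (or erase) a symbol
    head += move                                 # move along the tape

print("".join(tape))  # 0100_
```

Everything a real computer does can, in principle, be reduced to a rule table like this, which is why the model was so influential.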
From 1936 to 1941, Konrad Zuse developed the Z1-Z3 machines, the first general-purpose program-controlled computers. They were built with mechanical relays for control and memory, and their programs were punched into old movie film.
1937 saw the start of work on the ABC (Atanasoff-Berry Computer), the first completely electronic computer, built to solve systems of linear equations. It was constructed using vacuum tubes.
The Harvard Mark I, built in 1943, was the first electro-mechanical digital computer in the United States. It consisted of electromagnetic relays – magnets that open and close, acting as switches. Programming was done on punched paper tape, and the machine comprised about one million parts, measuring 50 ft in length and 8 ft in height.
This machine was capable of three additions of 23-decimal-digit numbers per second and could store 72 such 23-digit numbers.
The ENIAC was built in 1946 and the EDVAC in 1951; both were implemented by Eckert, Mauchly and von Neumann.
The ENIAC had 18,000 vacuum tubes, took up 1,800 square feet, could do 5,000 additions per second, and had a memory capacity of 1,000 bits, which is 125 bytes.
The ENIAC was fully electronic and programmable; it was programmed by plugging wires into panels, which meant rewiring many connections to get a job done.
The EDVAC (Electronic Discrete Variable Automatic Computer) was also fully programmable.
The key difference between the two is that the EDVAC worked with binary (base 2) representations, whereas the ENIAC used decimal (base 10).
The EDVAC was also equipped with a CRT display and stored its programs in memory.
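To make the base-2 versus base-10 distinction concrete, here is a quick Python sketch showing the same quantity in both representations:

```python
# The same number, written in the two bases the ENIAC and EDVAC used.
n = 13
print(bin(n))   # 0b1101 - the base-2 form

# Reading 1101 in base 2: 1*8 + 1*4 + 0*2 + 1*1 = 13 in base 10.
assert int("1101", 2) == 13
```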
Generation 2 (Two)
The start of Generation 2 was marked by the invention of the transistor by William Shockley, John Bardeen and Walter Brattain.
The early DEC computers were equipped with transistors instead of vacuum tubes. A transistor acts as a switch that is either on or off, where on represents '1' and off represents '0'.
During the second generation, programming moved from machine language to assembly language. This era also saw the development of high-level languages such as FORTRAN (1954) and COBOL (1959).
Generation 3 (Three)
The third generation began with the development of ICs (integrated circuits), electronic circuits built on a silicon-based chip.
The IBM 360 was one of the first computers to run on ICs and one of the first multi-programmable machines.
This was followed by the DEC PDP 8 in 1965 which sparked the minicomputer revolution.
In 1976, however, the Cray-1 was launched, bearing the heavyweight title of 'supercomputer'.
Generation 4 (Four)
The fourth generation marked more advances in the computer industry with the development of Very Large Scale Integrated Circuits (VLSI).
These circuits packed more than 10,000 components onto a single chip and led to the development of the first microprocessor, the Intel 4004, in 1971.
From then on, companies such as MITS (with the Altair), IBM and Apple (with the Macintosh) started developing their own ranges of computers. That has not stopped since, though of course some of those manufacturers are now gone, and processor making is dominated by much bigger companies such as Intel and AMD. IBM and Apple continue to sell their own brands of computers, though not necessarily running on their own processors.
The fifth generation is the future; although we are arguably already in it, there is still a long way to go. The fifth generation consists of:
- Artificial intelligence
- Quantum computing
- Natural language processing
Most of the areas above have been under research for a long time. However, they are still in their early stages, and it will take a great many years to accomplish them fully.
It has been a long journey from the 17th century to the 21st: 400 years of computation. No wonder computers are so important in our lives today.
This is where my article ends for today, and I look forward to posting more articles over the next few days! I hope you all enjoyed it.
Have a nice day then! :-)
If you are really interested in learning more about computers, maybe you should read the following books: