- Computers & Software
What Was the Earliest Computer? A Short History of Computational Devices
When Was The First Computer Invented?
Can you imagine life without computers? I'm writing this article with one right now! We're surrounded by them every day, and most automated tasks are guided by computing systems of one kind or another. From your home's heating system to your car's engine management to your cell phone, computers are a fact of life. So what was the first computer, and what did it look like?
The first computer ever made is a tricky thing to pin down, mostly because there are varying definitions of what a computer actually is, and of when the first real breakthrough was made. Because of that, I want to offer a few takes on this question.
This article walks through several computing milestones and several different interpretations of the question. If I lose you at any point, please feel free to post a comment and I'll clarify. What was the first computer, and when was it invented? Keep reading and find out for yourself!
Definition: What IS a Computer?
If we're going to look at the first computer in the world, we should probably start by identifying what a computer is, and how it can be classified as such.
If you were to go back in time and use the word 'computer' in a conversation, you wouldn't get strange looks, believe it or not. That's because a 'computer' was simply someone who performed mathematical calculations. Anyone who worked out complex figures by hand was a 'computer' by that definition.
However, the modern definition (and probably the one you're used to) is that a computer is a device that performs those computations for us. You enter a set of data and a problem, and the computer outputs a solution. That's the definition I'll be going with!
Difference Engine, by Charles Babbage
The Difference Engine can be considered one of the earliest computers, despite the fact that it's a mechanical computer and not run by electricity.
Before the use of machines, computations had to be done by hand, and errors were fairly common (people make mistakes after all). Charles Babbage wanted to create a machine which would automate the process, speed things up and reduce the number of errors.
It was a mechanical computing device that was run by a hand crank rather than by electricity (which wasn't an option at the time).
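What the Difference Engine mechanized was the "method of finite differences," which tabulates polynomial values using nothing but repeated addition, exactly the kind of operation gears and a hand crank can perform. Here's a minimal sketch of the idea in Python (the function name and interface are my own, just for illustration):

```python
def tabulate(initial_diffs, n):
    """Generate n values of a polynomial from its starting finite
    differences, using only addition -- the operation the Difference
    Engine performed mechanically.

    initial_diffs = [f(0), first difference, second difference, ...]
    """
    diffs = list(initial_diffs)
    values = []
    for _ in range(n):
        values.append(diffs[0])
        # Each difference column absorbs the column below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x squared: f(0) = 0, first difference = 1, second difference = 2
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

Notice that no multiplication ever happens: once the starting differences are set (like dialing in the engine's wheels), every new value falls out of simple sums.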
Although the project was funded by the British government, Babbage was unable to complete the Difference Engine as planned; he burned through £17,000 before funding was withdrawn. Undeterred, he went on to design the more complex Analytical Engine, intended to tackle general-purpose arithmetic and to be programmed with punch cards.
Tragically, Babbage never completed any of his computer designs, but they were vindicated over a century later, when the Science Museum in London built a working Difference Engine from his plans in 1991.
The Analytical Engine's design supported conditional branching and looping, which would have made it a general-purpose machine with powerful applications.
The British Association for the Advancement of Science deemed it impractical to build, and so Babbage never saw a working Difference Engine or Analytical Engine in his lifetime.
The implications are huge. Because the work wasn't pursued, the world was arguably deprived of powerful computers for around 100 years. Can you imagine where we'd be today if it had been completed?
What Was the First Digital Computer? ENIAC vs ABC
Mechanical, crank-driven computers are fine and dandy, but the world was really turned on its head with the advent of the electronic digital computer. There has been some debate as to which was the first digital computer, mostly because two World War II-era machines vie for the title. So what was the first computer with fully digital functionality?
ENIAC (standing for Electronic Numerical Integrator And Computer... no wonder they use the short form!) was a computer developed during the latter half of WWII, and it was created for the purpose of (you guessed it) improved warfare and tactics.
It was originally designed to aid in the calculation of artillery firing tables (obviously a pressing concern at the time), and it was later used in early hydrogen bomb calculations. Funded by the US Army, it was one of the very first fully electronic digital computers (there had been electro-mechanical computers before it, such as Konrad Zuse's relay-based Z3, but those are considered hybrids rather than purely electronic designs).
The ENIAC was absolutely massive, containing over 17,000 vacuum tubes and 10,000 capacitors. It occupied about 1,800 square feet, roughly the size of a large house, and drew around 150 kilowatts of power.
It received its input via an IBM punch card system, and at the time it was around a thousand times faster than the electro-mechanical computing devices it replaced. ENIAC also stood out for being fully programmable, capable of loops and conditional branching.
An interesting fact about ENIAC is that its original programmers were six women.
The Atanasoff-Berry Computer, or ABC as it's more commonly known these days, is considered by many (including a 1973 US federal court ruling) to be the first fully electronic digital computer, beating out the ENIAC. It was a fairly low-profile machine, and it wasn't a contender for the title until it resurfaced during the patent disputes of the 1960s.
Compared to the ENIAC, the ABC was a much smaller and simpler device. It wasn't programmable, and it could only solve systems of linear equations, whereas the ENIAC was a general-purpose machine.
That being said, it was fully digital and the first machine to avoid a hybrid of mechanical and electronic computation. It used about 300 vacuum tubes, and it required an operator's assistance to set up and step through each problem; it couldn't run through one unattended.
It was developed to aid in physics research by solving simultaneous algebraic equations. It could handle systems of up to 29 linear equations at once, which made it valuable at the time. However, it was unreliable and prone to error, so it was eventually abandoned.
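The ABC's core job, solving a system of simultaneous linear equations, is a textbook application of Gaussian elimination. As a modern point of comparison (this is my own illustrative sketch, not the ABC's actual procedure, which worked drum by drum on pairs of equations), here's the whole task in a few lines of Python:

```python
def solve(a, b):
    """Solve the linear system A x = b by Gaussian elimination with
    partial pivoting -- the class of problem the ABC was built for."""
    n = len(b)
    # Build the augmented matrix [A | b].
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Pivot on the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate this column from all rows below.
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    # Back-substitute to recover the solution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# 2x + y = 5 and x - y = 1  ->  x = 2, y = 1
print(solve([[2, 1], [1, -1]], [5, 1]))  # [2.0, 1.0]
```

A 29-equation system like the ones the ABC targeted solves in microseconds today; in 1942, automating even one elimination step electronically was groundbreaking.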
What Was The First Personal Computer?
In the 1970s and 1980s, computers gradually became available for personal use, the beginning of what is often referred to as the "microcomputer revolution". Prior to that time, computers were functional devices used in research or by large corporations. They were never 'personal' or 'single user' devices.
So what did the first personal computer look like? There are a few I want to cover here.
Olivetti Programma 101
Because people were leery of computers at the time, the Programma 101 was actually marketed as a 'portable calculator', but it had the features associated with a computer, and many people consider it to be among the first personal computers in the world.
The Programma was capable of standard arithmetic, along with square root functions and fractions. It included a printer and paper roller that could print functions and results.
It was fully programmable, and it was innovative because you could actually record programs onto magnetic plastic cards. You could plug a card back in and run its program again whenever you needed to, a data-recording approach that was widely imitated later on.
The price tag? A mere $3,200!
IBM 5100 Portable Microcomputer
The definition of 'portable' in the computer world has changed, but considering that many computers took up whole rooms at the time, the IBM 5100 was an extremely small unit when it came out in 1975.
It was one of the first computers to make use of a built-in CRT display (or computer screen), and it featured a 16-bit processor, up to 64 KB of random access memory (RAM), and a weight of a sprightly 55 pounds!
It came with a keyboard and cost just shy of $9,000 to start.
It was programmable in either the APL or BASIC languages, switchable using a physical toggle on the front panel.
Microprocessors and Beyond
The invention of the microprocessor in the 1970s really set off the personal computer revolution. What formerly required a room-sized cabinet could now be compressed onto a single chip, and computers have gotten smaller and more powerful year after year, roughly following Moore's Law.
The First Computers in the World: Big, Clunky and Amazing
It's easy to look at these early computing devices and chuckle, but you have to understand that these were incredible machines when they came out, and their creators and programmers were pioneers who brought about the world we live in today.
The next time you tap away on your cell phone, take a look at the personal computer in your hand (thousands of times more powerful than the house-sized ENIAC), and think about how far we've come!
Thanks for reading!