A Short History of Electronic Computers
The history of computing stretches back to ancient times
The electronic, digital, programmable computer has only been around since the 1940s, yet it’s certainly changed our world in a profound way. These days, computers seem to be in just about everything. But when did people begin making them, and where will this technology lead humankind?
Please keep reading and find out!
The Sumerian abacus, a calculating tool used for arithmetic, was invented around 2400 B.C.E. and remained in use into the 1940s. The Antikythera mechanism, built in Greece around 100 B.C.E. and possibly based on designs by Archimedes, was used to calculate astronomical positions and may be the world’s first analog computer. And slide rules, invented in the 1620s, were still being used by astronauts during the Apollo Moon missions.
Since a computer can be anything that computes, in the early decades of the 1900s computers were often people. Sitting at identical desks in huge rooms, scores of them worked with mechanical adding machines and plenty of pencil and paper, computing everything from ballistics tables to aviation specifications during World War Two.
The electronic era of computing began in the mid-1940s in the United States and the United Kingdom. Colossus, built in the U.K. and operational by 1944, was used to break German secret codes during World War Two; it was the world’s first electronic, digital, programmable computer. The Mark II version made its calculations with some 2,400 vacuum tubes.
As often happens in computing, Colossus was soon eclipsed: the ENIAC, completed in 1946, used about 18,000 vacuum tubes and was as big as some houses. It is considered the world’s first general-purpose electronic computer. Astonishingly, this marvel stayed in service for nearly a decade!
Notable innovations included the Ferranti Mark 1 (1951), the first commercially available general-purpose electronic computer. UNIVAC I, delivered in the U.S. the same year, tabulated census data and famously predicted the winner of the 1952 presidential election. And EDSAC (1949), a British computer, was among the first to run its own stored programs, using the so-called von Neumann architecture that computers still follow today.
Other innovations that changed computing included the invention of the transistor (1947), the integrated circuit (1958), the floppy disk (1971), the first commercial microprocessor, Intel’s 4004 (1971), and the Apple I personal computer in 1976 (now a collector’s item, by the way).
Enter the Commodore 64
In the early 1980s, the age of personal computers (PCs) gained momentum. People wanted home computers and were willing to pay hundreds of dollars to buy them. Apple PCs were available, but they cost more than $1,000 apiece. Offering a cheaper alternative, a new PC was marketed in early 1982. Priced at just under $600, it was called the Commodore 64 because it had 64 kilobytes of random access memory, or RAM. Believe it or not, that was a big deal in those days!
The Commodore 64 had an 8-bit microprocessor and an operating speed of just over one megahertz. It also had an impressive sound and graphics package and eventually offered as many as 10,000 software titles. Commodore International sold an estimated 17 million C64s, more than any single PC model ever produced! (The model was discontinued in 1994.)
Keeping in mind that a computer is no more advanced than its microprocessor or central processing unit (CPU), let’s continue:
First produced in 1989, Intel’s 486 was the first microprocessor to use more than one million transistors; it ran at up to 50 megahertz, had an on-chip SRAM cache and could execute tens of millions of instructions per second. This was a monster of a microprocessor for its time! At any rate, it was a vast improvement over its predecessor, Intel’s 386.
Intel Pentium Microprocessors
Since 1993, when Intel introduced the original Pentium, the company has continued producing inexpensive yet powerful microprocessors. The first Pentium used over three million transistors (more transistors generally means more processing power) and initially ran at 60 to 66 megahertz. Pentium-class chips and their successors have been used in all manner of computing devices - desktops, laptops, servers and beyond. Conceivably, billions of people have used these chips!
In 1984 Apple Inc. introduced the Macintosh, the first commercially successful PC with a graphical user interface (GUI) rather than a command-line interface, using images instead of typed commands, essentially. This computer also popularized the mouse, a pointing device that’s revolutionized “picking and choosing” in the cyber world. The Macintosh soon became the industry standard for desktop publishing.
Unfortunately, the first Macintosh had little memory, no hard drive and could not be easily expanded. So it was modified over the next two years, producing the Macintosh Plus (1986), which operated at eight megahertz and cost about $2,600.
In the early 1990s, PCs using the Microsoft Windows 3.0 operating system (released in 1990) began to dominate the computer market. Windows offered features that many of us now take for granted: word processing, the text editor Notepad, a macro recorder, a paint program, a calculator, various games and many other programs.
In 1983, Microsoft Word was introduced, dramatically expanding the use of word processing at home and in business. Then, in 1985, came Microsoft Excel, a versatile commercial spreadsheet application that eventually displaced Lotus 1-2-3, the industry leader until the mid-1990s.
Word, Excel and many other Macintosh and Windows applications have made it possible for the average person to work at home using the same software used at the office. This capability has revolutionized education and productivity in the workplace and at home!
Cray Research, Inc. was established in 1972 to produce computers used primarily for scientific study. The company’s first machine, the Cray-1 supercomputer (1976), was the fastest computer in the world at the time and sold for over $8 million.
Since Cray computers are very expensive, only elite companies or the governments of rich countries can afford to buy them; therefore, it is a mark of prestige to own one of these marvelous machines. Cray supercomputers produced in the present day have a quarter million processing cores and can perform quadrillions of computations per second!
A multi-core processor is a single processing unit containing two or more processors, or “cores,” used in computations. This configuration allows the cores to run different instructions at the same time, a kind of multitasking, thereby making the computer run faster and enabling parallel processing. One problem with parallel processing, however, is that it’s more difficult to write efficient software for such complex processing.
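The idea of splitting work across cores can be sketched in a few lines of modern Python (an illustration of the general technique, not software from any machine described here):

```python
# A minimal sketch of parallel processing on a multi-core CPU,
# using Python's standard multiprocessing module.
from multiprocessing import Pool

def square(n):
    # Each worker process runs this function on its own core,
    # independently of the others.
    return n * n

if __name__ == "__main__":
    # Four worker processes (an assumed core count for this example).
    with Pool(processes=4) as pool:
        # The list is split up, squared in parallel, then reassembled in order.
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

Notice that the programmer must decide how to divide the work and gather the results; that bookkeeping is exactly why efficient parallel software is harder to write.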
Some computers have more than one CPU, each of which handles a particular task. This arrangement, called multiprocessing, performs numerous tasks at the same time rather than sequentially. Multiprocessing reduces the cost and time it takes to do a job; it also increases reliability, because if one CPU goes down, the others can keep working.
If computers with multiple cores or multiple processors aren’t enough to do the job, then many computers may be linked into a grid or cluster, perhaps a worldwide one, creating a kind of virtual supercomputer designed to study such complex issues as climate change, financial modeling and earthquake or tsunami simulations. The possibilities for such computer processing are mind-boggling indeed. Who knows what they may one day accomplish!
Computers in the Modern Era
New computers, and devices that use them to communicate, play games, watch movies or display information (perhaps while shopping for Christmas gifts), seem to arrive on the market every six months or so. Some notable arrivals include:
2003: AMD’s Athlon 64, the first 64-bit processor aimed at consumer PCs, is introduced to the computer market.
2007: The iPhone, one of many smartphones, includes many computer functions previously available only on desktop units.
2010: Apple introduces the iPad, advancing the market for computer tablets.
2015: Apple unveils the Apple Watch, which makes computing power available on one’s wrist.
Future of Computers
Technically, computers can be made of anything. Instead of electronic circuits, tennis balls could be used. The presence of a ball is considered a one and its absence a zero. Therefore, many other types of computers are theoretically possible, including optical computers, DNA computers, neural computers and quantum computers.
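The tennis-ball thought experiment is just binary encoding: any two-state medium can store numbers. A short Python sketch (with hypothetical helper names) makes the idea concrete:

```python
# Encode a number as a row of "tennis balls":
# 1 = ball present, 0 = ball absent.
def to_balls(n, width=8):
    # Read off the bits of n from most to least significant.
    return [(n >> i) & 1 for i in reversed(range(width))]

def from_balls(balls):
    # Rebuild the number by shifting in one bit at a time.
    value = 0
    for bit in balls:
        value = (value << 1) | bit
    return value

row = to_balls(42)            # [0, 0, 1, 0, 1, 0, 1, 0]
assert from_balls(row) == 42  # the row of balls really does store 42
```

The same encoding works whether the two states are tennis balls, voltages, magnetic domains or pulses of light, which is why so many kinds of computers are theoretically possible.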
In quantum computers, quantum bits or “qubits” would replace binary digits, exploiting quantum properties such as spin, superposition and entanglement to represent data and make calculations. Of course, since atoms are very small, a quantum computer could be equally minute, revolutionizing miniaturization, and it could also provide invaluable insights into the growing field of nanotechnology.
As humankind continues producing more and more advanced computers, it may soon create computers that can think for themselves, utilizing what’s called artificial intelligence. Then these smart computers may one day create their own computers and perhaps their own computerized people as well. Who wouldn’t want to see that?
© 2012 Kelley