June 4, 2012

A Brief History of Computer Hardware

We have come a very long way in our ability to count things and make computations. Counting on fingers and toes was pretty reliable up to twenty, assuming you possessed all your original digits, but times and events required that we count higher than that.

The abacus, introduced around 300 BC, sped up counting and got us thinking about numbers of a significantly higher order. Every generation since has needed, and wanted, to arrive at answers to its questions and problems faster.

Perhaps at one point time was not money, but it is today, and getting correct information as rapidly as possible is more important than ever. Charles Babbage's Analytical Engine, though a bit unwieldy, with its fifty thousand moving parts and a frame the size of a steam locomotive, was an enormous step toward what has evolved into our commonly held idea of a computer.

While Babbage came up with his idea around 1833, it was about one hundred years later that we entered the first generation of what has developed into our modern computer hardware. In 1951, the UNIVAC became the first commercial computer delivered to a business client. It used vacuum tubes for circuitry and magnetic drums for memory; input relied on punched cards, and it could solve only one problem at a time.

In the next generation, transistors replaced vacuum tubes, making computers smaller, cheaper, faster, and more dependable. This era also saw the first programming languages being developed.

The following generation introduced semiconductors: miniaturized transistors placed on silicon chips. These integrated circuits dramatically increased efficiency and speed, and they made computers available to a mass audience.

The microprocessor was the hallmark of the following generation of computers. Now a single silicon chip could contain thousands of integrated circuits. This led to the formation of computer networks and, ultimately, to the creation of the Internet.

As for what lies beyond, nanotechnology, artificial intelligence, and computers that learn and think for themselves may not be far off.
