A Brief History of Computers – They’re Older Than You Think


For those who are not familiar with the history of computers, you might get the impression that they are very new and only came out recently. But really, they have been around for a long time. The earliest modern computers appeared during the 1940s.

1940s – 1950s

The earliest computers were huge and complex, filling up entire rooms, and they relied on vacuum tubes. One of the earliest computers made was the ENIAC. While it was digital, it was primitive by today’s standards, and far too big: the ENIAC weighed 27 tons and took up 167 sq m of floor space. The machine also used thousands of components, including capacitors, resistors and vacuum tubes.

1955 – 1960

This period is known as the second generation of computers and is notable for the adoption of the transistor. Unlike vacuum tubes, transistors were smaller, consumed less power, generated less heat and were more reliable. The first transistor computer was built at Manchester University in 1953, and it was followed by machines such as the IBM 1401. In 1956 IBM invented the disk drive.

1960s

The 1960s saw the third generation of computers emerge, marked by the creation of the microchip (the integrated circuit). It is the microchip that made it possible for computers to be built the way they are now. Manufacturers were now able to make circuits from silicon components. Unlike the earlier generations, integrated circuits made it possible for companies to build microchips that were more powerful, smaller and less expensive.

It also became possible for manufacturers to pack more transistors into a single chip, boosting its power and capabilities without making it physically larger. It was in the 1960s that microchips began finding their way into electronic devices. The earliest models to use these components were minicomputers.

The earliest units were hybrid, using both microchips and transistors. The most prominent example of these hybrid machines was IBM’s System/360. These computers became known as mainframes. The minicomputer, on the other hand, was the bridge linking microcomputers and mainframes.

1970s to the Present

By the 1970s, CPUs made up of numerous microchips had emerged, but further integration eventually led to the development of the single chip CPU. With the single chip CPU, all the parts were merged into one microchip, which would become known as the microprocessor.

The first single chip microprocessor was the Intel 4004. This would lead to the creation of numerous microcomputers like the Altair 8800. But these bore little resemblance to modern computers, since they were sold as kits and had to be assembled by the buyer.

The GUI

The next revolution was the graphical user interface (GUI) and the mouse, which were demonstrated by Doug Engelbart of the Stanford Research Institute in 1968. But computer companies did not pick them up initially, and the idea of the GUI was instead used by Xerox for its Xerox Alto computer in 1973.

And that’s where it would have stayed had Steve Jobs not visited Xerox and seen the GUI and its potential. Inspired, he created a GUI for Apple computers, and the Macintosh was born.

While it was not successful initially, other companies saw its potential. Microsoft would develop its own GUI for the PC, called Windows. With the release of Windows 3.1, the history of computers would be changed forever.

Author Bio: Sam writes for http://www.historyofcomputer.org/ and loves to share his knowledge of computer history.

 
