The modern computer is not the invention of a single individual; it is the result of countless inventions, ideas and developments contributed by many people over the last several decades. The history of automatic data processing begins with Charles Babbage’s attempt to build a mechanical calculator at Cambridge, England, in 1830. By the 1930s, punched cards were in wide use in many businesses. In 1937, Howard Aiken of Harvard proposed to IBM that a machine could be constructed which would automatically sequence operations and perform calculations. That proposal marked the beginning of the development of automatic computers.
First Generation Computers: (1946 to 1959)
- UNIVAC (Universal Automatic Computer) was the first general-purpose electronic computer. These computers employed vacuum tubes; they were large in size and required air conditioning, as the tubes generated a great deal of heat.
- Punched cards served as the input and output medium. The input and output devices were very slow compared with the speed of the CPU.
- They were very expensive.
- The medium of internal storage was the magnetic drum.
- The language used was machine language.
- Processor speed was measured in milliseconds. The IBM 650, the most popular first-generation computer, was introduced in 1954 with magnetic drum memory and punched cards for input and output. It was intended (=designed) for both business and scientific applications.
Second Generation Computers: (1959 to 1965)
- These computers employed transistors and other solid-state devices.
- Transistor circuits were smaller than vacuum tubes and generated less heat; hence these computers required less power and were faster and more reliable. Second-generation computers came in two separate categories: one for business applications and one for scientific applications.
- The IBM 1401 was the most popular second-generation computer.
- They employed magnetic tape as the input/output medium.
- The main medium of internal storage was magnetic core memory.
- The language used was assembly language.
- Processor speed came to be measured in microseconds.
Third Generation Computers: (1965 to 1970)
- They employed integrated circuits, in which all the elements of an electronic circuit are integrated (=included) on a tiny silicon chip.
- They were much cheaper and more reliable than second-generation computers.
- Their speed was high, and they could support a variety of peripherals.
- They were based on the principles of standardization and compatibility.
- The secondary storage of a particular computer could be easily expanded.
- They could be used for both scientific and business applications.
- They permitted multiprogramming, time-sharing, virtual memory and remote terminals.
- They also supported high-level languages such as FORTRAN and COBOL.
- Mainframes and minicomputers were also developments of the third generation.
- Limited communication facilities were also available.
- Processor speed came to be measured in nanoseconds.
Fourth Generation Computers: (From 1970)
- They appeared in the 1970s, utilizing still newer electronic technology that made computers smaller and faster than those of the third generation.
- Many new types of terminals were also developed at this time.
- One of the major inventions was large-scale integration (LSI): a small “chip” consisting of thousands of electronic components that function as a complete system. With this technology, the entire CPU could be built onto a single chip less than 1/3 inch square, which helped to reduce cost and increase speed.
- The speed of microprocessors and the size of main memory and hard disks increased tremendously. Many features of mainframe CPUs were introduced in these computers. By 1995, the most popular CPU was the Pentium.
- These computers are used in various areas such as visualization, parallel computing, virtual reality and multimedia.
- Object-oriented languages such as C++ and Visual Basic (VB) were introduced; a brief C++ sketch follows this list.
- Microcomputers and supercomputers were also introduced during this period.
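To make the phrase “object-oriented” concrete, here is a minimal C++ sketch, not drawn from any particular system of the era: it shows the core idea of bundling state (data) and behaviour (functions) together in a class. The Counter class and its member names are purely illustrative.

```cpp
// A minimal sketch of the object-oriented style that C++ popularized:
// data and the operations on that data are bundled into a class.
#include <iostream>

class Counter {
public:
    void increment() { ++count_; }          // behaviour (member function)
    int value() const { return count_; }    // read-only access to state
private:
    int count_ = 0;                         // state (data member)
};

int main() {
    Counter c;                        // an object: one instance of the class
    c.increment();
    c.increment();
    std::cout << c.value() << '\n';   // prints 2
    return 0;
}
```

The same idea, objects that own their data and expose operations on it, underlies the larger class libraries that languages like C++ and VB made popular.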
Fifth Generation Computers:
- It is difficult to define the fifth generation of computers precisely because it is still under development.
- Introducing “Artificial Intelligence” to computers is the major development in this generation.
- Artificial Intelligence is software that tries to imitate (=copy) human capabilities such as reasoning, communicating, seeing and hearing. This software can use its accumulated knowledge for decision making; in some cases, these systems can learn from past experience and modify their subsequent actions (a minimal illustration follows at the end of this section).
- This artificial intelligence is being applied in several areas, such as:
  - Natural language processing,
  - Imitating the human voice,
  - Voice recognition,
  - Visual recognition,
  - Translating from one language to another,
  - Robotics,
  - Neural networks, and
  - Expert systems.
- At first, all these applications seemed to be (=looked to be) very simple; when programmers started working on them, they realized the difficulty.
- The full development of such systems will take several more years.
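As a purely illustrative sketch of the idea that a system can use its accumulated knowledge for decision making and modify its later actions, the following C++ fragment records observed outcomes and predicts the most frequent one. The ExperienceModel class and every name in it are hypothetical, and real AI systems are of course far more sophisticated than this.

```cpp
// A toy "learning" system: it accumulates knowledge (counts of observed
// outcomes) and uses that knowledge to decide on its next prediction.
#include <iostream>
#include <map>
#include <string>

class ExperienceModel {
public:
    // Record one observed outcome (the "past experience").
    void observe(const std::string& outcome) { counts_[outcome]++; }

    // Decide based on accumulated knowledge: pick the most frequent outcome.
    std::string predict() const {
        std::string best = "unknown";
        int bestCount = 0;
        for (const auto& [outcome, count] : counts_) {
            if (count > bestCount) { best = outcome; bestCount = count; }
        }
        return best;
    }

private:
    std::map<std::string, int> counts_;  // the accumulated knowledge
};

int main() {
    ExperienceModel model;
    model.observe("rain");
    model.observe("sun");
    model.observe("rain");
    std::cout << model.predict() << '\n';  // prints "rain"
    return 0;
}
```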