Definition, Generation of Computers and Their Used Technologies


Definition of Computer

Computer: The word “computer” is derived from the Latin word “computare”, which means “to calculate”.

According to the Oxford English Dictionary, the word “computer” was first used to describe a mechanical calculating device in 1897.

A computer is an electronic, programmable device that stores and processes data to solve everyday problems and support decision-making.

Generation of Computer

Historical Evolution of Computers

Father of the modern computer – Charles Babbage (British mathematician)
Invented mechanical computers: the Difference Engine and the Analytical Engine.

Generation of Computer: The gradual development of computers over time is divided into distinct periods. Each period is considered a generation of computers.

The parameters used to measure the improvement of newer computers over older ones are:

Space: Space must be minimized.
Speed: Speed must be maximized.
Capacity: Capacity must be maximized.
Cost: Cost must be minimized.

Different Generation of Computers and Their Used Technologies

Generation | Period | Technology
First Generation Computers | 1942–1955 | Vacuum Tubes
Second Generation Computers | 1956–1965 | Discrete Transistors
Third Generation Computers | 1966–1975 | Integrated Circuits (ICs) – SSI
Fourth Generation Computers | 1976–1985 | MSI, LSI
Fifth Generation Computers | 1986–onwards | VLSI, VVLSI (ULSI)
IC Chip (approx. 1″ × 0.5″)

Abbreviation | Full meaning | No. of Discrete Transistors (approx.)
SSI | Small Scale Integration | 1,000
MSI | Medium Scale Integration | 10,000
LSI | Large Scale Integration | 100,000
VLSI | Very Large Scale Integration | 1,000,000
VVLSI | Very Very Large Scale Integration | 10,000,000
ULSI | Ultra Large Scale Integration | 10,000,000
