
Wednesday, October 9, 2013

A BRIEF COMPUTER HISTORY




The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage.
He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based.
Generally speaking, computers can be classified into several generations. Each generation lasted for a certain period of
time, and each gave us either a new and improved computer or an improvement to the existing one.
First generation: 1937 – 1946 – In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford Berry. It was called the Atanasoff-Berry Computer (ABC). In 1943 an electronic computer named Colossus was built for the military. Other developments continued until 1946, when the first general-purpose digital computer, the Electronic Numerical Integrator and Computer (ENIAC), was built. It is said that this computer weighed 30 tons and had 18,000 vacuum tubes, which were used for processing. When this computer was turned on for the first time, lights reportedly dimmed in sections of Philadelphia. Computers of this generation could only perform a single task, and they had no operating system.
Second generation: 1947 – 1962 – This generation of computers used transistors, which were more reliable, instead of vacuum tubes. In 1951 the first computer for commercial use was introduced to the public: the Universal Automatic Computer (UNIVAC 1). In 1953 the International Business Machines (IBM) 650 and 700 series computers made their mark in the computer world. During this generation over 100 computer programming languages were developed, and computers had memory and operating systems. Storage media such as tape and disk were in use, as were printers for output.
Third generation: 1963 – present – The invention of the integrated circuit brought us the third generation of computers. With this invention computers became smaller, more powerful, and more reliable, and they were able to run many different programs at the same time. In 1980 the Microsoft Disk Operating System (MS-DOS) was born, and in 1981 IBM introduced the personal computer (PC) for home and office use. Three years later Apple gave us the Macintosh computer with its icon-driven interface, and the 1990s gave us the Windows operating system.
As a result of the various improvements in the development of the computer, we have seen the computer being used in all areas of life. It is a very useful tool that will continue to see new developments as time passes.

Fourth Generation of Computers (1972-1984) 
In this generation came the development of large-scale integration, or LSI (1,000 devices per chip), and very large-scale integration, or VLSI (10,000 devices per chip). These developments enabled the entire processor to fit onto a single chip; in fact, for simple systems, the entire computer, with processor, main memory, and I/O controllers, could fit on a single chip.
Core memories were now replaced by semiconductor memories, and high-speed vector processors dominated the scene; a few such machines were the Cray-1, Cray X-MP, and Cyber 205. A variety of parallel architectures were developed too, but they were mostly in the experimental stage.
As far as programming languages are concerned, this era saw the development of high-level languages like FP (functional programming) and PROLOG (programming in logic). These languages were based on a declarative programming style, in which a programmer can leave many details to the compiler or runtime system; languages like Pascal and C, by contrast, used an imperative style. Two other conspicuous developments of this era were the C programming language and the UNIX operating system. Ritchie, the creator of C, and Thompson together used C to write a version of UNIX for the DEC PDP-11. This C-based UNIX was then widely used on many computers.
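To make the distinction concrete, here is a minimal sketch in modern Python, used purely as a stand-in for the era's languages (FP and PROLOG on the declarative side, Pascal and C on the imperative side). The imperative version spells out every step of the computation; the declarative version only states what result is wanted and leaves the looping details to the language runtime.

# Imperative style: the programmer specifies, step by step, how to build the result.
def squares_of_evens_imperative(numbers):
    result = []
    for n in numbers:             # explicit loop
        if n % 2 == 0:            # explicit test
            result.append(n * n)  # explicit accumulation
    return result

# Declarative/functional style: the programmer states what is wanted;
# the iteration details are left to the runtime.
def squares_of_evens_declarative(numbers):
    return [n * n for n in numbers if n % 2 == 0]

data = [1, 2, 3, 4, 5, 6]
print(squares_of_evens_imperative(data))   # [4, 16, 36]
print(squares_of_evens_declarative(data))  # [4, 16, 36]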
Another event worth mentioning was the publication of the report by Peter D. Lax in 1982, which was sponsored by the US Department of Defense and the National Science Foundation. The Lax report, as it was called, emphasized the need for initiatives and coordinated national attention in the arena of high-performance computing in the US. The immediate response to the Lax report was the establishment of the NSF Supercomputing Centers. Other centers that came up later were the San Diego Supercomputing Center, National Center for Supercomputing Applications, Pittsburgh Supercomputing Center, John von Neumann Center, and Cornell Theory Center. These institutes were instrumental in providing computing time on supercomputers to students, training them, and also helping in the development of software packages.
Fifth Generation of Computers (1984-1990)
In this period, computer technology achieved further advances, and parallel processing, which had until then been limited to vector processing and pipelining, developed to the point where hundreds of processors could all work on various parts of a single program. Systems like the Sequent Balance 8000, which connected up to twenty processors to one shared memory module, were introduced.
This machine was as competent as the DEC VAX-780 in that it ran a general-purpose UNIX system and each processor worked on a different user's job. The Intel iPSC-1, or Hypercube, as it was called, instead connected each processor to its own memory and used a network interface to connect the processors. With this distributed-memory concept, memory posed no further bottleneck, and the largest iPSC-1 was built with 128 processors. Towards the end of the fifth generation, another kind of parallel processing was introduced, called data-parallel or SIMD, in which all the processors operate under the direction of a single control unit.
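As a rough illustration of the data-parallel idea (a conceptual sketch only, using modern Python with the NumPy library rather than the hardware of that era), the snippet below expresses one operation once and applies it to every element of an array, much as a single control unit issues the same instruction to many processing elements working on different data.

import numpy as np

# Data-parallel style: one operation is written once and applied element-wise
# to the whole array, analogous to a single control unit driving many
# processing elements that each hold a different piece of the data.
a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([10.0, 20.0, 30.0, 40.0])

c = a * 2.0 + b   # the same instruction sequence transforms every element
print(c)          # [12. 24. 36. 48.]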
In this generation semiconductor memories became the standard and were pursued vigorously. Other developments were the increasing use of single-user workstations and the widespread use of computer networks. Both wide area networks (WANs) and local area networks (LANs) developed at an incredible pace and led to a distributed computing environment. RISC technology, a particular technique for the internal organization of the CPU, together with the plunging cost of RAM, ushered in huge gains in the computational power of comparatively cheap servers and workstations. This generation also witnessed a sharp increase in both the quantitative and qualitative aspects of scientific visualization.
