
Computer Evolution

Based on the advances leading to each stage of technological progress, computer systems have commonly been grouped into four generations, with a fifth anticipated:
• First generation: vacuum tube circuitry
• Second generation: transistors
• Third generation: small and medium integrated circuits
• Fourth generation: large-scale integration (LSI) and very-large-scale integration (VLSI)
• Fifth generation: near future

First-generation computers, such as the Mark I and the ENIAC of the 1940s, were huge machines in both size and mass. The ENIAC computer at the University of Pennsylvania in Philadelphia was constructed during World War II to calculate projectile trajectories. The circuitry of first-generation computers was composed of vacuum tubes and used very large amounts of electricity (it was said that whenever the ENIAC was turned on, the lights all over Philadelphia dimmed). ENIAC weighed 30 tons, occupied 15,000 ft² of floor space, and contained more than 18,000 vacuum tubes. It performed 5000 additions per second and consumed 40 kW of power. Also, because of the vacuum tube circuitry, continuous maintenance was required to replace the tubes as they burned out. Input and output functions were performed using punched cards and separate printers. Programming these computers was tedious and slow, usually performed directly in the binary language of the machine.

The second generation of computers was developed in the 1950s. These computers used
transistors in place of the vacuum tubes of their predecessors, decreasing maintenance
requirements as well as electricity consumption. Information was stored using magnetic
drums and tapes, and printers were connected online to the computer for faster
hard-copy output. Separate from these hardware advances was the development of programming languages that could be written using more readily understandable commands and then converted into the binary code required by the computer.
Just as important was the development of new computer languages that allowed programmers to write with words, phrases, and mathematical formulas rather than in binary machine code. These languages relied on a compiler, a separate program that translated the more human-readable programming language into the machine code the computer could understand. With these improvements, the computer became a profitable tool for businesses, and the computer industry as we know it was born with the second-generation machines. One problem that plagued second-generation computers, however, was dissipating the heat generated by their transistors.
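To illustrate the compiler idea in modern terms, the short C program below is a hypothetical example (not from the original article): the programmer writes a formula with words and symbols, and a compiler such as the standard Unix cc translates those statements into the binary machine instructions the processor executes.

    #include <stdio.h>

    int main(void)
    {
        /* A formula written the way a person reads it: s = 1/2 * g * t^2 */
        double distance = 0.5 * 9.8 * 3.0 * 3.0;

        printf("Distance fallen after 3 s: %.1f m\n", distance);
        return 0;
    }

    /* Compiling with "cc program.c" turns these human-readable statements
       into binary machine code; the programmer never writes that binary
       code by hand, which is exactly the burden the first high-level
       languages removed. */
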
The third generation of computers came about in 1964 with another major advancement
in hardware and software. The first integrated circuit, which combined three separate electronic
components into a single chip, was created at Texas Instruments and marked a major
advance in hardware technology. Since then, both the number of components on a single
chip and the chip’s power have continued to rise, while the chip size has decreased.
The software advance of the third generation allowed users to run multiple programs at a time through the use of an operating system (OS). The OS acted as a master control program, coordinating the operation and use of system resources. This was a significant leap forward from the first- and second-generation computers, which had to load and run each program individually.
High-level programming languages such as COBOL, FORTRAN, and BASIC were developed at this time and gained popularity. These languages were written in a form the programmer could more readily understand and were translated automatically into the set of instructions for the computer to follow. The most significant development of this period was a downward
cost spiral that precipitated the popularity of minicomputers—smaller computers designed
for use by one user or a small number of users at a time, as opposed to the larger mainframes
of previous generations.
The fourth generation of digital computers began around 1971. The steady decrease in
processing time and cost for computer technology has continued with a corresponding increase
in memory and computational capabilities. With LSI, more than 1000 components
can be placed on a single integrated-circuit chip. VLSI chips contain more than 10,000
components, and current VLSI chips have 100,000 or more components on each chip. The
semiconductor technology developed in the 1970s condensed whole computers into the size
of a single chip known as a microprocessor. Semiconductors were responsible for the arrival
of ‘‘personal computers’’ in the late 1970s and early 1980s.
The mark of this generation has been the ability to place more and more electronic components onto an ever-smaller chip, until today's circuits fit millions of components on a surface no larger than that of a shirt button. In the mid-1970s, computers had progressed enough to be practical for the common household and capable of performing word processing, database manipulation, and financial work. By the 1980s, computer users no longer needed prior programming knowledge. Today, computers
have become as common as the household TV or oven.
The fifth-generation computers have long been seen as "just around the corner." There
are several technological factors indicating that the next generation is on our doorstep. The
first is the concept of artificial intelligence in computers that would allow the computer to
better mimic expert human thought. Next, there is the possibility of parallel processing, the use of two independent central processors working together to significantly increase the power of the computer. Also, superconducting materials present a great opportunity for computers: their extremely low electrical resistance could increase the speed of information flow throughout the computer.
The boundary between fourth- and fifth-generation computers is certain to be blurred, and it remains uncertain when the fifth generation will begin.

Emory W. Zimmers, Jr. and Technical Staff
Enterprise Systems Center
Lehigh University
Bethlehem, Pennsylvania