My speciality (development)
The first mechanical calculating machine was developed by Blaise Pascal in the mid-1600s. It was based on the decimal system and operated by a series of rotating gears.
Charles Babbage was the first to conceptualize the modern computer. The first machine he designed was the difference engine. Later, in collaboration with Augusta Ada Byron (Ada Lovelace), he designed a machine called the analytical engine, a general-purpose design that anticipated the essential elements of the modern computer. The machine was to process information using punched cards, a concept taken from Joseph Marie Jacquard, who had programmed a loom with punched cards in the early 1800s.
In the 1880s, Dr. Herman Hollerith developed a mechanical method of tabulating census results. His machine read and sorted data coded on punched cards in Hollerith code. Electrical circuits were completed when brushes passed over holes punched in the cards. Hollerith formed the Tabulating Machine Company, which later became IBM.
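The sensing mechanism described above can be sketched in code. This is a deliberately simplified model, not real Hollerith code: it assumes one hole per column and ignores the zone rows that the actual encoding used for letters.

```python
# A minimal sketch of the idea behind Hollerith's card reader: a brush
# completes a circuit wherever a hole is punched, and the row position of
# the hole encodes a decimal digit. Simplified layout: digit rows 0-9 only.
def read_column(punched_rows):
    """Return the digit encoded by a single card column."""
    [row] = punched_rows  # unpacking enforces the one-hole-per-column assumption
    return row

# A card is a list of columns; each column lists its punched row positions.
card = [[1], [8], [9], [0]]  # encodes the year 1890
print("".join(str(read_column(col)) for col in card))  # -> 1890
```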
Howard H. Aiken built the Mark I, which was completed in 1944. The Mark I was a primitive design by later standards, but the U.S. Navy used it to do ballistics calculations through the end of World War II.
The first electronic digital computer was the Atanasoff-Berry Computer (ABC). It used vacuum tubes for its arithmetic-logic functions and rotating capacitor drums for storage.
ENIAC was the first general-purpose electronic computer. It had 18,000 vacuum tubes. It was used to study weather, cosmic rays, and atomic energy. Scientists of the time felt that seven ENIACs could supply all the computing power the world would ever need.
The EDSAC was the first stored-program computer to run; the EDVAC, whose design introduced the stored-program concept, was finished a short time later. These were the first calculating machines that could execute a program without human intervention.
Early computers were programmed by physically arranging wires and switches. A code, called machine language, was developed to correspond to the on and off electrical states needed to enter instructions into these computers. Machine and assembly languages are known as low-level languages; they give the programmer direct control over computer hardware, memory locations, and I/O operations. Machine language uses 1s and 0s to program the computer. To overcome the difficulty of working with 1s and 0s, assembly language was developed, in which mnemonics specify instructions. An advantage of working with low-level languages is speed of execution.
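The relationship between assembly mnemonics and machine language can be illustrated with a toy assembler. The instruction set here is invented for the example (4-bit opcodes, 4-bit operands); it is not any real machine's encoding.

```python
# Toy illustration: translating assembly mnemonics into the 1s and 0s of
# machine language. Opcodes, operand width, and mnemonics are hypothetical.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011", "HALT": "1111"}

def assemble(lines):
    """Translate mnemonic instructions into 8-bit machine words."""
    words = []
    for line in lines:
        parts = line.split()
        opcode = OPCODES[parts[0]]
        # The operand (a memory address 0-15) fills the low 4 bits; HALT has none.
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append(opcode + format(operand, "04b"))
    return words

program = ["LOAD 5", "ADD 6", "STORE 7", "HALT"]
print(assemble(program))  # -> ['00010101', '00100110', '00110111', '11110000']
```

Writing `LOAD 5` instead of `00010101` is exactly the convenience that assembly language, and later Hopper's translator programs, offered the programmer.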
The UNIVAC I was a first-generation computer. The IBM 650 was developed to compete with the UNIVAC I.
During this time, Grace Murray Hopper of the U.S. Navy developed the first language-translator program, which let programmers use mnemonics in place of the 0s and 1s of machine language. The first high-level programming languages, starting with FORTRAN, were developed shortly thereafter.
Four advances in hardware led to the second-generation computers: the transistor, magnetic core storage, magnetic tapes, and magnetic disks. The transistor replaced the vacuum tube, making the new computers smaller and much more reliable.
In the late 1950s, Jack S. Kilby developed the integrated circuit. These circuits were much more reliable than vacuum tubes and provided more computing power and speed than individual transistors.
The integrated circuit continued to be improved by packing more and more circuits onto a single silicon chip. This process became known as large-scale integration. Ted Hoff developed a "computer on a chip," called a microprocessor, which allowed everything from toasters to spaceships to be computerized. The microprocessor, the principal stepping stone into the fourth generation of computers, spawned the microcomputer. John Roach of Radio Shack, Jack Tramiel of Commodore, and Steven Jobs and Stephen Wozniak of Apple are among the innovators in the field of microcomputers. Recently, very-large-scale integration has led to the development of extraordinarily fast supercomputers.
A proposed fifth generation of computers based on artificial intelligence would allow computers to imitate human characteristics such as creativity, judgment, and intuition. This technology is still very much in the experimental stages, however.
Since the advent of microcomputers, people have used them imaginatively for a host of applications. They have been used for designing weaving patterns, fighting arson, and composing music, among other things. With the technological advances and market expansion occurring now, there is no telling what the future of microcomputing might hold.