The first individual associated with the beginnings of the history of the computer is al-Khwarizmi, who studied sets of step-by-step logical rules, now called algorithms (yes, they are named after him). These rules aided mathematicians from that point on, and later formed the basis for the first programming and for the logic behind the first computer-like machines. In 1642, Blaise Pascal designed the first mechanical adding machine. This machine was improved upon in the 1670s by Gottfried Wilhelm von Leibniz, who extended it to multiply as well. In 1801, Joseph Jacquard designed a loom that used punched cards to weave intricate patterns; this loom served as a basis for later computer-like machines, which used punched cards as input. In the first third of the nineteenth century, Charles Babbage, an English-born inventor and mathematician, developed a machine that could compute mathematical tables by the method of finite differences. His well-received demonstration model of this "difference engine" won him support and a grant from the British government. Unfortunately, Babbage found that the machine could be thrown off by very small imperfections, and soon afterwards the British government withdrew its financial support. In the 1840s George Boole developed a system of logic in which mathematical symbols could be applied to logical statements, later termed Boolean algebra; this type of algebra is still used in electronics and in computer software. Meanwhile, Charles Babbage, not discouraged by the failure of his difference engine, designed the first general-purpose computer, the "analytical engine". This machine had five key functions that designate it as a computer:
an input device; a storage place to hold the numbers waiting to be processed; a processor, or number calculator; a control unit to direct the tasks to be performed and the sequence of calculations; and an output device. Its purpose was to solve various mathematical equations. Babbage, however, never completed his machine because of the frequent technical difficulties he faced; in fact, it was not until 1991 that one of his engines (Difference Engine No. 2) was successfully built to his designs. Around the same time as Babbage's work, Ada Lovelace, daughter of Lord Byron and of a mother skilled in mathematics, wrote what is regarded as the first computer program. Ada was in constant communication with Charles Babbage and learned much from his studies. She aided Babbage with his analytical engine, believed the machine was workable, and published a series of notes that helped Babbage's followers continue what he had started. In the 1890s Herman Hollerith adapted the punched-card idea that Joseph Jacquard had developed for his looms to build a tabulating machine, his entry in a contest to find a faster way to tabulate the 1890 census, which had previously taken seven and a half years. His entry won the grand prize after it shortened the tabulating time to six weeks. The primary difference between Hollerith's tabulating machine and Babbage's analytical engine was that Hollerith's used electrical rather than mechanical power. Hollerith realized the commercial potential of his machine, and in 1896 founded the Tabulating Machine Company. In 1911 his company merged with two others to form the Computing-Tabulating-Recording Company, which was renamed the International Business Machines Corporation (IBM) in 1924. Thomas J. Watson, Sr. ran IBM firmly from 1924 to 1956, transforming it into a dominant force in the business world. IBM's first big product was the calculator, which evolved into the computer.
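Boole's system of logic, described above, still underlies the design of digital circuits. As a minimal sketch (the helper functions below are my own illustration, not anything from the historical record), one of its classic identities, De Morgan's law, can be checked exhaustively:

```python
# Boolean algebra applies algebraic operators to logical values.
# These tiny helpers mirror the AND, OR, and NOT gates of digital circuits.

def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def NOT(a):
    return not a

# De Morgan's law: NOT(a AND b) is equivalent to NOT(a) OR NOT(b).
# Checking all four truth-value combinations proves the identity.
for a in (False, True):
    for b in (False, True):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))

print("De Morgan's law verified for all inputs")
```

Because every input combination is enumerated, the loop amounts to a complete truth-table proof of the identity.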
Howard Aiken, a Harvard professor of mathematics, became interested in Charles Babbage's analytical engine in 1936 after reading the notes that Ada Lovelace had published. After presenting IBM with a carefully planned proposal, Aiken was granted $1,000,000, and work began on the Harvard Mark I. The Mark I was the most impressive machine built to date: it stood eight feet high and fifty-five feet long, was constructed of steel and glass, and was described as sounding "like listening to a roomful of old ladies knitting away with steel needles." The Mark I was unveiled in 1944 and gained huge publicity, which strengthened IBM's commitment to computer development. Elsewhere, at the University of Pennsylvania, Dr. John Mauchly was developing a machine to calculate artillery and missile trajectories for the American military. He and graduate student J. Presper Eckert drew on the work of Dr. John V. Atanasoff, a professor of physics at Iowa State University. Atanasoff, along with his assistant Clifford Berry, had built the first digital electronic computer, named the Atanasoff-Berry Computer and nicknamed the ABC. In 1941 Atanasoff met with Mauchly, who used the ABC as the basis for the work that took computers to the next step. When Mauchly built a commercial version of his machine, Atanasoff and Berry challenged his attempt to obtain a patent for it, and after a controversial federal court decision, Atanasoff and Berry were determined to be the originators of the key ideas behind Mauchly's machine. Mauchly's controversial work nevertheless produced the first general-purpose electronic computer, which became the standard for early computers: the Electronic Numerical Integrator and Computer (ENIAC). The ENIAC was developed by adopting the ideas of the ABC and expanding upon them.
These first electronic computers could multiply two ten-digit numbers at a rate of about three hundred operations per second. Unfortunately, the vacuum tubes broke very often; the mean time between failures was only ten to fifteen minutes, and it took fifteen men just to keep the burnt-out tubes replaced with fresh ones. The ENIAC occupied 1,500 square feet and weighed thirty tons, and buying one would have cost around $500,000 in 1946 dollars. In 1945 John von Neumann introduced the concept of holding a stored program in memory, eliminating the time-consuming task of rewiring the computer every time a new program was used. This concept led to the first generation of computers, which began in 1951. First-generation machines used vacuum tubes, electronic tubes similar in size to light bulbs, as their internal components. The tubes heated quickly and therefore caused many problems with temperature control; they also burnt out often and had to be constantly replaced. Language was another problem with the early computers: programs were written in machine language, that is, in raw numbers, which made programming extremely difficult and time-consuming. Today's languages are much closer to English, and the computer translates the source code written by the programmer into the machine language that tells the computer what to do. Magnetic cores, pinhead-sized rings strung like beads on thin intersecting wires, were used for memory. In 1957 magnetic tape, similar to the tape in an audio or video cassette, was developed for use in computers; it provided a much quicker and more reliable means of storing information. Punched cards, like those Herman Hollerith used in the late nineteenth century, served as secondary storage throughout the first generation of computers.
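Von Neumann's stored-program idea, and the all-numeric machine language described above, can be sketched with a toy interpreter. The three opcodes below are invented for illustration and do not belong to any real machine's instruction set; the point is that program and data share one memory, and every instruction is just a number:

```python
# A toy stored-program machine. The program lives in the same memory as
# the data (von Neumann's idea), and every instruction is a plain number,
# which is what first-generation "machine language" meant in practice.

memory = [
    1, 6,   # opcode 1: load memory[6] into the accumulator
    2, 7,   # opcode 2: add memory[7] to the accumulator
    0, 0,   # opcode 0: halt
    40, 2,  # data: the two numbers to be added
]

acc = 0   # accumulator
pc = 0    # program counter
while True:
    op, arg = memory[pc], memory[pc + 1]
    if op == 0:          # halt
        break
    elif op == 1:        # load
        acc = memory[arg]
    elif op == 2:        # add
        acc += memory[arg]
    pc += 2

print(acc)  # 42
```

Changing the program here means changing numbers in memory rather than rewiring hardware, which is exactly the time saving the stored-program concept delivered.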
The prime example of a first-generation computer is the Universal Automatic Computer, or simply UNIVAC, which used all of this new technology to form the commercial successor of the ENIAC. CBS used one of these computers to predict the outcome of the 1952 presidential election, withholding the predicted result until it was sure the computer was accurate; only then did CBS announce that its UNIVAC had picked Dwight Eisenhower to win. The second generation of electronic computers, smaller than the first, began in 1959 with machines built around the transistor, an electronic switching device invented at Bell Labs by the three scientists J. Bardeen, W. H. Brattain, and W. Shockley, who shared a Nobel Prize for the invention. A transistor transfers electric signals across a resistor; the word itself is a blend of "transfer" and "resistor". Transistors revolutionized computers because they were much smaller than the vacuum tubes they replaced, which decreased the size of computers dramatically; unlike vacuum tubes, they also needed no warm-up time, used less energy, and were faster and more reliable. Second-generation machines used the newly developed assembly language (also called symbolic language), which used mnemonic symbols (for example, L for Load) rather than binary numbers, and which made programming much easier. High-level languages built on these symbolic languages increased the ease of programming further: FORTRAN (Formula Translator) was released in 1957, and COBOL (Common Business Oriented Language) in 1959. Both languages are still used today, in updated versions, and are far closer to English than the assembly languages. In 1962 the first removable disk was developed for public use.
This allowed users to store and access files more quickly, without having to worry about filling up their magnetic tape. The second generation opened the computer market to the public by drastically dropping the cost of computers and decreasing their size; as a result, an increasing number of computers were being used by businesses, schools, and the general public. In 1965 Santa Clara County was a small residential area about half an hour's drive from San Francisco; by the end of the next decade it had been transformed into Silicon Valley, the center of production for the defining tool of the computer era: the integrated circuit, or "silicon chip". The third generation of electronic computers began in 1965 with machines built from integrated circuits, complete circuits placed on a chip of silicon. Silicon acts as a semiconductor, a crystalline substance that conducts electricity when "doped" with impurities. This new technology drastically reduced the size of a computer by replacing whole circuit boards of transistors with chips as small as one-eighth of an inch across. These silicon chips were a huge breakthrough in the computer world because of their small size, reliability, and low cost. IBM created the most significant advance in computing of this era, the System/360, which earned IBM its nickname "Big Blue". IBM began marketing computers as a business tool and opened a market that evolved into a multi-billion-dollar industry. The fourth generation of electronic computers began in 1971 and is the generation we are presently in. The major technical advance that characterizes the fourth generation is the tiny microprocessor; the fourth generation is really an extension of the third, building on integrated-circuit technology.
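The jump from numeric machine language to the second generation's symbolic assembly language, mentioned earlier (L for Load), can be sketched as a toy assembler. The mnemonics and opcode numbers here are invented for illustration, not taken from any historical machine:

```python
# A toy assembler: translates symbolic mnemonics into numeric machine code.
# Mnemonics and opcode numbers are invented for illustration only.

OPCODES = {"L": 1, "A": 2, "HLT": 0}  # Load, Add, Halt

def assemble(lines):
    """Turn lines like 'L 6' into the flat numeric code a machine would run."""
    code = []
    for line in lines:
        mnemonic, *args = line.split()
        code.append(OPCODES[mnemonic])
        code.extend(int(a) for a in args)
    return code

# "Load address 6, add address 7, halt": readable to a human,
# yet mechanically reducible to plain numbers.
print(assemble(["L 6", "A 7", "HLT"]))  # [1, 6, 2, 7, 0]
```

The translation is purely mechanical, which is why assemblers could be automated early on; high-level languages like FORTRAN and COBOL then pushed the same idea much further from the hardware.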
Large Scale Integration (LSI), and later Very Large Scale Integration (VLSI), also arrived in this generation; computers built with them are far more practical than early machines such as the ENIAC, being at least one hundred times smaller. These technologies are now embedded in everyday electronics such as kitchen appliances, cars, entertainment devices, home-security systems, calculators, and exercise equipment. The fifth generation of electronic computers has not yet begun. It will contain computers that can learn, reason, and apply logic; in other words, computers with artificial intelligence.