Giovetti and Giovetti, Certified Public Accountants


History of Computers


Three hundred centuries before our calendar begins, Palaeolithic peoples in central Europe recorded numbers by notching tallies on animal bones, ivory, and stone. Advanced dating methods show that by 28,000 BC, cave dwellers were expressing themselves through drawing.

Around 2600 BC, man invented the abacus, a device composed of rows of rods strung with beads, and it is thought of as one of the first mechanical devices used to compute. An earlier computing device may have been string or rope: early man tied knots to keep track of the number of sheep in nomadic herds, and knotted cords were used in China and Mesopotamia. The Chinese introduced the abacus, the first portable digital computer. The exact date of its introduction is uncertain, with estimates ranging from 2600 BC to 190 BC. The abacus was used in China for calculating the census as recently as AD 1982.

It is thought that the Babylonians were the first to keep records of business transactions, on clay tablets, over 4,000 years ago. The Egyptians processed data on papyrus, a primitive form of paper, with the calamus, a reed pen. Scribes were considered very important, since they were responsible for the data processing of the empire. The Chinese are credited with the invention of paperwork, carrying bureaucracy to an art: Confucius's sayings are actually rules of conduct for bureaucrats in the very complex system of government used by the Chinese. The abacus, a frame of beads, was used by the Chinese as an aid to addition and subtraction, and it has appeared in many forms: knotted string, a pebble tray, and the frame of beads. The Babylonians, too, were known to use the abacus, as early as 2200 BC. Controlling information gave scribes and bureaucrats power.

Around 1500 AD, Leonardo da Vinci designed a mechanical computing device, thought by some to be the first ever conceived. Schematic plans of the device survive among da Vinci's extensive drawings.

Napier's Bones, a series of rods made of carved bone with numbers inscribed on them, invented by John Napier around 1617, could be arranged and rearranged to help perform multiplication, much like the multiplication tables used to teach mathematics to children. Napier's Bones were an active form of the table, more like a slide rule. Napier also produced the first table of logarithms.


One of the first machines used in data processing was developed in 1645, when Blaise Pascal, at the age of 19, built the first digital counter, named the Pascaline, in France. Pascal's gear-driven counting machine consisted of a row of wheels with teeth numbered from zero to nine. Each wheel represented units of ones, tens, hundreds, and so on, similar to a mileage odometer. Pascal invented the counting machine to help his father make the calculations associated with collecting revenue for the government. Like most early devices, Pascal's machine could perform multiplication and division only through repeated addition and subtraction.
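The odometer-style carry described above, where a full turn of one wheel nudges the next wheel forward, can be sketched in a few lines of Python (the function name and list representation are illustrative, not Pascal's own notation):

```python
def pascaline_add(wheels, amount):
    """Add `amount` to a row of decimal wheels (least significant first),
    propagating carries the way the Pascaline's gears did."""
    carry = amount
    for i in range(len(wheels)):
        total = wheels[i] + carry
        wheels[i] = total % 10   # each wheel shows a single digit, 0-9
        carry = total // 10      # a full turn advances the next wheel
    return wheels

# Three wheels reading 99 (units, tens, hundreds); adding 1 carries twice.
print(pascaline_add([9, 9, 0], 1))  # [0, 0, 1], i.e. the machine reads 100
```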

Then, in 1670, Gottfried Leibniz improved on Pascal's arithmetic machine by adding multiplication, division, and square-root capabilities. In 1679, he introduced binary arithmetic, showing that every number can be represented using only the symbols "0" and "1". Nearly three centuries later, this concept became fundamental to the operation of silicon chip-based computers.
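Leibniz's claim, that any number can be written using only "0" and "1", can be demonstrated with a short Python sketch using repeated division by two, the standard conversion method:

```python
def to_binary(n):
    """Represent a non-negative integer using only the symbols '0' and '1'."""
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # remainder on division by 2 is the next bit
        n //= 2
    return bits

print(to_binary(13))  # "1101": 8 + 4 + 0 + 1
```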


During the early 1800's, Joseph Marie Jacquard, another Frenchman, used punched cards to store the instructions that controlled automatic looms. Each card carried a "program" of punched and un-punched holes (a binary system) that selected different colored and textured threads to produce fabric with different patterns and designs. Punched-card control of the production of patterned cloth, tapestry, and carpets is still used today as a less expensive alternative to human labor. Jacquard's card was the forerunner of the standard 80-column computer punch card.


In 1834, Charles Babbage, a British mathematician, attempted to develop a power-driven calculator. He received a government grant to develop a steam-powered machine containing thousands of gears, wheels, and barrels to make calculations for astronomical tables. The Difference Engine was to use the differences between previous values in a table to produce new values, hence the name. Although his "Difference Engine" and "Analytical Engine" never worked, Babbage is known as the "Father of Computers" because his concepts are still in use today: he specified that a computer system should include a calculator (central processing unit), a memory area (random access memory), input devices, output devices, and stored instructions (programs).


The typewriter was patented in 1868 by Sholes, Glidden, and Soule, and the manufacturing contract was given to E. Remington and Sons in New York in 1873. The typewriter was the first mechanical word processor, and some of its standards survive on today's computer keyboards: the "QWERTY" layout, for example, is still used on computer keyboards and typewriters alike.


In 1891, William S. Burroughs, a bank clerk, invented a key-set adding machine with a crank, which was later electrified. The machine used a grid of number keys for input, and, significantly, it could record and summarize its results on a paper tape. The accounting machine was later developed to do bookkeeping, using the same principles developed for the adding machine.


The United States Census is required by the Constitution to be taken every ten years. The census of 1880 took six years to collect and tabulate manually, and it was estimated that the 1890 census would require at least ten years to complete using manual methods.

Herman Hollerith, an employee of the U.S. Census Bureau, developed a manual card punch, an electric card reader, and an electromechanical sorting device to tabulate the Census of 1890. These machines were so efficient at processing data that the 1890 census took only three years to tabulate and complete!

Hollerith used the punched card to store information so that it could be used over and over again. Recall that the early automatic looms were the first to use punched cards to store information that could be reused, reliably reproducing patterns in cloth again and again. Hollerith used a binary code in which the two conditions, punched and unpunched, arranged in series, represented characters, numbers, and symbols.
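The idea of a two-condition code can be illustrated with a small Python sketch. This is an illustrative binary encoding only, not Hollerith's actual card code, which assigned meanings to punch positions rather than encoding character bits:

```python
# Illustrative only: a two-state (punched/unpunched) encoding of text,
# not Hollerith's real card code.
def encode(text):
    """Each character becomes a row of 8 holes: '*' = punched, '.' = unpunched."""
    return ["".join("*" if (ord(ch) >> bit) & 1 else "."
                    for bit in range(7, -1, -1))
            for ch in text]

def decode(rows):
    """Read the holes back into characters."""
    return "".join(chr(int(row.replace("*", "1").replace(".", "0"), 2))
                   for row in rows)

card = encode("1890")
print(card[0])           # "..**...*", the row of holes for the character '1'
assert decode(card) == "1890"  # the stored data can be read back repeatedly
```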

Hollerith later started his own company, the Tabulating Machine Company, to manufacture these data processing machines. It merged with others to form the Computing-Tabulating-Recording Company, which evolved into the International Business Machines Corporation (IBM). Hollerith's punched card became the famous IBM card.

In 1936, August Dvorak patented the Dvorak keyboard, designed for ease of use: the most frequently used keys were placed in the center of the keyboard and the least used keys in its corners. Unfortunately, the design never really caught on.


The Mark I, developed by Howard Aiken of Harvard University in conjunction with engineers from IBM between 1939 and 1944, was the first automatic digital computer, though it was really nothing more than electromechanical machines combined together. Its electrical relays and counter wheels were slow, and it was restricted by mechanical operating difficulties.


ENIAC was the first machine to use electronic tubes for calculating. The Electronic Numerical Integrator And Computer was developed between 1942 and 1946 at the University of Pennsylvania by Dr. John Mauchly, J. Presper Eckert, and their associates. The computer weighed almost 30 tons, contained more than 18,000 vacuum tubes and miles of wire, and required more than 1,500 square feet of floor space. ENIAC was designed mainly for solving problems in ballistics. The switching and control functions, once performed by relays, were handled by vacuum tubes, which performed computations a thousand times faster than electromechanical machines. "Bug," the term describing a problem in computer hardware or software, dates from this era: a moth was discovered shorting a relay (in the Harvard Mark II, by Grace Hopper's team in 1947), and the operators recorded in the computer log that the machine had been "debugged."


UNIVAC I, the UNIVersal Automatic Computer, also developed by Mauchly and Eckert, was first delivered to the Bureau of the Census in 1951 and was used for the next 12 years. UNIVAC was the first machine to use magnetic tape to save and retrieve data. Until this time, computers had been used mainly to process data for scientific, government, or military applications. The first UNIVAC delivered to a business came in 1954, from the Remington Rand Corporation, successor to the firm that had manufactured the first typewriters.


ENIAC and UNIVAC are referred to as first generation computers because they used vacuum tubes to process data. These computers were produced between 1942 and 1959. Vacuum tubes had tremendous heat problems and required a large amount of electricity to run properly. Although calculations were faster than ever, the tubes were unreliable and required frequent replacement. Most first generation programming, for operating systems and for the instructions that performed the calculations, was done in machine language: instructions were written in the actual binary digit code of zeros and ones.

The second generation of computers was produced between 1958 and 1964, and used transistors, developed at Bell Labs, to do the calculations. Transistors made computers smaller, more reliable, faster, and cheaper than the first generation computers. As computers became less expensive, more businesses began to use them to process financial information. The IBM 7090 computer is an example of a second generation computer.

Third generation computers, produced between 1964 and 1970, used integrated circuits instead of transistors to process data. Integrated circuits made the third generation computers faster, smaller, more reliable, and cheaper than the second generation computers. More people began to develop applications for computers to make work more efficient. The IBM 360 is an example of a third generation computer. First, second, and third generation computers were built on a frame and were very large, hence the term "mainframe" computer.

Fourth generation computers, produced since the early 1970's, use the microprocessor to process and store information. Computers became smaller and less expensive, and the minicomputer emerged as a unit that many businesses could afford for their accounting and information processing needs.


In 1972, Intel Corporation introduced the 8008 microprocessor, and Nolan Bushnell of Atari introduced Pong, the first major coin-operated electronic video game.

In 1975, Popular Electronics announced the Altair, the first "personal computer." The Altair used tape recorders as its external storage device, and programs could take as long as an hour to load from these external tape drives.

In 1975, a company called MITS began to offer kits to build a "microcomputer" based on the Intel 8080 microprocessor, and eventually other companies offered microcomputer kits as well. Building and operating one of these microcomputers required a very thorough knowledge of electronics, and the early machines had only a limited capacity to store information for processing. Despite these problems, the kits were a significant step forward: for the first time, the average individual could afford a computer for personal use.

In 1977, Steven P. Jobs, who worked for Atari Corporation, and Steve Wozniak, who worked for Hewlett-Packard, left their jobs, sold a Volkswagen bus for startup capital, and organized a company to assemble 50 microcomputers in Wozniak's parents' garage for a local computer store. They called their company Apple Computer, and the machine they produced was the Apple I. The greatest innovation of the Apple computer was the combination of the microprocessor with floppy disk or streamer tape storage, which expanded the capacity of the home microcomputer. The Apple I was based on the less expensive 8-bit 6502 microprocessor chip, and its components were attached to a wooden board, said to be the origin of the term "motherboard." In May of 1977, Jobs and Wozniak began to market the Apple II. Radio Shack and Commodore began to sell microcomputers shortly after the introduction of the Apple II, and in the early 1980's, IBM and others began to manufacture more powerful microcomputers based on the new Intel 8088 chip.

In the mid-1970's, Bill Gates, a student at Harvard University, wrote a version of BASIC, the Beginner's All-purpose Symbolic Instruction Code invented by John G. Kemeny and Thomas E. Kurtz, for the Altair microcomputer. He soon formed Microsoft Corporation and went on to develop MS-DOS (MicroSoft Disk Operating System) for the newly marketed and vastly successful IBM personal computer. Microsoft later developed software for the Apple Macintosh computer (based on the Motorola 68000 chip). It's no wonder that Bill Gates is now one of the richest men in the world.

MS-DOS is an example of systems software, the programs designed to facilitate the use of the computer. An operating system controls the computer's input and output operations, communicates with the operator, and schedules the machine's resources to allow continuous operation without manual intervention.

Software is what makes computers work: without software, computers cannot do anything, and by the same token, without computers, software is useless. One of the reasons for the early success of the Apple computer in the home, small business, and education marketplace was the development of the interactive electronic spreadsheet. In 1979, Dan Bricklin and Bob Frankston of Software Arts released VisiCalc, which made the computer more useful by providing a powerful mathematical language versatile enough to perform complex calculations, yet almost as easy to use as a pencil. The concepts pioneered by Bricklin and Frankston formed the basis for Lotus 1-2-3, developed by Lotus Development Corporation and one of the most successful and widely used application packages of all time. Other application programs include word processors (WordPerfect, Word), databases (dBase, Access), desktop publishing (Publish-IT!), graphics (Harvard Graphics), and others.


All contents of The Computer Show are Copyright © 1998, 1999 Joppa Computers.