Sunday, July 24, 2011

History of Computers


Humans have always needed to perform arithmetic such as counting and adding. During the prehistoric period, people counted on their fingers or by scratching marks on bones, and later with the help of stones, pebbles, and beads. Early civilizations saw people develop number systems to keep track of astronomical cycles, business records, and more. The word 'computing' means 'an act of calculating'. As the years passed, great inventors created electronic devices and steadily improved the computer; many of us now have computers at home for business and for assignments, along with laptops and netbooks. I prefer the netbook most because it is light and portable.

Prehistoric man did not have the Internet, but it appears that he needed a way to count and make calculations. The limitations of the human body's ten fingers and ten toes apparently led early man to construct tools to help with those calculations. Scientists now believe that humankind invented an early form of computing aid: their clue was a bone carved with prime numbers, dating to about 8,500 BC.
The abacus was the next leap forward in computing, between 1000 BC and 500 BC. This apparatus used a series of movable beads or stones whose positions were changed to enter a number and changed again to perform mathematical operations. Leonardo da Vinci is credited with designing the world's first mechanical calculator around 1500. In 1642, Blaise Pascal's adding machine upstaged da Vinci's marvel and moved computing forward again.
In 19th-century England, the mathematician Charles Babbage proposed the construction of a machine he called the Difference Engine. It would not only calculate numbers, it would also be capable of printing mathematical tables. The Computer History Museum in Mountain View, CA (near San Jose) built a working replica from the original drawings, and visitors can see the device in operation there. Because Babbage was unable to construct the actual device, he earned quite a few detractors among England's literate citizens. He nevertheless made a place for himself in history as the father of computing. Not satisfied with the machine's limitations, he drafted plans for the Analytical Engine. He intended this computing device to use punch cards as the control mechanism for calculations, a feature that would make it possible for his computer to reuse previously performed calculations in new ones.
Babbage's idea caught the attention of Ada Byron Lovelace, who had an undying passion for math. She also saw the possibility that the Analytical Engine could produce graphics and music. She helped Babbage move his project from idea to reality by documenting how the device would calculate Bernoulli numbers, and she later received recognition for writing the world's first computer program. The United States Department of Defense named a computer language, Ada, in her honor in 1979.
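
To give a modern sense of the calculation Lovelace documented, the short Python sketch below computes the first few Bernoulli numbers using the standard recurrence based on binomial coefficients. It is only an illustration of the mathematics, not her actual program, and the Analytical Engine would have carried out the work very differently.

    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        # Bernoulli numbers B_0..B_n as exact fractions, via the
        # recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1.
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(-s / (m + 1))
        return B

    # The first few values: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, ...
    for i, b in enumerate(bernoulli(8)):
        print(f"B_{i} = {b}")
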
The computers that followed built on each previous success and improved upon it. In 1943, Colossus, the first programmable electronic computer, appeared; it was pressed into service to decipher coded German messages during World War II. ENIAC, nicknamed "the Giant Brain," followed in 1946 as the first general-purpose electronic computer. In 1951, the U.S. Census Bureau became the first government agency to buy a computer, the UNIVAC I.
The Apple II expanded the use of computers to consumers in 1977. The IBM PC for consumers followed in 1981, although IBM mainframes had long been in use by government and corporations.
  • 8,500 BC Bone carved with prime numbers found
  • 1000 BC to 500 BC Abacus invented
  • 1642 Blaise Pascal invented the adding machine, France
  • 1822 Charles Babbage drafted Babbage Difference Engine, England
  • 1835 Babbage Analytical Engine proposed, England
  • 1843 Ada Byron Lovelace wrote a program to calculate Bernoulli numbers, England
  • 1943 Colossus, the first programmable electronic computer, England
  • 1946 ENIAC, the first general-purpose electronic computer, U.S.A.
  • 1951 UNIVAC first computer used by U.S. government, U.S.A.
  • 1968 Gordon Moore and Robert Noyce founded Intel, U.S.A.
  • 1969 ARPANET: the Department of Defense lays the groundwork for the Internet, U.S.A.
  • 1977 Apple computers for consumers sold, U.S.A.
  • 1981 IBM personal computers sold, U.S.A.
  • 1991 World Wide Web brings consumer Internet access, Tim Berners-Lee, CERN, Switzerland/France
  • 2000 Y2K bug: date-handling programming errors addressed
  • Current technologies include word processing, games, email, maps, and streaming
The development of network technology and increases in the processing power of microcomputers made consumer Internet use possible by 1991. The evolution of the computer has continued ever since, and new uses emerge every year.