History of the Computer Industry
lack of demand for such a device (Soma, 46). After Babbage, people began to lose interest in computers. However, between 1850 and 1900 there were great advances in mathematics and physics that began to rekindle that interest (Osborne, 45). Many of these new advances involved complex calculations and formulas that were very time-consuming to work out by hand. The first major use for a computer in the U.S. was during the 1890 census. Two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read information on cards without human intervention (Gulliver, 82). Since the population of the U.S. was increasing so rapidly, the computer was an essential tool in tabulating the totals. These advantages were noted by commercial industries and soon led to the development of improved punch-card business-machine systems by International Business Machines (IBM), Remington-Rand, Burroughs, and other corporations. By modern standards the punched-card machines were slow, typically processing from
50 to 250 cards per minute, with each card holding up to 80 digits. At the time, however, punched cards were an enormous step forward; they provided a means of input, output, and memory storage on a massive scale. For more than 50 years following their first use, punched-card machines did the bulk of the world's business computing and a good portion of the computing work in science (Chposky, 73). By the late 1930s, punched-card machine techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook construction of a large automatic digital computer based on standard IBM electromechanical parts. Aiken's machine, called the Harvard Mark I, handled 23-digit numbers and could perform all four arithmetic operations. It also had special built-in programs to handle logarithms and trigonometric functions. The Mark I was controlled from prepunched paper...