The Computer And Its History Essay Research — page 3


punched-card or punched-tape input and output devices and RAMs of 1,000-word capacity. Physically, they were much more compact than ENIAC: some were about the size of a grand piano and required 2,500 small electron tubes, a great improvement over the earlier machines. The first-generation (186) stored-program computers required considerable maintenance, usually attained 70% to 80% reliable operation, and were used for 8 to 12 years. Typically, they were programmed directly in machine language, although by the mid-1950s progress had been made in several aspects of advanced programming. This group of machines included EDVAC and UNIVAC, the first commercially available computers. (Hazewindus, 155)

The UNIVAC was developed by John W. Mauchly and John Presper Eckert Jr., who had built the ENIAC. Together they formed the Eckert-Mauchly Computer Corporation, America's first computer company, in the 1940s. During the development of the UNIVAC they began to run short of funds and sold their company to the larger Remington Rand Corporation. Eventually they built a working UNIVAC computer. It was delivered to the U.S. Census Bureau in 1951, where it was used to help tabulate the U.S. population. (Hazewindus, 124)

Change Was Good

Early in the 1950s, two important engineering discoveries changed the electronic computer field. The first computers were made with vacuum tubes, but by the late 1950s computers were being made with transistors, which were smaller, less expensive, more reliable, and more efficient. (Shallis, 40) These new technical discoveries rapidly found their way into new models of digital computers. Memory storage capacities in commercially available machines increased 800% by the early 1960s, and speeds increased by an equally large margin. These machines were very expensive to purchase or rent, and were especially expensive to operate because of the cost of hiring programmers to perform the complex operations the computers ran. Such computers were typically found in large computer centers, operated by industry, government, and private laboratories, and staffed with many programmers and support personnel. (Rogers, 77) By 1956, 76 of IBM's large computer mainframes were in use, compared with only 46 UNIVACs. (Chposky, 125)

During this time, the major computer manufacturers began to offer a range of computer capabilities, as well as various computer-related equipment. These included input devices such as consoles and card feeders; output devices such as page printers, cathode-ray-tube displays, and graphing devices; and optional magnetic-tape and magnetic-disk file storage. These found wide use in business for such applications as accounting, payroll, inventory control, ordering supplies, and billing. Central processing units (CPUs) for such purposes did not need to be very fast arithmetically and were primarily used to access large amounts of records on file. The greatest number of computer systems were delivered for large applications, such as in hospitals for keeping track of patient records, medications, and treatments given. They were also used in automated library systems and in database systems such as the Chemical Abstracts System, where computer records on file now cover nearly all chemical compounds. (Rogers, 98)

The Computer Gets Cheap: Even the Average "Joe" Can Buy One

The trend during the 1970s was, to some extent, away from extremely powerful, centralized computational centers and toward a broader range of applications for less-costly computer systems. Most continuous-process manufacturing operations, such as petroleum refining and electrical-power distribution systems, began using computers of relatively modest capability to control and regulate their activities. In the 1960s, the programming of applications problems was an obstacle to the self-sufficiency of moderate-sized on-site computer installations, but great advances in applications programming languages removed these obstacles. Applications languages became available for controlling a great range of manufacturing processes, for the computer operation of machine tools, and for many other tasks. (Osborne, 146) In 1971 Marcian E. Hoff, Jr., an engineer at Intel Corporation, invented the microprocessor, and another stage in the development of the computer began. (Shallis, 121) A new revolution in computer hardware was now well under way, involving the miniaturization of computer logic circuitry and of component manufacture by what are called large-scale integration techniques. In the 1950s, it was realized that "scaling down" the size of electronic digital computer circuits and parts would increase speed and efficiency and improve performance. However, at that time