Computer Research Paper

Humans have always looked for “technologies” to help them count—from the stick-markings prehistoric herders made to keep track of cattle to the first programmable, room-filling mainframes employed by post–World War II business. In the twenty-first century computers do far more than calculate; forecasters predict that the computing power of today’s desktop will someday be packaged in a device the size of a shirt button and sold for the cost of a dime.

Computers have transformed work, communication, and leisure activity, and they promise future changes of equal magnitude. For many years, computer technology was dominated by groups in the United States, because that nation had the largest single market and its government invested heavily in military applications and fundamental science and engineering. But many nations contributed to the technological basis on which computing arose, and with the development of the World Wide Web computing became a global phenomenon.

Mechanical Predecessors

Programmable digital computers were developed just before the middle of the twentieth century, but the more general history of devices that help people think goes back to prehistoric times, when someone first carved marks on a stick to count the cattle in a herd or mark the days in the phases of the moon. Complex additions and subtractions were done in ancient days by arranging pebbles in piles on the ground, and our word calculate derives from the Latin word calculus (pebble). The most complex known “computer” of classical civilization is the remarkable geared Antikythera device, which apparently was designed to predict the motions of the sun, moon, and planets. Found in a shipwreck on the bottom of the Mediterranean Sea, it is believed to date from about 80 BCE.




Computing has always been closely allied with mathematics, and the invention of logarithms by the Scottish mathematician John Napier around 1614 was a major advance for practical calculating. With a mechanical calculating device, it is much easier to add than to multiply, and subtraction is much easier than division. Logarithms turned multiplication into addition, and division into subtraction, at the cost of looking up numbers in vast books of tables that themselves had to be calculated by hand. From Napier’s time until the introduction of transistorized electronic calculators around 1970, a book of logarithm tables was a standard tool for engineers and scientists. The tables were cumbersome to use, however, so for quick estimates slide rules were employed. A slide rule is an analog calculating device based on logarithmic scales marked along rulers that slide past each other; the term analog refers to the analogy between abstract numbers and corresponding physical distances along a line.
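
The trick rests on the identities log(ab) = log a + log b and log(a/b) = log a − log b. As a minimal sketch of what a table user did by hand (the numeric values below are purely illustrative), the following Python fragment takes logarithms, adds or subtracts them, and converts back:

    import math

    a, b = 347.0, 29.0

    # Multiplication via logarithms: log(a*b) = log(a) + log(b)
    product = math.exp(math.log(a) + math.log(b))

    # Division via logarithms: log(a/b) = log(a) - log(b)
    quotient = math.exp(math.log(a) - math.log(b))

    print(product, a * b)    # both are ~10063.0, up to rounding error
    print(quotient, a / b)   # both are ~11.9655...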

Digital mechanical calculators that represented numbers as precise digits were also developed—for example, by the French mathematician and philosopher Blaise Pascal in 1642. A common approach was to connect a series of wheels, each of which would turn in ten steps for the digits 0 through 9. A legend has developed that the eccentric English dilettante Charles Babbage was the father of computing because around 1835 he designed a mechanical calculator that could be programmed with punched cards. The science fiction writers William Gibson and Bruce Sterling wrote a novel imagining that Babbage succeeded in building it, launching a golden age of British scientific and technological dominance while magnifying social problems. In reality, however, Babbage never completed the machine, and the historian Doron Swade concludes that his influence on the development of electronic computers was insignificant.

The first comprehensive digital data-processing system using cards was developed by the American engineer Herman Hollerith, who began patenting his ideas in the 1880s. By 1902, when his machines were used to process the vast sea of information collected in the 1900 U.S. census, they already incorporated electric relays that could do conditionals (if-then operations).

The Mainframe Era

There is considerable debate among historians over which programmable, electronic digital computer was first or most influential. By 1941, professor John Atanasoff and graduate student Clifford Berry had created a demonstration machine at Iowa State University, but they did not develop it further. In Britain, a special-purpose electronic computer called Colossus began cracking German codes in 1943, but its design was kept secret for more than three decades. Perhaps the most influential early electronic digital computer was ENIAC (Electronic Numerical Integrator and Computer), completed at the University of Pennsylvania in 1946 by a team headed by physicist John W. Mauchly and engineer J. Presper Eckert.

ENIAC’s primary job was calculating accurate artillery firing tables for the U.S. Army. In the frenzy of World War II, many new models of long-range guns were being produced, and soldiers in the field needed complex tables to tell them how to aim to hit a target at a certain distance under various conditions. It was impossible to fire the guns under all the likely conditions, so data from a few judiciously chosen test firings were used to anchor elaborate sets of mathematical calculations. Vannevar Bush, who later served as chief science advisor to President Roosevelt, had built a huge mechanical analog computer, the differential analyzer, for this kind of work in 1930. In theory, an electronic computer would be much faster and more accurate, but there were serious questions about its reliability, because before the development of transistors such machines had to be built with vacuum tubes, which tended to burn out. ENIAC weighed 30 tons, covered 1,800 square feet, and contained some 18,000 vacuum tubes.

ENIAC’s data input and output employed Hollerith-style punched cards, a method that remained one of the standard approaches through the 1970s. Programming, however, was done manually by setting hundreds of rotary switches and plugging in wires that connected electronic components. Mauchly and Eckert designed a successor that could store a program in its memory, formed a small company to build a line of machines called UNIVAC, and then sold the firm to Remington Rand in 1950. This trajectory typifies mid-twentieth-century computing: the technology for large and expensive mainframe computers was developed with government funding for military purposes and then transferred to the civilian sector, where large corporations used it for financial record-keeping and similar applications. Much of the research work was done at universities, and the availability of a few mainframe computers on campus gave scientists the chance to adapt them to many research purposes.

The Personal Computer

The birth of the computer industry involved nothing less than the development of an entire computer culture, including programming languages and compilers to control the machines, networks and input-output devices to transmit information between users and machines, and new university courses that led to the emergence of computer science and engineering as a distinct field. For years the dominant model was the expensive mainframe running batch jobs—computer runs that were carefully prepared and then placed in a queue to await time on the machine—although there were some experiments with time sharing, in which several individuals could use a computer simultaneously in real time. Then, in the mid-1970s, both inside information technology companies and outside among electronics hobbyists, the personal computer revolution offered a radically new concept of computing.

In April 1973, Xerox Corporation’s Palo Alto Research Center ran its first test of the Alto, the prototype desktop personal computer. The Alto pioneered many of the technologies that would become standard for home and office computers, including the mouse, windows and icons on the screen, desktop printing with many different fonts, incorporation of images and animations, and local area networks that allowed individuals to send files back and forth between their machines. Xerox was not able to exploit the technology at the time because of the high cost and low performance of microelectronics. In 1965, Gordon Moore, later a cofounder of the Intel computer chip corporation, propounded what has become known as Moore’s Law: the observation that the performance of computer chips doubles roughly every eighteen to twenty-four months. The Alto’s technology finally reached the home market when the first Apple Macintosh was sold in 1984, soon followed by Microsoft’s Windows operating system.

Before any of the big information technology companies offered personal computers to the public, hobbyists were building their own from kits, notably the Altair, first announced in the January 1975 issue of Popular Electronics magazine. A technological social movement, drawing on some of the cultural radicalism of the 1960s, quickly spread across America and Western Europe, although in retrospect it is difficult to estimate how much this radicalism contributed to the rapid advance of the computer revolution. It is true that Apple was founded in a garage by two friends, and that Bill Gates dropped out of Harvard to found Microsoft with his friend Paul Allen. For a few years after the Apple II computer appeared in 1977, an individual could write a commercially viable software program and start a small company to market it. But the greatest advances after the mid-1980s again required the combination of massive government funding and large corporations.

Internet and the World Wide Web

The Internet was born in 1969 as ARPANET, a research network funded by the Advanced Research Projects Agency of the U.S. government that connected computers at the University of California at Los Angeles, the Stanford Research Institute, the University of California at Santa Barbara, and the University of Utah. In 1972 it was first demonstrated to the public, and in the same year it began carrying email. More and more educational institutions, government agencies, and corporations began using the Internet—and finding new uses for it—until by the end of the 1980s it was an essential tool for research and had begun to demonstrate its value for business and personal applications. For example, in 1978 Roy Trubshaw and Richard Bartle invented the first online fantasy game, or MUD (Multi-User Dungeon), at Essex University in England, and in 1989 Alan Cox at the University College of Wales released his own version onto the Internet.

In 1990, at the high-energy physics laboratory of the Conseil Européen pour la Recherche Nucléaire (CERN) near Geneva, Switzerland, Tim Berners-Lee developed the first hypertext browser and coined the term World Wide Web. Early in 1993, University of Illinois student Marc Andreessen, working at the National Center for Supercomputing Applications funded by the U.S. National Science Foundation, programmed the first version of Mosaic, the easy-to-use browser that would introduce millions of people to the Web. Both the Netscape and Microsoft Internet Explorer browsers were based on Mosaic, and it is estimated that more than 10 percent of the world’s population used the Internet in 2002.

The mainframe time-sharing concept of the 1970s has evolved into what is called client-server architecture. A server is a dedicated computer, often large, that houses centralized databases (in companies, universities, or government agencies) or connects directly to the Internet. Originally, clients were dumb terminals with little or no computing power of their own, but today they are powerful personal computers connected to the server and able to access its resources. A very different approach has arisen more recently, called peer-to-peer architecture—for example, music-file-sharing programs like Napster that link personal computers over the Web, so that each computer simultaneously functions as both server and client. The grid-computing concept goes further, distributing big computation jobs across many widely separated computers, or distributing data across many archives, and thereby eroding the distinction between individual computers and the Internet.
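
As a minimal sketch of the client-server idea (not a model of any particular historical system; the address, port, and record names below are hypothetical), the Python fragment that follows keeps all data on one server process, while a client holding no data of its own connects and asks for it:

    import socket
    import threading
    import time

    HOST, PORT = "127.0.0.1", 5050   # hypothetical local address for the sketch

    def serve():
        # The server alone holds the centralized data; clients must query it.
        records = {"greeting": "hello from the server"}
        with socket.create_server((HOST, PORT)) as srv:
            conn, _ = srv.accept()
            with conn:
                key = conn.recv(1024).decode()
                conn.sendall(records.get(key, "unknown key").encode())

    threading.Thread(target=serve, daemon=True).start()
    time.sleep(0.2)   # crude pause so the listener is ready before the client connects

    # The client keeps no records; it sends a request and prints the server's reply.
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(b"greeting")
        print(sock.recv(1024).decode())

In a peer-to-peer arrangement, by contrast, every machine would run both halves of this exchange at once.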

The Era of Ubiquitous Computing

Computers today are found nearly everywhere, embedded in automobiles and grocery store checkout counters, or packaged as pocket-sized personal digital assistants that allow a user to send email or surf the Web from almost anywhere. They have begun to take over the roles of traditional devices such as telephones and televisions, while other devices, notably cameras and music players, have become accessories to computers. Old forms of computing do not die; they expand. Children’s toys now have vastly greater computing power than ENIAC, yet ENIAC’s direct descendants are supercomputers capable of doing dozens of trillions of calculations per second.

Computer science continues to advance, and nanotechnology promises to sustain Moore’s Law until perhaps about 2025, halting only after the smallest electronic components have shrunk to the size of a single molecule. Two decades of doubling every eighteen months means improvement by a factor of roughly 8,000. That would put the computing power of today’s desktop computer into a shirt button costing a dime. What will people do with such power?
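
The arithmetic behind that factor can be checked in a few lines of Python (a quick sketch, assuming the eighteen-month doubling period used in the text):

    # Twenty years at one doubling every eighteen months:
    months = 20 * 12                  # 240 months
    doublings = months // 18          # 13 complete doublings
    factor = 2 ** doublings           # 2**13 = 8192, i.e. roughly 8,000
    print(doublings, factor)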

In 2003, the Interagency Working Group on Information Technology Research and Development of the U.S. government identified the following “grand challenges” that computing could address in the coming decade:

  • Knowledge environments for science and engineering
  • Clean energy production through improved combustion
  • High confidence infrastructure control systems
  • Improved patient safety and health quality
  • Informed strategic planning for long-term regional climate change
  • Nanoscale science and technology: explore and exploit the behavior of ensembles of atoms and molecules
  • Predicting pathways and health effects of pollutants
  • Real-time detection, assessment, and response to natural or man-made threats
  • Safer, more secure, more efficient, higher-capacity, multimodal transportation system
  • Anticipate consequences of universal participation in a digital society
  • Collaborative intelligence: integrating humans with intelligent technologies
  • Generating insights from information at your fingertips
  • Managing knowledge-intensive dynamic systems
  • Rapidly acquiring proficiency in natural languages
  • SimUniverse [educational computer simulations]: learning by exploring
  • Virtual lifetime tutor for all

Bibliography:

  1. Austrian, G. D. (1982). Herman Hollerith: Forgotten giant of information processing. New York: Columbia University Press.
  2. Bainbridge, W. S. (Ed.). (2004). Berkshire encyclopedia of human-computer interaction. Great Barrington, MA: Berkshire Publishing Group.
  3. Berners-Lee, T., & Fischetti, M. (1999). Weaving the Web. New York: HarperCollins.
  4. Freiberger, P., & Swaine, M. (1999). Fire in the valley: The making of the personal computer (2nd ed.). New York: McGraw-Hill.
  5. Gibson, W., & Sterling, B. (1991). The difference engine. New York: Bantam.
  6. Gillies, J., & Cailliau, R. (2000). How the Web was born. Oxford, U.K.: Oxford University Press.
  7. Grudin, J. (2004). History of human-computer interaction. In W. S. Bainbridge (Ed.), Berkshire encyclopedia of human-computer interaction. Great Barrington, MA: Berkshire Publishing Group.
  8. Interagency Working Group on Information Technology Research and Development. (2003). Grand challenges: Science, engineering, and societal advances requiring networking and information technology research and development. Arlington, VA: National Coordination Office for Information Technology Research and Development.
  9. Lavendel, G. (1980). A decade of research: Xerox Palo Alto Research Center. New York: Bowker.
  10. Metropolis, N., Howlett, J., & Rota, G.-C. (Eds.). (1980). A history of computing in the twentieth century. New York: Academic Press.
  11. Mollenhoff, C. R. (1988). Atanasoff: Forgotten father of the computer. Ames: Iowa State University Press.
  12. National Research Council. (1999). Funding a revolution: Government support for computing research. Washington, DC: National Academy Press.
  13. Price, D. J. de S. (1959). An ancient Greek computer. Scientific American, 200(6), 60–67.
  14. Stern, N. (1981). From ENIAC to UNIVAC: An appraisal of the Eckert-Mauchly computers. Bedford, MA: Digital Press.
  15. Swade, D. (2000). The difference engine: Charles Babbage and the quest to build the first computer. New York: Viking.
  16. Waldrop, M. M. (2001). The dream machine: J. C. R. Licklider and the revolution that made computing personal. New York: Viking.