Electronics Research Paper Topics


This list of electronics research paper topics presents 30 potential topics for research papers, followed by an overview article on the history of electronics.

1. Applications of Superconductivity

The 1986 Applied Superconductivity Conference proclaimed, ‘‘Applied superconductivity has come of age.’’ The claim reflected only 25 years of development, but was justifiable due to significant worldwide interest and investment. For example, the 1976 annual budget for superconducting systems exceeded $30 million in the U.S., with similar efforts in Europe and Japan. By 1986 the technology had matured impressively into applications for the energy industry, the military, transportation, high-energy physics, electronics, and medicine. The announcement of high-temperature superconductivity just two months later brought about a new round of dramatic developments.



2. Discovery of Superconductivity

As the twenty-first century began, an array of superconducting applications in high-speed electronics, medical imaging, levitated transportation, and electric power systems was either having, or would soon have, an impact on the daily life of millions. Surprisingly, at the beginning of the twentieth century, the discovery of superconductivity was completely unanticipated and unimagined.

In 1911, three years after liquefying helium, Heike Kamerlingh Onnes of the University of Leiden discovered superconductivity while investigating the temperature-dependent resistance of metals below 4.2 kelvin. Later reporting on experiments conducted in 1911, he described the disappearance of the resistance of mercury, stating, ‘‘Within some hundredths of a degree came a sudden fall, not foreseen [by existing theories of resistance]. Mercury has passed into a new state, which . . . may be called the superconductive state.’’




3. Electric Motors

The main types of electric motors that drove twentieth century technology were developed toward the end of the nineteenth century, with direct current (DC) motors being introduced before alternating current (AC) ones. Most important initially was the ‘‘series’’ DC motor, used in electric trolleys and trains from the 1880s onward. The series motor exerts maximum torque on starting and then accelerates to its full running speed, the ideal characteristic for traction work. Where speed control independent of the load is required in such applications as crane and lift drives, the ‘‘shunt’’ DC motor is more suitable.
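
As a rough illustration of why the series motor suits traction, a minimal sketch follows. It assumes a simplified, unsaturated motor model in which torque is proportional to flux times armature current; the constants and currents are invented for illustration only and are not drawn from the text.

```python
# Hedged sketch (simplified, assumed constants): why a series DC motor gives its
# largest torque at standstill. In a series motor the field winding carries the
# armature current, so (unsaturated) torque grows roughly with the square of the
# current; in a shunt motor the field is roughly constant, so torque grows
# only linearly with armature current.

def series_torque(current_a, k=0.05):
    """Unsaturated series motor: flux is proportional to armature current."""
    return k * current_a ** 2

def shunt_torque(current_a, k=0.5):
    """Shunt motor: flux (and hence torque per ampere) is roughly constant."""
    return k * current_a

# Starting currents are the highest the motor ever sees, so series torque soars.
for i in (10, 20, 40):
    print(f"I = {i} A: series {series_torque(i):.1f} N*m, shunt {shunt_torque(i):.1f} N*m")
```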

4. Electronic Calculators

The electronic calculator is usually inexpensive and pocket-sized, using solar cells for its power and having a gray liquid crystal display (LCD) to show the numbers. Depending on the sophistication, the calculator might simply perform the basic mathematical functions (addition, subtraction, multiplication, division) or might include scientific functions (square, log, trig). For a slightly higher cost, the calculator will probably include programmable scientific and business functions. At the end of the twentieth century, the electronic calculator was as commonplace as a screwdriver and helped people deal with all types of mathematics on an everyday basis. Its birth and growth were early steps on the road to today’s world of computing.

5. Electronic Communications

The broad use of digital electronic message communications in most societies by the end of the twentieth century can be attributed to a myriad of reasons. Diffusion was incremental and evolutionary. Digital communication technology was seeded by large-scale funding for military projects that broke technological ground; however, social needs and patterns of use drove the systems in unexpected directions and made them popular because those needs were embraced. Key technological developments happened long before diffusion into society, and it was only after the personal computer became popular that global and widespread use became commonplace. The Internet was an important medium in this regard, although its popular uses were well established long before its success. Collaborative developments with open, mutually agreed standards were key factors in the broader diffusion of the low-level transmission of digital data, and provided resistance to technological lock-in by any commercial player. By the twenty-first century, the concept of interpersonal electronic messaging was accepted as normal and taken for granted by millions around the world, where infrastructural and political freedoms permitted. As a result, traditional lines of information control and mass broadcasting were challenged, although it remains to be seen what, if any, long-term impact this will have on society.

6. Electronic Control Technology

The advancement of electrical engineering in the twentieth century brought a fundamental change in control technology. New electronic devices, including vacuum tubes (valves) and transistors, were used to replace electromechanical elements in conventional controllers and to develop new types of controllers. In the course of this work, engineers discovered basic principles of control theory that could be applied further to the design of electronic control systems.

7. Fax Machine

Fax technology was especially useful for international commercial communication, which was traditionally the realm of the Telex machine, a device that could relay only Western alphanumeric content. A fax machine could transmit a page of information regardless of what it contained, and this led to rapid and widespread adoption in developing Asian countries during the 1980s. With the proliferation of the Internet and e-mail in the last decade of the twentieth century, fax technology became less used for correspondence. At the close of the twentieth century, the fax machine was still widely used internationally for the transmission of documents of all forms, with the ‘‘hard copy’’ aspect giving many a sense of permanence that other electronic communication lacked.

8. Hall Effect Devices

The ‘‘Hall effect,’’ discovered in 1879 by American physicist Edwin H. Hall, is the electrical potential produced when a magnetic field is perpendicular to a conductor or semiconductor that is carrying current. This potential is a product of the buildup of charges in that conductor. The magnetic field exerts a transverse force on the charge carriers, pushing them toward one side of the conductor. A measurable voltage develops between the two sides of the conductor as the accumulated charge balances the magnetic force.
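
To make the effect concrete, the sketch below estimates the Hall voltage from the standard relation V_H = IB/(nqt) for a thin strip. The current, field, carrier density, and thickness are illustrative assumptions, not values from the text.

```python
# Hedged sketch: estimating the Hall voltage V_H = I*B / (n*q*t) across a thin
# conductive strip. All numerical values below are assumed for illustration.

ELECTRON_CHARGE = 1.602e-19   # coulombs

def hall_voltage(current_a, field_t, carrier_density_m3, thickness_m):
    """Transverse Hall voltage across a strip carrying current in a
    perpendicular magnetic field."""
    return (current_a * field_t) / (carrier_density_m3 * ELECTRON_CHARGE * thickness_m)

# Example: 10 mA through a 1-micrometer-thick doped-semiconductor strip
# (assumed carrier density 1e22 per cubic meter) in a 0.1 T field.
v_h = hall_voltage(current_a=10e-3, field_t=0.1,
                   carrier_density_m3=1e22, thickness_m=1e-6)
print(f"Hall voltage ~ {v_h:.3f} V")
```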

Hall effect devices are commonly used as magnetic field sensors; alternatively, if a known magnetic field is applied, the sensor can be used to measure the current in a conductor without making electrical contact with it (‘‘contactless potentiometers’’). Hall sensors can also be used as magnetically controlled switches and as a contactless means of detecting rotation and position or sensing ferrous objects.

9. Infrared Detectors

Infrared detectors rely on the change of a physical characteristic to sense illumination by infrared radiation (i.e., radiation having a wavelength longer than that of visible light). The origins of such detectors lie in the nineteenth century, although their development, variety, and applications exploded during the twentieth century. William Herschel (c. 1800) employed a thermometer to detect this ‘‘radiant heat’’; Macedonio Melloni (c. 1850) invented the ‘‘thermochrose’’ to display spatial differences of irradiation as color patterns on a temperature-sensitive surface; and in 1882 William Abney found that photographic film could be sensitized to respond to wavelengths beyond the red end of the spectrum. Most infrared detectors, however, convert infrared radiation into an electrical signal via a variety of physical effects. Here, too, nineteenth century innovations continued in use well into the twenty-first century.

10. Integrated Circuits Design and Use

Integrated circuits (ICs) are electronic devices designed to integrate a large number of microscopic electronic components, normally connected by wires in circuits, within the same substrate material. According to the American engineer Jack S. Kilby, they are the realization of the so-called ‘‘monolithic idea’’: building an entire circuit out of silicon or germanium. ICs are made out of these materials because of their properties as semiconductors: materials that have a degree of electrical conductivity between that of a conductor such as a metal and that of an insulator (having almost no conductivity at low temperatures). A piece of silicon containing one circuit is called a die or chip; thus, ICs are also known as microchips. Advances in semiconductor technology in the 1960s (the miniaturization revolution) meant that the number of transistors on a single chip doubled every two years, which lowered microprocessor costs and led to the introduction of consumer products such as handheld calculators.
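
The doubling trend mentioned above can be illustrated with a short sketch. The 1971 starting point and the 2,300-transistor figure for the Intel 4004 are commonly cited values, used here only for illustration of the arithmetic.

```python
# Hedged illustration of the doubling trend described above: a transistor count
# that doubles every two years. The starting year and count are assumed,
# commonly cited figures, not data from the text.

def projected_transistors(start_year, start_count, target_year, doubling_period=2):
    """Project a count that doubles every `doubling_period` years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Example: starting from roughly 2,300 transistors in 1971, project two decades forward.
for year in (1971, 1981, 1991):
    print(year, int(projected_transistors(1971, 2300, year)))
```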

11. Integrated Circuits Fabrication

The fabrication of integrated circuits (ICs) is a complicated process that consists primarily of the transfer of a circuit design onto a piece of silicon (the silicon wafer). Using a photolithographic technique, the areas of the silicon wafer to be imprinted with electric circuitry are covered with glass plates (photomasks), irradiated with ultraviolet light, and treated with chemicals in order to shape a circuit’s pattern. On the whole, IC manufacture consists of four main stages:

  1. Preparation of a design
  2. Preparation of photomasks and silicon wafers
  3. Production
  4. Testing and packaging

Preparing an IC design consists of drafting the circuit’s electronic functions within the silicon board. This process has changed radically over the years due to the increasing complexity of design and the number of electronic components contained within the same IC. For example, in 1971 the Intel 4004 microprocessor was designed by just three engineers, while in the 1990s the Intel Pentium was designed by a team of 100 engineers. Moreover, the early designs were produced with traditional drafting techniques, while from the late 1970s onward the introduction of computer-aided design (CAD) techniques completely changed the design stage. Computers are used to check the design and simulate the operations of prospective ICs in order to optimize their performance. Thus, the drafted IC design can be modified up to 400 times before going into production.

12. Josephson Junction Devices

One of the most important implications of quantum physics is the existence of so-called tunneling phenomena, in which elementary particles are able to cross an energy barrier on subatomic scales that it would not be possible for them to traverse were they subject to the laws of classical mechanics. In 1973 the Nobel Prize in Physics was awarded to Brian Josephson, Ivar Giaever, and Leo Esaki for their work in this field. Josephson’s contribution consisted of a number of important theoretical predictions made while a doctoral student at Cambridge University. His work was confirmed experimentally within a year of its publication in 1962, and practical applications were commercialized within ten years.

13. Laser Applications

Lasers are employed in virtually every sector of the modern world including industry, commerce, transportation, medicine, education, science, and in many consumer devices such as CD players and laser printers. The intensity of lasers makes them ideal cutting tools since their highly focused beam cuts more accurately than machined instruments and leaves surrounding materials unaffected. Surgeons, for example, have employed carbon dioxide or argon lasers in soft tissue surgery since the early 1970s. These lasers produce infrared wavelengths of energy that are absorbed by water. Water in tissues is rapidly heated and vaporized, resulting in disintegration of the tissue. Visible wavelengths (argon ion laser) coagulate tissue. Far-ultraviolet wavelengths (higher photon energy, as produced by excimer lasers) break down molecular bonds in target tissue and ‘‘ablate’’ tissue without heating. Excimer lasers have been used in corneal surgery since 1984. Short pulses only affect the surface area of interest and not deeper tissues. The extremely small size of the beam, coupled with optical fibers, enables today’s surgeons to conduct surgery deep inside the human body often without a single cut on the exterior. Blue lasers, developed in 1994 by Shuji Nakamura of Nichia Chemical Industries of Japan, promise even more precision than the dominant red lasers currently used and will further revolutionize surgical cutting techniques.

14. Laser Theory and Operation

Lasers (an acronym for light amplification by stimulated emission of radiation) provide intense, focused beams of light whose unique properties enable them to be employed in a wide range of applications in the modern world. The key idea underlying lasers originated with Albert Einstein, who published a paper in 1916 on Planck’s distribution law, within which he described what happens when additional energy is introduced into an atom. Atoms have a heavy and positively charged nucleus surrounded by groups of extremely light and negatively charged electrons. Electrons orbit the atom in a series of ‘‘fixed’’ levels based upon the degree of electromagnetic attraction between each single electron and the nucleus. Various orbital levels also represent different energy levels. Normally electrons remain as close to the nucleus as their energy level permits, with the consequence that an atom’s overall energy level is minimized.

Einstein realized that when energy is introduced to an atom, for example through an atomic collision or through electrical stimulation, one or more electrons become excited and move to a higher energy level. This condition exists temporarily before the electron returns to its former energy level. When this decay occurs, a photon of light is emitted. Einstein understood that since the energy transitions within the atom are always identical, the energy and the wavelength of the stimulated photon of light are also predictable; that is, a specific type of transition within an atom will yield a photon of light of a specific wavelength. Hendrik Kramers and Werner Heisenberg obtained a series of more extensive calculations of the effects of these stimulated emissions over the next decade. The first empirical evidence supporting these theoretical calculations emerged between 1926 and 1930 in a series of experiments involving electrical discharges in neon.
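
The predictability Einstein identified can be stated compactly. The relation below is a standard textbook expression added here for reference; it is not taken from the original text.

```latex
% Energy and wavelength of the photon emitted when an electron drops from
% level E_2 to level E_1 (h is Planck's constant, c the speed of light).
\[
  E_{\text{photon}} = E_2 - E_1 = h\nu = \frac{hc}{\lambda}
  \qquad\Longrightarrow\qquad
  \lambda = \frac{hc}{E_2 - E_1}
\]
```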

15. Lasers in Optoelectronics

Optoelectronics, the field combining optics and electronics, is dependent on semiconductor (diode) lasers for its existence. Mass use of semiconductor lasers has emerged with the advent of CD and DVD technologies, but it is the telecommunications sector that has primarily driven the development of lasers for optoelectronic systems. Lasers are used to transmit voice, data, or video signals down fiber-optic cables.

While the success of lasers within telecommunication systems seems unquestioned thanks to their utility in long-distance, large-capacity, point-to-point links, these lasers also find use in many other applications and are ubiquitous in the developed world. Their small physical size, low-power operation, ease of modulation (via simple input current variation), and small beam size mean that these lasers are now part of our everyday world, from CDs and DVDs to supermarket checkouts and cosmetic medicine.

16. Light Emitting Diodes

Light emitting diodes, or LEDs, are semiconductor devices that emit monochromatic light when an electric current passes through them. The color of light emitted from an LED depends not on the color of the bulb but on the wavelength of the emission. Typically made of inorganic semiconductor materials such as gallium compounds, LEDs have found frequent use as ‘‘pilot,’’ or indicator, lights for electronic devices. Unlike incandescent light bulbs, which generate light from ‘‘heat glow,’’ LEDs create light more efficiently and are generally more durable than traditional light sources.
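
Because the emitted wavelength is fixed by the energy released inside the semiconductor, a rough sketch of the relation lambda ~ hc/E_g is given below. The bandgap figures for GaAs and GaP are approximate textbook values used here as assumptions.

```python
# Hedged sketch: approximate LED emission wavelength from the semiconductor
# bandgap, lambda ~ h*c / E_g. Bandgap values below are rough, assumed figures.

PLANCK = 6.626e-34       # J*s
LIGHT_SPEED = 2.998e8    # m/s
EV_TO_JOULE = 1.602e-19  # joules per electronvolt

def emission_wavelength_nm(bandgap_ev):
    """Approximate peak emission wavelength in nanometers."""
    return PLANCK * LIGHT_SPEED / (bandgap_ev * EV_TO_JOULE) * 1e9

for material, eg in [("GaAs (infrared)", 1.42), ("GaP (green)", 2.26)]:
    print(f"{material}: ~{emission_wavelength_nm(eg):.0f} nm")
```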

17. Lighting Techniques

In 1900 electric lighting in the home was a rarity. Carbon filament incandescent lamps had been around for 20 years, but few households had electricity. Arc lamps were used in streets and large buildings such as railway stations. Domestic lighting was by candle, oil and gas.

The evolution of lighting techniques can be divided into the following stages:

  1. Non-Electric Lighting
  2. Electric Lighting: Filament Lamps
  3. Electric Lighting: Discharge Lamps
  4. Electric Lighting: Fluorescent Lamps
  5. Electric Lighting: LED Lamps

18. Mechanical and Electromechanical Calculators

The widespread use of calculating devices in the twentieth century is intimately linked to the rise of large corporations and to the increasing role of mathematical calculation in science and engineering. In the business setting, calculators were used to efficiently process financial information. In science and engineering, calculators speeded up routine calculations. The manufacture and sale of calculators was a widespread industry, with major firms in most industrialized nations. However, the manufacture of mechanical calculators declined very rapidly in the 1970s with the introduction of electronic calculators, and firms either diversified into other product lines or went out of business. By the end of the twentieth century, slide rules, adding machines, and other mechanical calculators were no longer being manufactured.

19. Mobile (Cell) Telephones

In the last two decades of the twentieth century, mobile or cell phones developed from a minority communication tool, characterized by its prevalence in the 1980s among young professionals, to a pervasive cultural object. In many developed countries, more than three-quarters of the population owned a cell phone by the end of the twentieth century.

Cell phone technology is a highly evolved form of the personal radio systems used by truck drivers (citizens band, or CB, radio) and police forces in which receiver/transmitter units communicate with one another or a base antenna. Such systems work adequately over short distances with a low volume of traffic but cannot be expanded to cope with mass communication due to the limited space (bandwidth) available in the electromagnetic spectrum. Transmitting and receiving on one frequency, they allow for talking or listening but not both simultaneously.

For mobile radio systems to make the step up to effective telephony, a large number of two-way conversations needed to be accommodated, requiring a duplex channel (two separate frequencies, taking up double the bandwidth). In order to establish national mobile phone networks without limiting capacity or the range of travel of handsets, a number of technological improvements had to occur.
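
To illustrate the bandwidth arithmetic behind the duplex-channel requirement, the sketch below counts how many two-frequency conversations fit in a given allocation. The 25 MHz allocation and 30 kHz channel width are assumed, illustrative figures rather than values from any particular network.

```python
# Hedged illustration of the constraint described above: each two-way
# conversation needs a duplex pair of channels (uplink + downlink), so the
# available spectrum supports only half as many conversations as channels.

def duplex_channels(total_bandwidth_khz, channel_width_khz):
    """Number of simultaneous duplex conversations that fit in the allocation."""
    return int(total_bandwidth_khz // (2 * channel_width_khz))

# Example (assumed figures): 25 MHz of spectrum split into 30 kHz channels.
print(duplex_channels(total_bandwidth_khz=25_000, channel_width_khz=30))
```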

20. Photocopiers

The photocopier, copier, or copying machine, as it is variously known, is a staple of modern life. Copies by the billions are produced not only in the office but also on machines available to the public in libraries, copy shops, stationery stores, supermarkets, and a wide variety of other commercial facilities. Modern xerographic copiers, produced by a number of manufacturers, are available as desktop models suitable for the home as well as the small office. Many modern copiers reproduce in color as well as black and white, and office models can rival printing presses in speed of operation.

21. Photosensitive Detectors

Sensing radiation from ultraviolet to optical wavelengths and beyond is an important part of many devices. Whether the application involves analyzing emitted radiation or chemical solutions, detecting lidar signals, receiving fiber-optic communication signals, or imaging medical ionizing radiation, detectors are the final link in any optoelectronic experiment or process.

Detectors fall into two groups: thermal detectors, in which radiation is absorbed and the resulting temperature change is used to generate an electrical output, and photon (quantum) detectors. The operation of photon detectors is based on the photoelectric effect, in which the radiation is absorbed within a metal or semiconductor by direct interaction with electrons, which are excited to a higher energy level. Under the influence of an electric field these carriers move and produce a measurable electric current. Photon detectors show a selective, wavelength-dependent response per unit of incident radiation power.

22. Public and Private Lighting

At the turn of the twentieth century, lighting was in a state of flux. In technical terms, a number of emerging lighting technologies jostled for economic dominance. In social terms, changing standards of illumination began to transform cities, the workplace, and the home. In design terms, the study of illumination as a science, as an engineering profession, and as an applied art was becoming firmly established. In the last decades of the twentieth century, the technological and social choices in lighting attained considerable stability both technically and socially. Newer forms of compact fluorescent lighting, despite their greater efficiency, have not significantly replaced incandescent bulbs in homes owing to higher initial cost. Low-pressure sodium lamps, on the other hand, have been adopted increasingly for street and architectural lighting owing to lower replacement and maintenance costs. As with fluorescent lighting in the 1950s, recent lighting technologies have found niche markets rather than displacing incandescents, which have now been the dominant lighting system for well over a century.

23. Quantum Electronic Devices

Quantum theory, developed during the 1920s to explain the behavior of atoms and the absorption and emission of light, is thought to apply to every kind of physical system, from individual elementary particles to macroscopic systems such as lasers. In lasers, stimulated transitions between discrete or quantized energy levels are a quantum electronic phenomenon (discussed in the entry Lasers, Theory and Operation). Stimulated transitions are also the central phenomenon in atomic clocks. Semiconductor devices such as the transistor also rely on the arrangement of quantum energy levels into a valence band and a conduction band separated by an energy gap, but advanced quantum semiconductor devices were not possible until advances in fabrication techniques, such as molecular beam epitaxy (MBE) developed in the 1960s, made it possible to grow extremely pure single-crystal semiconductor structures one atomic layer at a time.

In most electronic devices and integrated circuits, quantum phenomena such as quantum tunneling and electron diffraction—where electrons behave not as particles but as waves—are of no significance, since the device is much larger than the wavelength of the electron (around 100 nanometers, where one nanometer is 10⁻⁹ meters, or about 4 atoms wide). Since the early 1980s, however, researchers have been aware that as the overall device size of field effect transistors decreased, small-scale quantum mechanical effects between components, plus the limitations of materials and fabrication techniques, would sooner or later inhibit further reduction in the size of conventional semiconductor transistors. Thus, to produce devices on ever-smaller integrated circuits (down to 25 nanometers in length), conventional microelectronic devices would have to be replaced with new device concepts that take advantage of the quantum mechanical effects that dominate on the nanometer scale, rather than functioning in spite of them. Such solid-state ‘‘nanoelectronics’’ offers the potential for increased speed and density of information processing, but mass fabrication on this small scale presented formidable challenges at the end of the twentieth century.
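
The ‘‘wavelength of the electron’’ mentioned above is its de Broglie wavelength. The standard relation is reproduced below for reference (it is not part of the original text); the actual value depends on the electron’s energy and its effective mass in the material.

```latex
% de Broglie wavelength of an electron with momentum p, kinetic energy E,
% and effective mass m* (h is Planck's constant).
\[
  \lambda = \frac{h}{p} = \frac{h}{\sqrt{2\,m^{*}E}}
\]
```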

24. Quartz Clocks and Watches

The wristwatch and the domestic clock were completely reinvented with all-new electronic components beginning about 1960. In the new electronic timepieces, a tiny sliver of vibrating quartz in an electrical circuit provides the time base and replaces the traditional mechanical oscillator, the swinging pendulum in the clock or the balance wheel in the watch. Instead of an unwinding spring or a falling weight, batteries power these quartz clocks and watches, and integrated circuits substitute for intricate mechanical gear trains.
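
A minimal sketch of the frequency division that the integrated circuit performs in place of a gear train follows. The 32,768 Hz crystal frequency is the figure commonly used in wristwatches (2 to the 15th power), chosen so that repeated halving reaches exactly one pulse per second.

```python
# Hedged sketch of the frequency-division role the integrated circuit plays in
# a quartz watch. The 32,768 Hz figure is the crystal frequency commonly used
# in wristwatches (2**15); repeated divide-by-two stages bring it to 1 Hz.

CRYSTAL_HZ = 32_768  # = 2 ** 15

def divider_stages(input_hz, target_hz=1):
    """Count how many divide-by-two stages bring input_hz down to target_hz."""
    stages = 0
    freq = input_hz
    while freq > target_hz:
        freq //= 2
        stages += 1
    return stages, freq

stages, out_hz = divider_stages(CRYSTAL_HZ)
print(f"{stages} divide-by-two stages -> {out_hz} Hz (one tick per second)")
```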

25. Radio-Frequency Electronics

Radio was originally conceived as a means for interpersonal communications, either person-to-person or person-to-people, using analog waveforms containing either Morse code or actual sound. The use of radio frequencies (RF) designed to carry digital data in the form of binary code rather than voice and to replace physical wired connections between devices began in the 1970s, but the technology was not commercialized until the 1990s through digital cellular phone networks known as personal communications services (PCS) and an emerging group of wireless data network technologies just reaching commercial viability. The first of these is a so-called wireless personal area network (WPAN) technology known as Bluetooth. There are also two wireless local area networks (WLANs), generally grouped under the name Wi-Fi (wireless fidelity): (1) Wi-Fi, also known by its Institute of Electrical and Electronics Engineers (IEEE) designation 802.11b, and (2) Wi-Fi5 (802.11a).

26. Rectifiers

Rectifiers are electronic devices that are used to control the flow of current. They do this by having conducting and nonconducting states that depend on the polarity of the applied voltage. A major function in electronics is the conversion of alternating current (AC) to direct current (DC), in which only one half (either positive or negative) of the input waveform is passed to the output. Rectifiers that are currently, or have been, in use include point-contact diodes, plate rectifiers, thermionic diodes, and semiconductor diodes. There are various ways in which rectifiers may be classified in terms of the signals they encounter; this contribution will consider two extremes—high frequency and heavy current—that make significantly different demands on device design.
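
A minimal sketch of the half-wave conversion described above follows, using an idealized diode that simply blocks the negative half-cycles. The 50 Hz, 10 V waveform is an assumed example.

```python
# Hedged sketch of half-wave rectification: an ideal diode passes only the
# positive half of an AC waveform. The waveform values are illustrative.
import math

def half_wave_rectify(samples):
    """Ideal rectifier: negative half-cycles are blocked (clamped to zero)."""
    return [max(0.0, v) for v in samples]

# One cycle of a 50 Hz, 10 V-peak sine wave, sampled every millisecond.
ac = [10.0 * math.sin(2 * math.pi * 50 * t / 1000) for t in range(20)]
dc = half_wave_rectify(ac)
print([round(v, 1) for v in dc])
```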

27. Strobe Flashes

Scarcely a dozen years after photography was announced to the world in 1839, William Henry Fox Talbot produced the first known flash photograph. Talbot, the new art’s co-inventor, fastened a printed paper onto a disk, set it spinning as fast as possible, and then discharged a spark to expose a glass plate negative. The words on the paper could be read on the photograph. Talbot believed that the potential for combining electric sparks and photography was unlimited. In 1852, he pronounced, ‘‘It is in our power to obtain the pictures of all moving objects, no matter in how rapid motion they may be, provided we have the means of sufficiently illuminating them with a sudden electric flash.’’

The electronic stroboscope fulfills Talbot’s prediction. It is a repeating, short-duration light source used primarily for visual observation and photography of high-speed phenomena. The intensity of the light emitted from strobes also makes them useful as signal lights on communication towers, airport runways, emergency vehicles, and more. Though ‘‘stroboscope’’ actually refers to a repeating flash and ‘‘electronic flash’’ denotes a single burst, both types are commonly called ‘‘strobes.’’

28. Transistors

Early experiments in transistor technology were based on the analogy between the semiconductor and the vacuum tube: the ability both to amplify and to effectively switch an electrical signal on or off (rectification). By 1940, Russell Ohl at Bell Telephone Laboratories, among others, had found that impure silicon had both positive (p-type material, with holes) and negative (n-type) regions. When a junction is created between n-type material and p-type material, electrons on the n-type side are attracted across the junction to fill holes in the other layer. In this way, the n-type semiconductor becomes positively charged and the p-type becomes negatively charged. Holes move in the opposite direction, thus reinforcing the voltage built up at the junction. The key point is that current flows from one side to the other when a positive voltage is applied to the layers (‘‘forward biased’’).
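
The asymmetry of the junction can be sketched with the ideal-diode (Shockley) equation; the saturation current and temperature below are assumed values, and the equation itself is a standard relation rather than something taken from the original text.

```python
# Hedged sketch (standard relation, assumed constants): the ideal-diode
# (Shockley) equation for current across a p-n junction, showing why current
# flows freely only when the junction is forward biased.
import math

BOLTZMANN = 1.381e-23   # J/K
CHARGE = 1.602e-19      # C

def junction_current(voltage_v, saturation_current_a=1e-12, temp_k=300.0):
    """Shockley ideal-diode equation: I = I_s * (exp(qV/kT) - 1)."""
    thermal_voltage = BOLTZMANN * temp_k / CHARGE   # ~0.026 V at room temperature
    return saturation_current_a * (math.exp(voltage_v / thermal_voltage) - 1.0)

# Reverse bias gives a vanishingly small leakage; forward bias gives large current.
for v in (-0.5, 0.0, 0.3, 0.6):
    print(f"{v:+.1f} V -> {junction_current(v):.3e} A")
```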

29. Travelling Wave Tubes

One of the most important devices for the amplification of radio-frequency (RF) signals— which range in frequency from 3 kilohertz to 300 gigahertz—is the traveling wave tube (TWT). When matched with its power supply unit, or electronic power conditioner (EPC), the combination is known as a traveling wave tube amplifier (TWTA). The amplification of RF signals is important in many aspects of science and technology, since the ability to increase the strength of a very low-power input signal is fundamental to all types of long-range communications, radar and electronic warfare.

30. Vacuum Tubes/Valves

The vacuum tube has its roots in the late nineteenth century when Thomas A. Edison conducted experiments with electric bulbs in 1883. Edison’s light bulbs consisted of a conducting filament mounted in a glass bulb. Passing electricity through the filament caused it to heat up and radiate light. A vacuum in the tube prevented the filament from burning up. Edison noted that electric current would flow from the bulb filament to a positively charged metal plate inside the tube. This phenomenon, the one-way flow of current, was called the Edison Effect. Edison himself could not explain the filament’s behavior. He felt this effect was interesting but unimportant and patented it as a matter of course. It was only fifteen years later that Joseph John Thomson, a physics professor at the Cavendish Laboratory at the University of Cambridge in the U.K., discovered the electron and understood the significance of what was occurring in the tube. He identified the filament rays as a stream of particles, now called electrons. In a range of papers from 1901 to 1916, O.W. Richardson explained the electron behavior. Today the Edison Effect is known as thermionic emission.
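
The thermionic emission that Richardson explained is usually summarized by the Richardson law relating emitted current density to cathode temperature. The expression below is a standard form included here for reference and is not drawn from the original text.

```latex
% Richardson (thermionic emission) law: J is the emitted current density,
% T the cathode temperature, W the work function of the emitting metal,
% k Boltzmann's constant, and A the Richardson constant.
\[
  J = A\,T^{2}\exp\!\left(-\frac{W}{kT}\right)
\]
```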

History of Electronics

Electronic systems in use today perform a remarkably broad range of functions, but they share the technical characteristic of employing electron devices such as vacuum tubes, transistors, or integrated circuits. Most electron devices in use today function as electric switches or valves, controlling a flow of electrons in order to perform useful tasks. Electron devices differ from ordinary electromechanical switches or current- or voltage-control devices in that an applied electric current or field controls electron flow rather than a mechanical device. Electronic devices are ‘‘active,’’ like machines, but have no moving parts, so engineers distinguish them both from electromechanical devices and from other ‘‘passive’’ electrical components such as wires, capacitors, transformers, and resistors. When the word electronics was coined around 1930, it usually referred to so-called vacuum tubes (valves), which utilize electrons flowing through a vacuum. With the advent of the transistor in the late 1940s, a second term emerged to describe this new category of ‘‘solid-state’’ electron devices, which performed some of the same functions as vacuum tubes but consisted of solid blocks of semiconducting material.

Few of the basic tasks that electronic technologies perform, such as communication, computation, amplification, or automatic control, are unique to electronics. Most were anticipated by the designers of mechanical or electromechanical technologies in earlier years. What distinguishes electronic communication, computation, and control is often linked to the instantaneous action of the devices, the delicacy of their actions compared to mechanical systems, their high reliability, or their tiny size.

The electronics systems introduced between the late nineteenth century and the end of the twentieth century can be roughly divided into the applications related to communications (including telegraphy, telephony, broadcasting, and remote detection) and the more recently developed fields involving digital information and computation. In recent years these two fields have tended to converge, but it is still useful to consider them separately for a discussion of their history.

The origins of electronics as distinguished from other electrical technologies can be traced to 1880 and the work of Thomas Edison. While investigating the phenomenon of the blackening of the inside surface of electric light bulbs, Edison built an experimental bulb that included a third, unused wire in addition to the two wires supporting the filament. When the lamp was operating, Edison detected a flow of electricity from the filament to the third wire, through the evacuated space in the bulb. He was unable to explain the phenomenon, and although he thought it would be useful in telegraphy, he failed to commercialize it. It went unexplained for about 20 years, until the advent of wireless telegraphic transmission by radio waves. John Ambrose Fleming, an experimenter in radio, not only explained the Edison effect but used it to detect radio waves. Fleming’s ‘‘valve,’’ as he called it, acted like a one-way valve for electric waves, and could be used in a circuit to convert radio waves to electric pulses so that incoming Morse code signals could be heard through a sounder or earphone.

As in the case of the Fleming valve, many early electronic devices were used first in the field of communications, mainly to enhance existing forms of technology. Initially, for example, telephony (1870s) and radio (1890s) were accomplished using ordinary electrical and electromechanical circuits, but eventually both were transformed through the use of electronic devices. Many inventors in the late nineteenth century sought a functional telephone ‘‘relay’’; that is, something to refresh a degraded telephone signal to allow long-distance telephony. Several people simultaneously recognized the possibility of developing a relay based on the Fleming valve. The American inventor Lee de Forest was one of the first to announce an electronic amplifier using a modified Fleming valve, which he called the Audion. While he initially saw it as a detector and amplifier of radio waves, its successful commercialization occurred first in the telephone industry. The sound quality and long-distance capability of telephony were enhanced and extended after the introduction of the first electronic amplifier circuits in 1907. In the U.S., where vast geographic distances separated the population, the American Telephone and Telegraph Company (AT&T) introduced improved vacuum tube amplifiers in 1913, which were later used to establish the first coast-to-coast telephone service in 1915 (an overland distance of nearly 5000 kilometers).

These vacuum tubes soon saw many other uses, such as public-address systems constructed as early as 1920, and radio transmitters and receivers. The convergence of telephony and radio in the form of voice broadcasting was technically possible before the advent of electronics, but its application was greatly enhanced through the use of electronics in both the radio transmitter and the receiver.

World War I saw the uses of electronics diversify somewhat to include military applications. Mostly, these were modifications of existing telegraph, telephone, and radio systems, but applications such as ground-to-air radio telephony were novel. The pressing need for large numbers of electronic components, especially vacuum tubes suitable for military use, stimulated changes in their design and manufacture and contributed to improving quality and falling prices. After the war, the expanded capacity of the vacuum tube industry contributed to a boom in low-cost consumer radio receivers. Yet because of the withdrawal of the military stimulus and the onset of the Great Depression, the pace of change slowed in the 1930s. One notable exception was in the field of television. Radio broadcasting became such a phenomenal commercial success that engineers and businessmen were envisioning how ‘‘pictures with sound’’ would replace ordinary broadcasting, even in the early 1930s. Germany, Great Britain, and the U.S. all had rudimentary television systems in place by 1939, although World War II would bring a nearly complete halt to these early TV broadcasts.

World War II saw another period of rapid change, this one much more dramatic than that of World War I. Not only were radio communications systems again greatly improved, but for the first time the field of electronics engineering came to encompass much more than communication. While the atomic bomb is most commonly cited as the major technological outcome of World War II, radar should probably be called the weapon that won the war. To describe radar as a weapon is somewhat inaccurate, but there is no doubt that it had profound effects upon the way that naval, aerial, and ground combat was conducted. Using radio waves as a sort of searchlight, radar could act as an artificial eye capable of seeing through clouds or fog, over the horizon, or in the dark. Furthermore, it substituted for existing methods of calculating the distance and speed of targets. Radar’s success hinged on the development of new electronic components, particularly new kinds of vacuum tubes such as the klystron and magnetron, which were oriented toward the generation of microwaves. Subsidized by military agencies on both sides of the Atlantic (as well as in Japan) during World War II, radar sets were eventually installed in aircraft and ships, used in ground stations, and even built into artillery shells. The remarkable engineering effort that was launched to make radar systems smaller, more energy efficient, and more reliable would mark the beginning of an international research program in electronics miniaturization that continues today. Radar technology also had many unexpected applications elsewhere, such as the use of microwave beams as a substitute for long-distance telephone cables. Microwave links are also used extensively today for satellite-to-earth communication.

The second major outcome of electronics research during World War II was the effort to build an electronic computer. Mechanical adders and calculators were widely used in science, business, and government by the early twentieth century, and had reached an advanced state of design. Yet the problems peculiar to wartime, especially the rapid calculation of mountains of ballistics data, drove engineers to look for ways to speed up the machines. At the same time, some sought a calculator that could be reprogrammed as computational needs changed. While computers played a role in the war, it was not until the postwar period that they came into their own. In addition, computer research during World War II contributed little to the development of vacuum tubes, although in later years computer research would drive certain areas of semiconductor electron device research.

While the forces of the free market are not to be discounted, the role of the military in electronics development during World War II was of paramount importance. More-or-less continuous military support for research in electronic devices and systems persisted during the second half of the twentieth century too, and many more new technologies emerged from this effort. The sustained effort to develop more compact, rugged devices such as those demanded by military systems would converge with computer development during the 1950s, especially after the invention of the transistor in late 1947.

The transistor was not a product of the war, and in fact its development started in the 1930s and was delayed by the war effort. A transistor is simply a very small substitute for a vacuum tube, but beyond that it is an almost entirely new sort of device. At the time of its invention, its energy efficiency, reliability, and diminutive size suggested new possibilities for electronic systems. The most famous of these possibilities was related to computers and systems derived from or related to computers, such as robotics or industrial automation. The impetus for the transistor was a desire within the telephone industry to create an energy-efficient, reliable substitute for the vacuum tube. Once introduced, the military pressed hard to accelerate its development, as the need emerged for improved electronic navigational devices for aircraft and missiles.

There were many unanticipated results of the substitution of transistors for vacuum tubes. Because they were so energy efficient, transistors made it much more practical to design battery powered systems. The small transistor radio (known in some countries simply as ‘‘the transistor’’), introduced in the 1950s, is credited with helping to popularize rock and roll music. It is also worth noting that many developing countries could not easily provide broadcasting services until the diffusion of battery operated transistor receivers because of the lack of central station electric power. The use of the transistor also allowed designers to enhance existing automotive radios and tape players, contributing eventually to a greatly expanded culture of in-car listening. There were other important outcomes as well; transistor manufacture provided access to the global electronics market for Asian radio manufacturers, who improved manufacturing methods to undercut their U.S. competitors during the 1950s and 1960s. Further, the transistor’s high reliability nearly eliminated the profession of television and radio repair, which had supported tens of thousands of technicians in the U.S. alone before about 1980.

However, for all its remarkable features, the transistor also had its limitations; while it was an essential part of nearly every cutting-edge technology of the postwar period, it was easily outperformed by the older technology of vacuum tubes in some areas. The high-power microwave transmitting devices in communications satellites and spacecraft, for example, nearly all relied on special vacuum tubes through the end of the twentieth century, because of the physical limitations of semiconductor devices. For the most part, however, the transistor made the vacuum tube obsolete by about 1960.

The attention paid to the transistor in the 1950s and 1960s made the phrase ‘‘solid-state’’ familiar to the general public, and the new device spawned many new companies. However, its overall impact pales in comparison to its successor—the integrated circuit. Integrated circuits emerged in the late 1950s, were immediately adopted by the military for small computer and communications systems, and were then used in civilian computers and related applications from the 1960s. Integrated circuits consist of multiple transistors fabricated simultaneously from layers of semiconductor and other materials. The transistors, interconnecting ‘‘wires,’’ and many of the necessary circuit elements such as capacitors and resistors are fabricated on the ‘‘chip.’’ Such a circuit eliminates much of the laborious process of assembling an electronic system such as a computer by hand, and results in a much smaller product. The ability to miniaturize components through integrated circuit fabrication techniques would lead to circuits so vanishingly small that it became difficult to connect them to the systems of which they were a part. The plastic housings or ‘‘packages’’ containing today’s microprocessor chips measure just a few centimeters on a side, and yet the actual circuits inside are much smaller. Some of the most complex chips made today contain many millions of transistors, plus millions more solid-state resistors and other passive components.

While used extensively in military and aerospace applications, the integrated circuit became famous as a component in computer systems. The logic and memory circuits of digital computers, which have been the focus of much research, consist mainly of switching devices. Computers were first constructed in the 1930s with electromechanical relays as switching devices, then with vacuum tubes, transistors, and finally integrated circuits. Most early computers used off-the-shelf tubes and transistors, but with the advent of the integrated circuit, designers began to call for components designed especially for computers. It was clear to engineers at the time that all the circuits necessary to build a computer could be placed on one chip (or a small set of chips), and in fact, the desire to create a ‘‘computer on a chip’’ led to the microprocessor, introduced around 1970. The commercial impetus underlying later generations of computer chip design was not simply miniaturization (although there are important exceptions) or energy efficiency, but also the speed of operation, reliability, and lower cost. However, the inherent energy efficiency and small size of the resulting systems did enable the construction of smaller computers, and the incorporation of programmable controllers (special purpose computers) into a wide variety of other technologies. The recent merging of the computer (or computer-like systems) with so many other technologies makes it difficult to summarize the current status of digital electronic systems. As the twentieth century drew to a close, computer chips were widely in use in communications and entertainment devices, in industrial robots, in automobiles, in household appliances, in telephone calling cards, in traffic signals, and in a myriad other places. The rapid evolution of the computer during the last 50 years of the twentieth century was reflected by the near-meaninglessness of its name, which no longer adequately described its functions.

From an engineering perspective, not only did electronics begin to inhabit, in an almost symbiotic fashion, other technological systems after about 1950, but these electronics systems were increasingly dominated by the use of semiconductor technology. After virtually supplanting the vacuum tube in the 1950s, the semiconductor-based transistor became the technology of choice for most subsequent electronics development projects. Yet semiconducting alloys and compounds proved remarkably versatile in applications at first unrelated to transistors and chips. The laser, for example, was originally operated in a large vacuum chamber and depended on ionized gas for its operation. By the 1960s, laser research was focused on the remarkable ability of certain semiconducting materials to accomplish the same task as the ion chamber version. Today semiconductor devices are used not only as the basis of amplifiers and switches, but also for sensing light, heat, and pressure, for emitting light (as in lasers or video displays), for generating electricity (as in solar cells), and even for mechanical motion (as in micromechanical systems or MEMS).

However, semiconductor devices in ‘‘discrete’’ forms such as transistors would probably not have had the remarkable impact of the integrated circuit. By the 1970s, when the manufacturing techniques for integrated circuits allowed high-volume production, low cost, tiny size, relatively small energy needs, and enormous complexity, electronics entered a new phase of its history, its chief characteristic being that electronic systems could be retrofitted into existing technologies. Low-cost microprocessors, for example, which were available from the late 1970s onward, were used to sense data from their environment, measure it, and use it to control various technological systems from coffee machines to video tape recorders. Even the human body is increasingly invaded by electronics; at the end of the twentieth century, several researchers announced the first microchips for implantation directly in the body. They were to be used to store information for retrieval by external sensors or to help deliver subcutaneous drugs. The integrated circuit has thus become part of innumerable technological and biological systems.

It is this remarkable flexibility of application that enabled designers of electronic systems to make electronics the defining technology of the late twentieth century, eclipsing both the mechanical technologies associated with the industrial revolution and the electrical and information technologies of the so-called second industrial revolution. While many in the post-World War II era once referred to an ‘‘atomic age,’’ it was in fact an era in which daily life was increasingly dominated by electronics.

