Technology And Social Control Research Paper


The last half of the twentieth century has seen a significant increase in the use of science and technology for purposes of social control. Examples include video and audio surveillance, heat, light, motion, sound and olfactory sensors, electronic tagging of consumer items, animals and humans, biometric access codes, drug testing, DNA analysis, and the use of computer techniques such as expert systems, matching and profiling, data mining, mapping, network analysis, and simulation. Control technologies have become available that previously existed only in the dystopic imaginations of science fiction writers. Many technologies developed for the military such as spy and global positioning satellites and night vision scopes have become commercially available. Six technology-based strategies of control and some social and ethical issues they raise are discussed.



As used here social control refers to efforts to enforce norms by preventing violations or discovering and apprehending violators, rather than to other aspects of social control such as the creation of norms, processes of adjudication and sanctioning, or the broad societal guidance and integration which was of concern to early theorists of industrialization and urbanization. The engineering of social control can be differentiated from related control forms such as the creation and manipulation of culture and socialization, the redistributive rewards of the welfare state and interpersonal influences.

The increased prominence of social control via engineering is related to concerns over issues such as crime, terrorism, drug abuse, border controls, AIDS, and economic competitiveness; and to technical developments in electronics, computerization, artificial intelligence, biochemistry, architecture, and materials science. The scale, mobility, and anonymity of mass society and ironically, increased expectations of and protections for privacy, have furthered reliance on external, impersonal, distance-mediated, secondary technical means, and database memories that locate, identify, register, record, classify, and validate or generate grounds for suspicion. The perception of catastrophic risks in an interdependent world relying on complex technologies and the entrepreneurial efforts of the security industry and governments such as the United States with its war on drugs, have helped spread the technologies internationally.




There is a magisterial, legitimacy-granting aura around both law and science (Ericson and Shearing 1986). Too often they serve as their own justification divorced from broader critical questions. Technological controls, presumably being science-based, are justified as valid, objective, neutral, universal, consensual, and fair. This view tends to overlook the fact that results are socially interpreted (and thus potentially disputable) and it overlooks the personal interests of control agents and the sectarian, socially constructed interests agents may represent. This legitimacy is strengthened in free-market societies where the tactics can often be used by citizens (e.g., video cameras to record police behavior or DNA analysis offered by a criminal defendant) and internally by police managers for guarding the guards.

Of course the inventors and builders of the first locks, safes, moats, and castles and the developers of early biometric identification systems (e.g., the Italian criminologist Cesare Lombroso, 1835–1909) were engaged in the engineering of social control. What are new are the scale and relatively greater scientific precision, continual invention and experimentation, and rapid global diffusion. Technical means of control saturate modern society, colonizing and documenting ever more areas of life.

The roots of contemporary social control lie in the development of large organizations and standardized control technologies (Beniger 1986). They are one strand of broad processes of rationalization, professionalization, and specialization occurring with modernization (Weber 1958, Rule 1973, Foucault 1977, Cohen 1985, Laudon 1986, Gandy 1993, Zuboff 1988, Lyon 1994, Shenhav 1999).

The ratio of machines as protectors, monitors, and controllers relative to humans continues to increase. Control has become softer and less visible partly because it is built-in (e.g., software that sends an Internet message or shuts down if it is misused or that monitors work—such as the number of keystrokes entered or the driving behavior of truckers), and partly because of more sophisticated uses of deception (complex undercover operations and disguised video cameras). Much contemporary control is better symbolized by manipulation than coercion, by computer chips than prison bars, and by remote and invisible tethers and filters than by handcuffs, straitjackets, and physical walls. Being more covert, embedded, and remote, it is often involuntary, occurring without the awareness or consent of its subject. Controllers are increasingly able to know things about subjects that the latter do not know about themselves, and to make decisions affecting their life chances of which they are unaware.

Contemporary social control has become more extensive and casts a much wider net than at mid-twentieth century, continuing an expansionary trend that began with industrialization and the rise of the nation-state. Control involves ever more integrated information-sharing networks, blurring many traditional institutional and organizational borders (e.g., within and between agencies and levels of government, banks, insurance, healthcare, education, work, telecommunications, and sales and marketing organizations). This data sharing is taken further in the United States than in Canada or Europe.

Technical controls are capital- rather than labor-intensive and hence the cost of control per unit of information has decreased. More objects and areas are subjected to inspection and control, and there is a broadening from the traditional targeting of a specific suspect to categorical suspicion (e.g., the computer search, video camera, or metal detector capture information on all those within their province) and from individuals to networks and organizations.

Control has also become more intensive, probing deeply beneath protective surfaces. Many contemporary controls transcend boundaries of distance, darkness, physical barriers, and time—factors that traditionally protected liberty, as well as malfeasance. Data can be easily stored, retrieved, combined (from different places and in different forms such as visual, auditory, print, and numerical), analyzed, and communicated. Control may be remote and deterritorialized, with buffers between controllers and those controlled. Control and knowledge of others’ behavior are no longer restricted to what the senses directly reveal through interaction or to a bounded physical place.

Contemporary social control themes are seen in their most extreme form in the maximum security prison (e.g., continuous surveillance, extensive use of computer dossiers, decisions about individuals based on actuarial data, the engineering of control). However, with the spread of such controls throughout the society, we may ask if we are not moving toward becoming a maximum security society (Marx 1988).

The surveillance society’s more omniscient and omnipresent technical social control is seen in a broad variety of rule enforcement contexts beyond public policing or prisons. The original eighteenth-century French notion of an all-knowing, absorbent political police (Brodeur 1983) to protect the state has become generalized across institutions and applied by new users and for new goals.

A reliance on technology for social control is a hallmark of private police who, in the United States (although not in Europe), far outnumber public police. It is also seen in the workplace, the marketplace, schools, and even among friends and family members. In the case of children, for example, there are at-home video and audio room monitors, geographic location devices for tracking teenagers in cars, drug tests, beepers, records of phone and computer use, and Internet content filters that censor what can be accessed. Consider also ‘intelligent highway systems’ or the ‘smart homes’ that are appearing on the market, in which all data flows into, and out of, the home are part of an integrated, continuously monitored system.

In an engineered society, the goal is to eliminate or limit violations by control of the physical and social environment. As in other areas of social intervention such as public health, there is a strong emphasis on prevention. Ideally problems are anticipated and simply designed away (Sects. 1.1 and 1.4), or where that is not possible, the goal is to create deterrence by reducing the gain or increasing the likelihood of identification and apprehension (Sects. 1.2, 1.3, 1.5, and 1.6) (Marx 1995).

1. Six Social Engineering Strategies

1.1 Target Removal

The logic of prevention is clearest and most effective here. Something that is not there cannot be taken. The move toward a cashless society is one example. Merchants who only accept credit or debit cards, or whose registers never hold more than a modest amount of cash, are unlikely to be conventionally robbed. Furniture built into the wall cannot be stolen. Subway and bus exteriors built with graffiti-resistant metals are hard to draw upon. Through software programming, computers and telephones can be blocked from sending or receiving messages to, or from, selected locations.
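A minimal sketch of such software-level blocking is given below in Python; the blocklist entries and function name are illustrative assumptions rather than a description of any actual product.

```python
# Minimal sketch: refuse to transmit messages to or from selected addresses.
# The blocklist contents and function name are hypothetical.

BLOCKED_ADDRESSES = {"+1-900-555-0100", "badhost.example.org"}

def may_transmit(sender: str, recipient: str) -> bool:
    """Allow transmission only if neither endpoint is on the blocklist."""
    return sender not in BLOCKED_ADDRESSES and recipient not in BLOCKED_ADDRESSES

if __name__ == "__main__":
    print(may_transmit("alice@example.com", "bob@example.com"))      # True: allowed
    print(may_transmit("alice@example.com", "badhost.example.org"))  # False: blocked
```

The same gatekeeping pattern underlies call blocking, e-mail filtering, and firewall rules: the violation is prevented because the prohibited message simply never reaches its target.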

1.2 Target Devaluation

Here the goal is to reduce or eliminate the value of a potential target to anyone but authorized users. The target remains, but its uselessness makes it unattractive to predators. Examples include products which self-destruct, as with some car radios when stolen, or which leave clear proof of theft, as with exploding red dye packs that stain money taken in bank robberies. Encrypted messages can often be intercepted easily; without the decryption key, however, the data are useless. Telephones, computers, automobiles, and even guns are increasingly available which can only be used with access devices such as a unique biometric identifier (e.g., retinal, voice, or geometric hand pattern), card, or access code.
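As a small illustration of devaluation through encryption, the sketch below (assuming the third-party Python cryptography package and an invented message) shows that an intercepted ciphertext is worthless to anyone who lacks the key.

```python
# Sketch of target devaluation via encryption: interception without the key
# yields only opaque bytes. Requires the third-party `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # held only by authorized parties
cipher = Fernet(key)

token = cipher.encrypt(b"account 4421: transfer $500")  # hypothetical message
print(token)                     # what an eavesdropper sees: unreadable ciphertext
print(cipher.decrypt(token))     # only the key holder recovers the plaintext
```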

1.3 Target Insulation

With this ancient technique the object of desire remains, but it is protected. Perimeter-maintaining strategies such as fences, walls, moats, guards, and guard dogs can be separated from more specific protections surrounding an object such as safes, chastity belts, goods that are in locked cases or chained to an immovable object, and the hiding or disguising of valuables. High-security, gated communities in which access and egress are carefully controlled, and the use of networked sensors, alarms, and in some cases even Internet video of public areas, are becoming more common. The architectural development of ‘skywalks’ linking downtown private buildings creates ‘sanitary zones’ more subject to control than the potentially disorderly public streets below.

1.4 Offender Incapacitation

This classic strategy seeks to render potential offenders harmless with respect to the will or ability to violate the norm in question. The means may act directly on the body by permanently altering it and making certain offenses impossible—literal or chemical castration for sex offenders, cutting off the hands of thieves. Passivity may be created by tranquilizers and other medications such as Depo-Provera, or by psychosurgery for the violent. A variety of nonlethal restraining or blocking devices, from pepper spray to straitjackets to a net fired over a disruptive person, are available. Related efforts deal not with the body of the offender but with the instrumentalities involved in the offense. The goal is to render useless or unavailable something that is essential for the violation. Examples include anti-drunk-driving interlock systems that require passing a breath-analyzer test wired to the automobile ignition before the car will start; limiting gun purchases to those who have passed a computerized eligibility check (e.g., no felony conviction); not permitting adolescents to purchase magic markers that can be used for graffiti; and mixing a bad-smelling chemical into a product to prevent it from being inhaled for its hallucinatory effects.
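The logic of such instrumentality controls reduces to a simple gate, as the sketch below suggests; the 0.08 threshold and the function name are illustrative assumptions, not a description of any actual interlock product.

```python
# Sketch of instrumentality control: the engine may start only if a breath
# sample passes. The threshold and names are hypothetical.

LEGAL_BAC_LIMIT = 0.08  # assumed blood-alcohol threshold

def ignition_permitted(breath_sample_bac: float) -> bool:
    """Permit ignition only when the measured BAC is below the limit."""
    return breath_sample_bac < LEGAL_BAC_LIMIT

print(ignition_permitted(0.02))  # True: engine may start
print(ignition_permitted(0.11))  # False: ignition remains locked
```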

1.5 Exclusion

Potential offenders have traditionally been kept away from targets or tempting environments by exile, prison, curfew, and place or activity exclusions (e.g., juveniles barred from bars, or an abusive ex-husband barred from a former spouse’s home). The ultimate form is capital punishment. A related form is the visible warning offered by a stigma such as the brand or clipped ear of offenders in medieval Europe, which encouraged others to stay away.

Electronic monitoring or location devices based on Global Positioning Satellites are contemporary examples. In one form, alarms go off and messages are sent to authorities if an adjudicated person wearing a transmitter gets too close to a prohibited person or area, or leaves an area to which they are restricted. With the human genome project completed, eugenics will become a contentious issue. For example, the belief (which ignores interactions with the environment and the socially crafted character of most rules) that DNA is linked to violence and other antisocial behavior could generate another ultimate form of exclusion—requiring a license indicating an ‘acceptable’ genetic pattern before a child could be born.
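A hedged sketch of such a location check, written here as a simple geofence test, follows; the coordinates, the 500 m radius, and the alerting behavior are illustrative assumptions only.

```python
# Sketch of GPS-based exclusion: alert if a monitored transmitter enters a
# prohibited radius around a protected location. All values are hypothetical.
from math import atan2, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in meters."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return EARTH_RADIUS_M * 2 * atan2(sqrt(a), sqrt(1 - a))

PROTECTED_LOCATION = (42.3601, -71.0589)   # assumed protected person's position
EXCLUSION_RADIUS_M = 500

def check_position(lat, lon):
    if distance_m(lat, lon, *PROTECTED_LOCATION) < EXCLUSION_RADIUS_M:
        print("ALERT: monitored transmitter inside exclusion zone")

check_position(42.3605, -71.0580)  # roughly 90 m away: triggers the alert
```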

1.6 Offense, Offender, And Target Identification

Where it is not actually possible to prevent the violation physically, or where that is too expensive, it may be possible at least to know that it took place and perhaps who is responsible and where they are. The goal is to document its occurrence and identify the violator. A major goal of nineteenth-century forensic science was to develop reliable biometric measures of identity based on the analysis of fingerprints, facial measurements, and chemical properties (Thorwald 1965). One technique used by the former East Germany involved identifying individuals by their unique odors. Architectural design emphasizing visibility as a deterrent fits here (Newman 1972), as do video, audio, motion, and heat detection means and access codes that are presumed to document who enters an area or is using a resource such as a computer. Hand-activated personal alarm systems, a luggage alarm that goes off if a purse or suitcase is illegitimately moved or opened, and the electronic tagging of consumer items or expensive tools at work which give off an alarm if wrongly removed, are other examples.

New information technologies have made it possible not only to watch everyone, but for everyone to be a watcher. This greater ease of mobilizing the law by involving citizens in social control is characteristic of the Anglo-American police tradition, although much less so of the rest of Europe. Citizens are encouraged to use hot lines to report infractions (e.g., erratic highway drivers, drug dealing, poaching, or ‘whistle-blowing’ regarding organizational malfeasance) via cell and traditional telephones and e-mail. The police use mass communications media to help identify and locate wanted persons by posting warrant information on web sites and airing crime re-enactments on television.

2. Some Other Social Control Dimensions

The six ‘ideal type’ concepts above are based on combining several aspects, such as whether the focus is on the potential offender, a resource used in the violation, the offense, or the target of an offense, and whether the action involves removal, exclusion, devaluation, insulation, incapacitation, or identification. Such ideal types can be useful as a shorthand for classification and comparison. Yet they also distort by combining sources of variation that can be analyzed separately.

Another approach starts with single dimensions which may cut across the different forms. Other relevant dimensions for classification include: visibility or invisibility; openness or secrecy regarding use of the tactic, both generally and specifically; whether control is over access into or out of a system; whether the focus is on an individual, an organization, or a network; a focus on the body, consciousness, or the environment; the degree of reliability and validity and the relative cost of errors and mistakes; the relative availability of a tactic and the ease or difficulty of neutralizing or using it; and the presence or absence of democratic decision-making and review processes regarding the adoption and application of a tactic.

There has historically been much more research on those who violate rules than on those who enforce them. We know little about the prior social correlates and consequences of the above dimensions. While anyone who travels will see a certain international standardization of technical control (e.g., at airports, national borders, banks, nuclear power plants, or with telecommunications), there is still much local variation. Distinctive histories, social organization, and cultures mean that sweeping generalizations across all technologies and countries are unwarranted (Marx 1995). Singapore, for example, appears to have taken electronic surveillance the furthest, while Japan and China are far behind (if for different reasons). In the United States drug testing, undercover operations, citizen informers, and consumer and work monitoring are widespread, while in Europe these are seen much less often. Great Britain is the world leader in video surveillance and Germany has been among the most innovative in computer matching and profiling. The Scandinavian countries make the most extensive use of social census data as part of their welfare states. France, Italy, and Spain have been generally slower than Anglo-American countries to adopt the new technologies. While the polygraph is rarely used in Europe because of doubts about its validity, handwriting analysis has credibility in France that is lacking in most other countries.

3. Some Social And Ethical Implications

However ideal a technical control system may appear in the abstract from the viewpoint of those advocating it, the world of application is often much messier and more complicated. The search for the perfect technical fix is like the donkey forever chasing the carrot suspended in front of it. To the person with a hammer everything may look like a nail. The technology’s narrowing of focus may come at a cost of failing to see larger systemic contexts, alternatives, and longer-range consequences.

The complexity and fluidity of human situations make this a rich area for the study of trade-offs, irony, and paradox. There are some parallels to iatrogenic medical practices in which one problem is cured, but at the cost of creating another. Technical efforts to ensure conformity may be hindered by conflicting goals, unintended consequences, displacement, lessened equity, complacency, neutralization, invalidity, escalation, system overload, a negative image of personal dignity, and the danger of the means determining or becoming ends.

Bars over windows to keep out thieves may prevent occupants from escaping through the window in the event of a fire. In commercial settings where access to merchandise is important, attaching expensive clothes (e.g., leather jackets) to a rack with a locked cable reduces the likelihood that an item will be stolen, but also complicates trying clothes on and impulse buying.

The discovery that a target has been rendered useless to an offender may increase violence, whether as a resource to gain the needed access, or out of frustration. For example, the appearance of ‘car-jacking’ is related to more sophisticated antitheft devices on cars. The use of access codes to activate autos and appliances may mean that the crime of burglary is converted to robbery or kidnapping, as thieves confront property owners and demand not only the property, but the code to make it work. A frustrated thief may respond to a self-destruct automobile radio by firebombing the car.

While the rule breaking in question may be blocked, the problem may be displaced rather than disappear, or new problems may be created. In efforts to keep the homeless away from heating vents or entrances to subway systems, a variety of grates and barriers have been developed. These do nothing to solve the issue of homelessness, but move it elsewhere. Some persons manage to get off heroin, but only by becoming addicted to methadone.

Equity issues are raised when displacement involves a shift to new victims. If relatively effective technical solutions are commercialized, as with hidden transmitters embedded in cars that permit locating them by remote activation, or gated communities that keep out would-be thieves, predators may focus greater attention on those unable to afford enhanced levels of security.

A related equity issue involves who has access to the technology and its results. Video cameras and desktop computers are potentially egalitarian in their low cost, ease of use, and widespread availability. But other technologies such as satellites and highly sophisticated computer programs are disproportionately available to the most powerful. There is the possibility of societies becoming even more stratified, based on unequal access to information, in which individuals live in glass houses while the external walls of large organizations, whether government or private, are one-way mirrors.

Whether out of self-interested rule breaking or human contrariness, individuals can be very creative in neutralizing systems of control. Faith in the technology and its routinization may create complacency in control agents and a predictability that undermines effectiveness. That locks open with keys and borders require access points means they are eternally vulnerable.

The initial anti-drunk driving car interlock systems could be beaten by saving air in a balloon or by having someone else blow into it to start the car. A variety of means are available for beating drug tests—from contaminating the urine with bleach on one’s hand to using a catheter to insert drug-free urine into the body. Dogs in heat have been used as antidotes to male guard dogs and debugging devices help discover hidden surveillance.

In a free-market economy new controls create incentives to develop means of neutralization, whether legally or available through the black market. Police use of radar detectors against speeders was soon followed by antiradar detection devices. Police in turn developed a tool for detecting when a driver was using the latter. The guilty may now face charges for the secondary or derivative offense of possession, even if they were not speeding. Not long after antitheft ignition protection systems appeared on automobiles, a device that permitted bypassing the lock appeared.

When systems cannot be technically defeated, as with very sophisticated encryption, then their human context may be compromised, whether through coercion or deception. For example, a thief who could not break a manufacturer’s sophisticated encryption code, nevertheless managed to embezzle millions of dollars through generating fake invoices. He did this by having an affair with the individual who had the decryption codes.

There are also important questions around validity and reliability. Even if a measure is empirically valid, that does not guarantee a socially meaningful result. Thus, a DNA match between material from a crime scene and a suspect cannot reveal if a death resulted from a homicide or self-defense. The sample might also have been planted, or a secure chain of evidence custody not maintained. A computer match between persons on welfare and those with bank accounts may reveal a person over the savings limit, but that is not proof of cheating, since funds may be held in trust for a funeral—something legally permitted, but not built into the computer program. Audio and video recordings may reflect what was done and said, but will not necessarily reveal why, or what a suspect intended. Seeing should not automatically mean believing. Thus, a suspect in an undercover scheme may have been threatened or entrapped off camera. A threat or seeming admission of a crime may have been said in jest or as boasting. Nor is a drug test, even if ‘valid’ in indicating the presence of drugs within a person’s system, a necessary indication of a violation. Depending on the assessment used, if the standard is set low enough it is possible to have a positive reading as a result of just being in a room where marijuana is being smoked (a false positive). Conversely, justice issues involving false negatives may also be raised if the threshold is set too high. Drug tests may also be distorted by what an individual has eaten (e.g., poppy seeds) or by some over-the-counter medications. Tests may be of doubtful validity to begin with. Concern over the validity of the polygraph led the United States Congress to greatly restrict its use (although that led to an increase in paper-and-pencil honesty tests whose validity has also been questioned).
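The welfare-and-bank-account example can be made concrete with a toy matching sketch; the names, balances, and savings limit below are invented, and the closing comment marks the crucial caveat that a match is only a lead, not a finding of guilt.

```python
# Sketch of computer matching: flag welfare recipients whose bank balance
# exceeds a savings limit. All data and the limit are hypothetical; a flag is
# grounds for review, not proof of cheating (funds may, e.g., be legally held
# in trust for a funeral).

welfare_rolls = {"A. Jones", "B. Smith", "C. Lee"}
bank_balances = {"A. Jones": 450.00, "B. Smith": 3_200.00, "D. Park": 9_000.00}
SAVINGS_LIMIT = 2_000.00

flagged = [person for person in welfare_rolls
           if bank_balances.get(person, 0.0) > SAVINGS_LIMIT]

print(flagged)  # ['B. Smith'] -- a lead to investigate, not a finding of guilt
```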

New control techniques may be turned against control agents. While authorities may have an initial advantage, this is often short-lived. Thus, more powerful armor, bulletproof vests, and sophisticated communication systems are no longer the sole property of police. There may be something of an escalating domestic arms race in which the interaction becomes more sophisticated, but the fundamental dynamic does not change.

Documentation of all infractions may overload the control system. This may lower morale among enforcers who feel overwhelmed, or offer corrupt officials a resource (nonenforcement) to market. Since resources for acting on all the information may not be available, authorities may be accused of discriminatory enforcement.

Even if adequate resources for full enforcement action were available, organizational effectiveness could be harmed. Automatic technical solutions developed without adequate appreciation of complexity and contingency run the risk of eliminating the discretion, negotiation, compromise, and informal understandings that are often central to morale and the effective working of organizations (Dalton 1959, Goffman 1961).

If technical solutions could somehow be effective at eliminating all rule breaking (setting aside the conflict between, ambiguity in, and lack of consensus on many rules), there could be some unexpected costs. Systems might become too rigid and unable to change. Much innovation is initially seen as deviance. Experimentation and risk taking can be aided by anonymity and secrecy. A socially transparent, engineered society would be more orderly, but likely less creative, dynamic, and free.

If order depended primarily on technical means of blocking infractions, rather than on legitimacy, how would people behave when the means failed, as at some points they invariably would? A social order based primarily on technical fixes is likely to be as fragile over time as one based primarily on overt repression.

Even if systems could somehow be made fool- and fail-proof with ever more, and more advanced, technology, there is a danger of viewing humans as robots, rather than as creative beings with a conscience, capable of making choices about how they behave. The former image is inconsistent with belief in the dignity of the autonomous individual in a democratic society. Whatever a technology is capable of, the view of humans as volitional (and hence responsible for their behavior) and beliefs about the inviolability (absent clear justification) of the borders that protect private personal zones around one’s body, mind, relationships, communications, physical space, and past history are central to ideas of respect for personhood.

The search for stand-alone mechanical solutions also avoids the need to ask why some individuals break the rules and points away from examining the social conditions which may contribute to violations and the possibility of changing those conditions. Technical solutions seek to bypass the need to create consensus and a community in which individuals act responsibly as a result of voluntary commitment to the rules, not because they have no choice, or only out of fear of reprisals.

A well-known expression, if often naïve given social inequality, holds that where there is a will there is a way. This speaks to the role of human effort in obtaining goals. With the control possibilities made available by science and technology, this may be reversed: where there is a way there is a will. As the myth of Frankenstein implies, we must be ever vigilant to be sure that we control the technology rather than the reverse. As Ellul (1964) argues, there is a danger of self-amplifying technical means silently coming to determine the ends, or even becoming ends in themselves, divorced from a vision of, and the continual search for, the good society.

Bibliography:

  1. Altheide D 1975 The irony of security. Journal of Urban Life 4: 175–95
  2. Andreas P 2000 Border Games: Policing the US–Mexico Divide. Cornell University Press, Ithaca, NY
  3. Beniger J 1986 The Control Revolution: The Technological and Economic Origins of the Information Society. Harvard University Press, Cambridge, MA
  4. Bennett C, Grant R (eds.) 1999 Visions of Privacy: Policy Choices for the Digital Age. University of Toronto Press, Toronto, ON
  5. Brodeur J 1983 High policing and low policing: remarks about the policing of political activities. Social Problems 30: 507–20
  6. Byrne J, Lurigio A, Petersilia J (eds.) 1992 Smart Sentencing: The Rise of Intermediate Sanctions. Sage, Beverly Hills, CA
  7. Clarke R (ed.) 1997 Situational Crime Prevention: Successful Case Studies. Harrow and Heston, New York
  8. Clarke R A 1988 Information technology and dataveillance. Communications of the ACM 31: 498–512
  9. Cohen S 1985 Visions of Social Control. Polity Press, Cambridge, UK
  10. Dalton M 1959 Men who Manage. Wiley, New York
  11. Ellul J 1964 The Technological Society. Knopf, New York
  12. Ericson R, Haggerty K 1997 Policing the Risk Society. University of Toronto Press, Toronto
  13. Ericson R, Shearing C 1986 The scientification of police work. In: Bohme G, Stehr N (eds.) The Knowledge Society: The Growing Impact of Scientific Knowledge on Social Relations. Reidel, Dordrecht
  14. Foucault M 1977 Discipline and Punish: The Birth of the Prison. Vintage, New York
  15. Gandy O 1993 The Panoptic Sort: Towards a Political Economy of Information. Westview Press, Boulder, CO
  16. Gibbs J 1989 Control: Sociology’s Central Notion. University of Illinois Press, Urbana, IL
  17. Goffman E 1961 Asylums. Anchor Books, Garden City, NY
  18. Graham S, Marvin S 1996 Telecommunications and the City: Electronic Spaces, Urban Places. Routledge, London
  19. Janowitz M 1975 Sociological theory and social control. American Journal of Sociology 81: 82–108
  20. Laudon K 1986 The Dossier Society: Value Choices in the Design of National Information Systems. Columbia University Press, New York
  21. Lowman J, Menzies R, Palys T (eds.) 1987 Transcarceration: Essays in the Sociology of Social Control. Gower, Aldershot, UK
  22. Lykken D 1981 A Tremor in the Blood: Uses and Abuses of the Lie Detector. McGraw Hill, New York
  23. Lyon D 1994 The Electronic Eye. Polity Press, Cambridge, UK
  24. Lyon D, Zureik E (eds.) 1996 Computers, Surveillance and Privacy. University of Minnesota Press, Minneapolis, MN
  25. Manning P 1992 Information technology and the police. In: Tonry M, Morris N (eds.) Modern Policing. University of Chicago Press, Chicago
  26. Marx G T 1988 Undercover: Police Surveillance in America. University of California Press, Berkeley, CA
  27. Marx G T 1995 The engineering of social control: The search for the silver bullet. In: Hagan J, Peterson R (eds.) Crime and Inequality. Stanford University Press, Stanford, CA
  28. Marx G T 1995 Undercover in comparative perspective: Some implications for knowledge and social research. In: Fijnaut C, Marx G (eds.) Undercover: Police Surveillance in Comparative Perspective. Kluwer Law International, The Hague, The Netherlands
  29. Newman O 1972 Defensible Space. MacMillan, New York
  30. Norris C, Moran J, Armstrong G (eds.) 1998 Surveillance, Closed Circuit Television and Social Control. Ashgate, Aldershot, UK
  31. Office of Technology Assessment 1985 Federal Government Information Technology: Electronic Surveillance and Civil Liberties. USGPO, Washington, DC
  32. Perrow C 1984 Normal Accidents. Basic Books, New York
  33. Rule J 1973 Private Lives, Public Surveillance. Allen-Lane, London
  34. Shenhav Y 1999 Manufacturing Rationality: The Engineering Foundations of the Modern Managerial Revolution. Oxford University Press, New York
  35. Sherman L 1992 Attacking crime: Policing and crime control. In: Tonry M, Morris N (eds.) Modern Policing. University of Chicago Press, Chicago
  36. Blakely E, Snyder M 1997 Fortress America: Gated Communities in the United States. Brookings Institution, Washington, DC
  37. Tenner E 1997 Why Things Bite Back: Technology and the Revenge of Unintended Consequences. Knopf, New York
  38. Thorwald J 1965 The Century of the Detective. Harcourt Brace & World, New York
  39. Web: web.mit.edu/gtmarx/www/garyhome.html
  40. Weber M 1958 From Max Weber: Essays in Sociology. Oxford University Press, New York
  41. Zuboff S 1988 In the Age of the Smart Machine. Basic Books, New York