Public Health Research Paper


Calling to mind Pasteur, his students, and his successors, Frederic Gorham (1921) wrote, ‘In fifty short years, the world has been transformed by their discoveries.’ A ‘blissful optimism’ surrounded public health. In the following decades, the advent of DDT, sulfa drugs, penicillin, and newer antibiotics would endow public health doctors with unshakeable self-confidence. The rapid expansion of health care systems during the 1960s, in contrast, signaled a decline for public health. Then the ‘comeback’ of infectious diseases, the prevalence of chronic illnesses, the increasing number of ‘complex emergencies’ (famines, wars, ‘ethnic cleansing,’ natural catastrophes), the growth of inequality, the development of resistance to proven drugs, the proliferation of technological risks, economic recession, and the spread of poverty all proved the need for stronger structures and political support, especially for monitoring and controlling threats to public health.



In the nineteenth century, sanitarians saw poverty as a product of disease. In the twentieth century, social medicine reversed the relation by affirming that destitution and exploitation bred disease. Although there is hardly any doubt that malaria, to take one example, is a cause and not an effect of underdevelopment (in Kenya and Nigeria, it costs from two to six percent of the GNP), can we not see the aggressive return of poverty in industrial societies since the 1980s or the ongoing impoverishment of the Third World as substantial evidence that increasing poverty is the major factor in current health crises?

Trends in health policy closely depend on the relation between the requirements of social justice and the objectives of political authorities. Debate about the utopia of peaceful cooperation among citizens and among nations, along with the political decision to provide for national security—of which ‘health security’ is a part—has marked public health’s history. What is original at the end of the twentieth century is that public health has been drawn off the sidelines of charity, its traditional quarters, and placed center field in world politics and nation-state strategies.




1. Health As A Local Concern

Under exceptional circumstances, public health rises to the rank of a duty for centralized states. Such has been the case when the purpose was nation building or the protection of the national territory and its population. But under usual circumstances, public health has been a local matter.

Owing to these deep local roots, the philosophical grounds for public health have been the science and art of preventing illness and of furthering the health and well-being of all through an organized community effort. According to C. E. A. Winslow (1923), public health involves far more than granting health authorities police powers; it is ‘not a concrete intellectual discipline, but a field of social activity.’ This is still an accepted notion. In 1936, certain thinkers maintained that medicine was not a branch of the natural sciences and that, if it was a science at all, it was a social science. This position was decisive not only because it sparked debate about the history of health (as part of social history or of the history of science?) but also because it foreshadowed another debate, namely, the current controversy between public health as a social science and epidemiology as a biomedical science. Public health is a practical science halfway between the laboratory and public administration. Public health doctors must be, all at once, experts in the field of prevention, judges in matters of community health, and leaders. More than just clinicians, they are (in Winslow’s words) scientists, administrators, and statesmen.

At present, there is no consensus about public health, about its duties, jurisdiction, and qualifications. Historically, these have seldom been clearly defined in relation to medicine, bacteriology, epidemiology, and laboratories—a sensitive question in tropical medicine. Starting in 1948, and even more so as of 1965 in the United States, the gap between biomedical research (National Institutes of Health), epidemiology (Centers for Disease Control), and public health (US Public Health Service) has never stopped widening.

Public health has been understood as a government duty—by revolutionaries in 1848 or the architects of state medicine in Great Britain. This is still a generally accepted idea, even in the United States, despite opposition in the past not just from doctors but from hygienists themselves. Despite a chronic staff shortage, public health in the USA still provides a wide array of services coordinated by federal, state, and local governments. As the Secretary of Health, Education and Welfare declared in 1979, these health services represent ‘a governmental presence at the local level.’ Public health’s history has been marked by swings back and forth between a narrow focus on specific tasks and a broad public duty, between preventive and curative medicine, between hygienists and doctors.

Noting the epidemiological transition underway during the 1920s, Winslow wanted public health to restore the broken link between preventive and curative medicine and to infuse students and practitioners with a ‘spirit of prevention.’ At that time, medical schools in Scandinavia and, of course, the Soviet Union offered courses in social medicine. In contrast, the 1933 declaration of the Committee on Administrative Practice of the American Public Health Association (of which Winslow had been the first president) deliberately split the field of health in two. Offices of hygiene were assigned the tasks of fighting contagious diseases and protecting children’s health, as reflected in the new definition of the ‘basic six’ duties of local health institutions (statistics, sanitation, contagious diseases, laboratories, maternal and child care, health education) and in the national plan (‘Local health units for the nation’) published by the aforementioned committee in 1940 and 1945.

Public health still has far to go to resolve its ambiguous assignment. Basing their argument on the variety and multiplicity of health determinants, the authors of WHO’s 1999 Report stated that public health doctors should measure the effects on health of factors lying outside the field of medicine. Yet in the same breath, they admitted that these doctors had quite limited qualifications, jurisdiction, and credibility outside their field, and that they would do better to concentrate their energy within it.

2. The People’s War For Health

Public health cannot deny that it leaned for a long time toward social reform—specifically, toward sociology at a time when the latter was more a form of social and political analysis than the academic discipline it has since become. Even before World War I, with Newsholme in Brighton, Calmette in the working-class areas of northern France, and Goldberger and Sydenstricker in the mill towns of South Carolina, and then, following the war, with Stampar in the Balkans, Rubinow, Sigerist, and the Milbank Memorial Fund in the USA, and René Sand a little later in Belgium, a current was proclaiming itself the heir of Virchow (or rather of the threesome Frank–Virchow–Grotjahn) in Germany, social medicine’s homeland. Eighty years later, the advocates of social medicine, public health, and human rights still lay claim to this heritage.

For Warren and Sydenstricker in 1918, overcrowding and insalubrious housing were the major determinants of inequality in health. Illness and health had social and economic as well as biological causes; Sigerist vastly widened this range of health determinants by including cultural causes. Public health—a national health program—should set the goal of universal protection so as to ‘equalize social classes in relation to health,’ in the words of the Pasteurist Étienne Burnet in 1935. More recently, an interpretation prevailing in some quarters following Alma-Ata (1978) has sought to ‘put medicine where it belongs’ by seeing public health as reaching far beyond the treatment and prevention of illness. But how was this ‘social medical contract,’ which called for redefining democracy, to be implemented?

The idea of social medicine was not at all unanimously approved. Typical of differing opinions was the debate about what caused malaria: an illness bred by poverty and poor hygiene, as some claimed, or a clinical and entomological problem that, as others asserted, called for eradicating the vector rather than poverty? The very phrase ‘social medicine’ was cause for debate. Did it not risk making people believe public health should be based on the medical profession, whereas its only true allies were local authorities and the people?

The 1929 depression played a part in this shift toward a more sociological conception of public health. Studies of nutrition or of the links between unemployment and health aligned with a movement opposed to eugenics. Social medicine was but a clinical medicine aware of the social determinants of a patient’s state of health. This awareness would inevitably call for ‘socializing’ health systems and opening health care to everyone. But ‘social’ should not be confused with ‘socialist.’ The three terms ‘social–socialized–socialist’ tended to correspond to a political geography that might, on closer examination, turn out to be little more than a series of cultural gradients: Europe tended to be more socialist as one moved from areas using horsepower to areas using horses! The Russian ‘example’ must be set in this light. The Webbs, Winslow, and Sigerist, like delegates at the 1978 Alma-Ata Conference, were enthusiastic about Soviet medicine (or even about community medicine in Russia since Alexander II’s reign). Even though Moscow had proclaimed the doctrine of the social origins of disease, this guiding Eastern light could not outshine other sources of luminosity. In Socialist Berlin and Vienna—not to mention East Harlem—health centers and dispensaries were springing up that provided maternal and child care as well as birth control.

Once ‘War Communism’ came to an end, social medicine in the former USSR took a radical turn into state medicine. As conveyed to the USA under Franklin D. Roosevelt, it served, for a while, as a political alternative to the mercantile spirit of the American Medical Association. In the absence of a federal health plan, Oklahoma and Maryland, with the backing of the Farm Security Administration, farmer organizations, labor unions, and medical cooperatives, actively defended rural interests. In its radical versions, social medicine focused on zones with high rates of poverty, malnutrition, and child mortality—areas abandoned by hospitals, which, located in cities, were too far away and too expensive. Once again, in the late 1960s, Tanzania gave priority to rural areas as it built a decentralized system of health stations and dispensaries covering the whole country and appointed village health agents. In 1972, the proportion of funds appropriated there for rural as compared with urban health services was 20:80. Eight years later, it stood at 40:60. But WHO did not reorient its primary health care strategy to cope with massive rural flight toward urban areas until 1989.

3. Alma-Ata, The Peak Of Convergence

Following World War II, social medicine experienced a golden age. The coming to power of the socialist Cooperative Commonwealth Federation in Saskatchewan, Canada, in 1944, and the emergence of the New Democratic Party at the national level in 1961, opened the way for setting up, between 1962 and 1971, national health insurance based on values of solidarity and social justice. Even in the United States, a hierarchical organization of health centers existed in New York City. In contrast, social medicine was losing steam in Europe even as it was attaining official recognition by international organizations.

Social medicine—‘promotive, preventive, curative, and rehabilitative,’ to borrow Sigerist’s words—was crowned in Alma-Ata at the joint WHO–UNICEF conference on September 12, 1978. Potentially subversive ideas had been carefully defused by replacing certain phrases (e.g., ‘health centers’ with ‘primary health care’ or ‘barefoot doctors’ with ‘community health workers’) in the ‘Health for all by the year 2000’ declaration already approved by WHO’s 30th General Assembly in May 1977. This 1978 Alma-Ata Conference represented a peak for the theory of convergence, itself the outcome of thawed relations between the two blocs, as the former Soviet Union sought to cooperate with the West in science and technology while still exporting its model of development to the Third World. Shortly after the conference, WHO’s regional committee for Europe would refer to this theory as ‘Equity in care.’ Borne by the expanding social sciences and by development theory (after all, ‘Health for all by the year 2000’ was intended to be an indispensable part of development), this theory stipulated that differing social systems would ultimately converge into a single ‘human society’ capable of satisfying all basic needs (food, housing, health care) after eliminating poverty. Current in the 1960s and 1970s, this ideology might be seen as a new form of the functionalism that had thrived 30 years earlier at the League of Nations. Food, housing, social insurance, paid holidays, cooperatives, leisure activities, education, and especially public health: all of this would shrink the field of politics as welfare expanded. Would national administrations not be integrated into a single international system that would free Geneva from the domination of sovereign states?

The Alma-Ata conference defined the target population for primary health care in terms of high morbidity and mortality rates and a socioeconomic status carrying risks for health. Health education, a safe water and food supply, maternal and child care (including birth control), and vaccination (against tuberculosis, diphtheria, neonatal tetanus, polio, and measles) were declared priorities, along with universal access to health services and the necessity of coordinating health with education and housing. Primary health care was to be based on the most recent scientific developments, which were to be made universally available—what Aneurin Bevan in 1946 had called ‘universalizing the best.’ Adopted by more than 100 states in 1981, the 1978 Alma-Ata Declaration sought to reflect criticisms both of the Western medical model and of the population explosion in poor lands. In particular, it systematically demolished the ‘vertical’ programs that WHO had, until then, undertaken with quite inconsistent results (success against smallpox but a relative setback in combating malaria). Based on participatory democracy, ‘peripheral’ services were to lie at the core of the new strategy, but were also to be integrated into ‘districts’ around a first referral-level hospital (140 beds; internal medicine, gynecology, obstetrics, a laboratory and blood bank, and the capacity for performing minor surgery and providing outpatient care). The aim was not to juxtapose primary care with a still centralized system but to combine the advantages of hospitals and of health centers so as to facilitate both prevention work and access to medical and social services for a poor, often scattered, population.

To its credit, primary health care has made major, though inconsistent, progress. In Colombia, the 1,500 health centers set up as of 1976 were covering 82 percent of the population by the early 1980s. Between 1978 and 1982, vaccination coverage rose from 23 to 41 percent for polio, from 36 to 71 percent for tuberculosis, from 21 to 50 percent for measles, and from 22 to 37 percent for diphtheria, tetanus, and pertussis. If, by 1989, 90 percent of babies less than one year old had been vaccinated against polio in Cuba, Botswana, Brazil, Nicaragua, China, Tunisia, Egypt, Korea, and Saudi Arabia, in India this rate did not reach 20 percent among the poorest two-thirds of the population, where infant mortality is highest.

True, the methods used were sometimes counterproductive. As we know, the Expanded Program on Immunization (launched in 1974 and broadened in 1987 to include polio) had better results in places with a solid health infrastructure. In continuity with the Rockefeller Foundation’s campaigns against yellow fever in Latin America, those against ‘sleeping sickness’ in the Belgian Congo or French Equatorial Africa, and the campaigns against smallpox organized by WHO in association with Indian and Bengali authorities in 1962 and 1967, international organizations still preferred mass vaccination programs conducted like military campaigns. Coercion was used all the more willingly insofar as vaccinators blamed any opposition on the population’s superstition.

The installation of ‘community health agents’ elected by villagers would, it was believed, overcome local opposition. This lightweight health auxiliary, a sort of ‘barefoot doctor,’ would bridge the gap between the health system and rural dwellers. These experiments were disappointing: the community health agent often turned out to be a ‘transitional last resort’ who fell victim to the envy aroused by power or fell into the ‘voluntary help’ trap. True, public health provided certain governments (in Cuba, Sandinista Nicaragua, or Nyerere’s Tanzania) with means for mobilizing the population and nation building. But five years after the signing of the Alma-Ata agreement, only experimental or small-scale projects were in the works, mainly as a function of circumstance, where a local leader showed concern for his constituents’ welfare.

Under the pressure of users, prevention work in health centers often deviated toward more curative purposes. Like state medicine in the Soviet Union, social medicine in the Third World represented a limited injection of ‘collective vaccine’ in an ‘individualistic’ village setting. Except for Scandinavia, where national identities are strongly imbued with concern for health and with eugenic ideas, Europe, too, has tended to resist this type of medicine.

Primary health care was originally intended to better distribute public expenditures by avoiding congestion in big urban hospitals—an objective that the Harare Conference, organized by the WHO, reaffirmed in 1987 but that has been a resounding failure. At the Bamako meeting organized by WHO and UNICEF in 1987, African ministers of Health had to admit this, but then went on to ratify a proposal for furthering decentralization and making health care users pay part of the costs. Applied in 33 countries by 1994 (28 of them in sub-Saharan Africa), the Bamako Initiative looks like an antidote to Alma-Ata.

4. Transnational Networks And The Territory’s Revanche

Symbolized by the publication of the 1993 World Bank report, Investing in Health, the waning of WHO—which as recently as 1973 had been said to be the ‘world’s health conscience’—signals a turning point. In Africa, from 60 to 70 percent of health budgets come from outside sources: non-governmental organizations (NGOs), the World Bank, or the International Monetary Fund. The World Bank’s centralized approach allows little room for a local or national diagnosis. Zimbabwe, where expenditures on prevention work had increased significantly from 1982 to 1990, was, in 1991, summoned to reduce its public health investments by 30 percent in line with the Bank’s recommendations for shifting funds toward ‘selective’ and more productive primary health care activities and toward prevention work for reducing maternal and infant mortality (such as growth monitoring, rehydration, breast-feeding, and immunization). The Bank’s recommendation to start making users pay may well reduce the utilization of health services and thus worsen the prevalence of endemic diseases such as malaria or AIDS. Another drawback: the need to obtain fast results in order to please donors keeps recipient governments from acquiring the capacity for long-term management.

The World Bank’s growing role in financing the health field can but amplify its intellectual role and lead to neglecting WHO’s abilities and experience. Unlike Geneva, whose relations with national administrations are still limited to (weak) ministries of Health, the Bank has direct access to the centers of power (ministries of National Planning or Finance). It may even, as happened in Bangladesh, take the place of the state in heading health policy. As happened with the interventions by the Rockefeller Foundation or the Milbank Memorial Fund in France during the 1920s, or in several central European and Balkan lands, the organization of public health in low-income countries comes to be based on an imported state. Since the 1980s, however, the big foundations have given up their place in the field to NGOs specializing in humanitarian aid. The number of NGOs rose from 1,600 in 1980 to 2,970 in 1993. A trend on this scale is a sign of the times: the proportion of funds reserved for emergency action in war zones has increased steadily since 1990. In Asia, Africa, or the Balkans, public health is, once again, seen against the backdrop of warfare.

Welfare and warfare were, indeed, intimately connected between the two world wars. Interventionism thrived with World War I and the ‘Spanish flu’ epidemic. As a consequence of growing government interventionism between 1914 and 1918, the first health ministries sprang up in Europe, New Zealand, and South Africa. Following the Great War, there was a general awareness of the role of public health in reconstruction and of the dangers that mass epidemics (such as cholera or typhus) posed for political stability. World War I stimulated reformers like Dawson, whose plan can rightfully be considered a forerunner of the British National Health Service (1948). But by doing away with taboos surrounding the respect for life, the war also opened wide the doors for the excesses committed by a racial form of hygienics.

Throughout the interwar period, a tragic dialectic prevailed in the international health order (at least in Europe): a constant swinging back and forth between the triumph of networks of transnational experts and the revanche of the territory, i.e., the nation or ‘race.’ Hitler’s regime did not hesitate to use public health for a policy of genocide. Its fight against typhus in Poland led to ‘the final consolidation of the notion of a quarantine’ in the Warsaw ghetto (Browning 1992). This was proof of the potential lethality of the health sciences.

As the twenty-first century begins, war is, once again, menacing health. It has even been said that ‘today’s armed conflicts are essentially wars on public health’ (Perrin 1999). Civilian populations are the first war casualties. Since the wars in Abyssinia and Spain during the 1930s (well before Rwanda and the first Chechnya war), 80 percent of the total 20 million persons killed and 60 million wounded in various conflicts have been civilians. Three out of five have been children. The most intense war zone in Africa lies at the heart of the continent, which is also the epicenter of new diseases (Ebola) and epidemics (HIV/AIDS). But civilians fall casualty not just to combat; they also suffer war’s toll on health, as it sets off mass migrations and disrupts food supplies. Between 1982 and 1986, fighting destroyed more than 40 percent of Mozambique’s health centers. By interrupting routine activities related, in particular, to maternal and child care, war increases mortality and the number of premature births. It also reduces vaccination coverage, not to mention its long-run effects, such as outbreaks of malaria as people seek refuge in infested areas. According to a 1996 UNICEF estimate, war-related shortages of food and medical services in Africa, added onto the stress of flight from battle zones, accounted for 20 times more deaths than actual fighting. On the one hand, the renewed threat of infectious diseases (in particular, the interaction between HIV and tuberculosis in poor countries); on the other, the consequences of the epidemiological transition and the aging of populations in wealthy lands: this is the ‘double burden’ that, according to the WHO, public health will have to carry in the twenty-first century.

Prevention work now represents a very small percentage of health expenditures: 2.9 percent in the United States, a figure similar to that in most other industrialized lands. This percentage, however, covers only prevention activities such as screening for disease or vaccinating; it is not to be confused with ‘prevention education,’ such as campaigns against smoking. ‘Health for all by the year 2000’ is still a far cry from reality, even in wealthy nations.

The somewhat vague phrase ‘health promotion,’ officially used in the British Health Ministry’s 1944 report, refers, all at once, to health education, to taxation as a health policy instrument, and to the prohibition of advertising for alcoholic beverages or tobacco. The phrase was revived in the 1970s by the Canadian Health Ministry to support the thesis that a population’s state of health depends less on medical progress than on its standards of living and nutrition. The chapter devoted to the fight against nicotine addiction in the ‘Health for all’ program has been followed up on, since it does not cost the government anything and can even, as in Australia, help pay for ‘health promotion’ through cigarette taxes. To its advantage, it also garners broad support for public health. The keynote of the 1986 Ottawa Charter, the phrase ‘health promotion’ indicates how closely prevention work in wealthy countries hinges on the system of medical care. The state itself participates in this trend of making individuals responsible for their health. Witness the campaigns focused on tobacco, alcohol, or certain foodstuffs with the objective of changing ‘lifestyles’—a far cry from the ‘old’ public health, which depended on the dispensary and other facilities as its basic means of education.

In the wake of discussions in the sociology of health during the late 1970s about women’s roles, awareness dawned that more persons are involved in health care in the often neglected ‘informal’ sector than in public or private health services. Everywhere around the world, family, kin, and neighbors bear the largest burden of care for the sick. Evidence of this awareness is, on the right, the defense of the family’s role and, on the left, the activism of patients’ associations (especially homosexual associations in the fight against AIDS). Owing to its ability to mobilize various sorts of actors who may agree on nothing other than health issues, ‘civil’ society is thus partly taking the state’s place in both wealthy and poor countries. As this emergence of local actors once again proves, applying decisions in prevention work is not at all a purely administrative matter.

Have governments been put to rout? Public health nonetheless raises problems directly related to the state’s powers. Grounding its arguments in new bacteriological concepts, New York City’s Health Service had no qualms at the start of the twentieth century about locking up a 37-year-old Irish cook (‘Typhoid Mary’ Mallon) for having infected 32 persons. Given the tragic affairs of blood transfusions in France and ‘mad cow disease’ in the UK, public health now seems to have become a fully fledged national security issue. The most recent evidence of this is the ‘principle of precaution,’ which tempers the ideology of national sovereignty with expertise and entails political decisions for strengthening public health. State security increasingly means taking into account not only political and military threats but also public health and environmental risks. As the concept of security expands into a ‘shared human security,’ monitoring transmissible diseases is becoming a major concern for public authorities.

On the threshold of the twenty-first century, a global public health strategy would entail relating the fight against transmissible diseases and environmentally related risks to the management of emergencies. But was this not also the issue 100 years earlier? The twentieth century started with the French–English dispute about cholera, which centered on the quarantine issue. The conflict between national independence and public health persisted through controversial affairs all the way down the century. Seen from Europe, the century seems to be ending with a clash about ‘mad cow disease’ between Great Britain, on the one hand, and Germany and France, on the other. By reviving the old Churchill myth of ‘Britain alone against Europe,’ the beef war (1996–2000) fully illustrates that public health, far from being an ordinary administrative matter, can turn into a fully fledged political controversy and a field for government action.

Bibliography:

  1. Browning C R 1992 Genocide and public health: German doctors and Polish Jews, 1939–1941. In: The Path to Genocide: Essays on Launching the Final Solution. Cambridge University Press, Cambridge, UK
  2. Burnet E 1935 Médecine expérimentale et médecine sociale. Revue d’Hygiène 57: 321–42
  3. Fee E, Brown T (eds.) 1997 Making Medical History: The Life and Times of Henry E. Sigerist. The Johns Hopkins University Press, Baltimore, MD
  4. Gorham F P 1921 The history of bacteriology and its contribution to public health work. In: Ravenel M P (ed.) A Half-Century of Public Health. American Public Health Association, New York
  5. Perrin P 1999 The risks of military participation. In: Leaning J, Briggs S M, Chen L C (eds.) Humanitarian Crises: The Medical and Public Health Response. Harvard University Press, Cambridge, MA
  6. Viseltear A J 1982 Compulsory health insurance and the definition of public health. In: Numbers R L (ed.) Compulsory Health Insurance: The Continuing American Debate. Greenwood Press, Westport, CT
  7. Warren B S, Sydenstricker E 1918 The relation of wages to the public health. American Journal of Public Health 8: 883–7 [reprinted 1999 in American Journal of Public Health 89: 1641–3]
  8. Webster C (ed.) 1995 Caring for Health: History and Diversity. The Open University Press, Buckingham, UK
  9. Winslow C E A 1923 The Evolution and Significance of the Modern Public Health Campaign. Yale University Press, New Haven, CT