Medical Schools Research Paper


Introduction

Medical schools are the principal institutions created and supported by world society to train physicians in the science and art of ministering to the health needs of individuals and communities. Although modern medical schools (medical faculties) are centers for many other health-related social and investigative functions, they are primarily charged to produce well-trained ‘compassionate healers,’ certified to have completed basic academic courses and practical work in medical care and cure. Other institutions also train professionals to improve human well-being, but medical schools – in a form virtually unchanged for almost a century – remain the world’s principal academic institutions designed to supply practitioners to care for human malaise, disease, and illness.

For many, the training programs and goals of medical schools appear relatively noncontroversial. Society’s mandate in improving medical care worldwide is to support and fund teaching institutions which emulate the so-called ‘Western’ or ‘Flexnerian’ medical education model. This academic model – scientifically grounded, evidence-based, university-affiliated, individual- and cure-oriented, and infinitely reproducible – is designed to foster the recruitment, training, and certification of practitioners to best care for the physical and mental frailties of mankind. Today’s medical school training, grounded in science-based subjects believed to be essential to the rigorous medical curriculum, is essentially the same throughout the world, with academic and scheduling differences based on culture, governmental input, disease patterns, and the social and medical needs of each individual geographic area. In virtually all countries, academically prepared young men and women are selected, given a basic background in the scientific arts, exposed to advanced didactic and clinical skills in specific medical disciplines by competent practitioners, and – based on their successful completion of training – are licensed by society to use learned skills to care for and cure individuals. Once licensed, these new physicians are free to pursue advanced training, move into related fields, or follow other – usually health-oriented – paths. In most countries, completion of medical school and the awarding of the graduate degree brings with it significant social and economic rewards.

Closer examination of the role of medical schools throughout the world – and ultimately of medical practice itself – reveals, however, an infinitely complex educational scenario strongly linked to wider societal issues. The academic institutions, curricula, and certification processes grafted onto the scientific discoveries of the last 150 years have served the medical needs of the world beyond expectation; everywhere, science-based medicine has brought substantial gains in both individual and community medical care and cure. As a result, the schools, the medical profession, and world society itself have accepted, virtually without question, the assumed truth of the so-called ‘Great Equation’: medicine equals health. According to this view, improved human health, both individual and community, is based on providing the best of scientific, curative medicine to the individual. Medical care – supplied by well-trained physicians – will provide well-being to all.

One of the most difficult ideas for our society to appreciate is the concept that medical care contributes a relatively small component to overall human health. Although our society has come to regard ‘the Great Equation’ as the ultimate truth, health is a much larger concept. Human ‘dis-ease’ is dependent on many factors beyond the scope of medicine. In the 1978 Alma Ata Declaration, the World Health Organization reaffirmed the definition of health first set out in its 1946 Constitution: ‘‘a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.’’ This view – for some, much too all-encompassing – clearly separates the role of medicine as an essential contributor to health from a larger view of human well-being. Few would question that well-trained physicians are essential for medical care and, indeed, for many aspects of human health, but are medical practitioners the only and best providers of health and well-being? In today’s world are they sufficiently well trained in the many skills necessary to take care of the myriad problems that affect the physical and mental life of individuals and society? Nevertheless, despite the problems surrounding the definition of health and the best way to provide ‘health for all,’ our society now sustains and supports a network of medically oriented, science-based educational institutions which continue to be our primary training centers for improving health.

The form of these institutions is largely the product of centuries-old societal forces that have brought substantial changes to all of our political, economic, and cultural institutions. The Western model of medical education and practice is founded on the belief that reason, science, and rational experimentation, not superstition and unproven metaphysical concepts, ultimately govern the quality of human health. The increasing secularization of the human world finds ultimate expression in the health arena, where man and his institutions are presumed to have the capacity to modify all diseases which affect mankind, cure all manner of illness and disability, and provide expertise, medications, and counseling to improve human existence.

As society comes to examine more closely the definitions of health, medicine, cultural well-being, and the institutions it creates to support its views, it is beneficial to look back to determine how the age-old medical teaching process evolved and evaluate where medical schools might be going in the future. The schools are reflections of the needs and goals of a greater world society: the development of medical teaching occurs within the larger framework of humanity’s overall needs. As the needs of society change, so too does the nature of its health practitioners and the institutions which train them. Over the centuries the types of practitioners have reflected both what society demands and the knowledge available to support and expand the health of individuals within that society.

This research paper deals primarily with issues surrounding the history, evolution, and future of Western medicine. Although not denigrating the more traditional medical education models which are a part of every society, this research paper accepts that the world community has generally endorsed and made resources available to sustain the Western medical teaching model. Although the tenets and practices of many of the classical, more ‘pastoral’ schools and their theories of health and illness may ultimately have greater impact on the overall health of mankind than does the current model, the history of today’s medical schools as societal institutions is – rightly or wrongly – primarily the story of the rise of Western medicine and the teaching philosophies and institutions that support it. This research paper discusses briefly the pre-1700s teaching institutions. Although the roots of medical training go back for centuries, both the form and content of our present teaching institutions are of relatively recent origin; their current and future structure draws heavily on older philosophies, but the story of today’s medical school rests primarily in the more recent past.

Early Medical Education (Pre-1500)

It is difficult to determine the specific nature or academic content of early medical training, primarily because of the lack of verifiable records. That medical ‘schools’ have always existed is without question. All societies require someone to minister to human health needs, and in the records of every civilization there are references to ‘schools’ and the teaching of future practitioners by already established healers. The manner in which students were selected, trained, and certified is not well defined, but virtually all early medical schools operated under some type of apprentice system. This apprenticeship system was not unique to medicine; prior to the 1800s, all ‘trades’, including the church, operated under a similar system. In medicine’s case, a respected healer or master craftsman whose reputation for medical powers was usually known by word of mouth, whether in Greece or Rome, India, China, or the Arab world, gathered promising disciples (apprentices), taught them what he (or occasionally she) knew of health, disease, and the interaction with patients, introduced the students to clinical problems among individual patients, and, on completion of their training, graduated them to minister to the needs of society. Because the causes of illness were virtually unknown and the sources of human maladies undefined, medical training had strong early links to the culture’s philosophical view of life, the relationships between nature and the supernatural and metaphysical worlds, and how external reality affected the health of each individual.

There were three major themes characteristic of virtually all early medical teaching. First, the earliest societies needed trained practitioners who cared for and cured human ills. Because of the reliance of these societies on philosophies which saw humankind as subordinate to some higher power, effective practitioners were often shamans or priests or those who acted as intermediaries between humans and the power of larger forces. A successful interpreter of these relationships transmitted this knowledge and his understanding of natural remedies to appropriate students. Second, in the absence of any real science-based knowledge about the workings of the human body, the origins of disease, or of effective treatments, early healers generally sought to achieve and teach the need for inducing some type of mystical ‘balance’ either within the individual or between the individual and greater, supernatural forces. The most perfect health existed when Hippocrates’ or Galen’s four humors (black bile, yellow bile, phlegm, and blood) were ‘duly proportioned’ within the human body. Humors were related to the seasons, the environment, the elements, and parts of the body. Ayurvedic (knowledge of life) healing in India saw human health in terms of a balance between hot and cold and among wind, bile, and phlegm; it depended on magical, quasi-religious explanations for the sources and amelioration of illness. Third, the lack of any verifiable explanations for disease and illness allowed a successful practitioner to teach disciples whatever he believed would cure human ills. Until late in the medieval period, religious laws severely restricted the dissection of cadavers so that even the most basic understanding of how the body worked was based on mystical supposition. 
To be sure, many early practitioners could recognize symptoms and signs of specific diseases, but they possessed no verifiable, physiologically based explanation of causation, nor any proven method by which to treat the majority of maladies. For example, the symptoms of syphilis were well described in earlier medieval observations but were attributed to such phenomena as the ill effects of wind, the misalignment of the planets, or the production of faulty humors within the human body. There was virtually none of today’s scientific rationale behind instruction in care and cure; most philosophies of disease, illness, and infirmities invoked a pre-scientific approach in which suitable internal or external balance had been lost, often because the individual had somehow annoyed or angered the gods or God.

In the absence of any effective treatments, religious leaders often provided mystical answers to man’s illnesses. Certain saints were protectors, amulets safeguarded against illness, and specific stars had power over the functioning of certain organs. As the Christian church became a more powerful force in Western civilization, this moral view of mankind became more pronounced; disease was the bodily manifestation of human sin, man’s transgressions, or the visitation of God’s punishment on the morally unclean. One has only to read the accounts of the Black Death in the 1300s (‘‘God’s judgment on a sinful humanity’’) to appreciate the lack of verifiable theories of causation, the unproven methods of cure, and the invocation of powerful moral forces that controlled not only medical practice but all thought and action in the Western world.

The philosophies of early healers and leaders of specific schools are well documented. Hippocrates (460–377 BC) systematized the medical outlook for the known world, described diseases and symptoms, and taught a theory of illness and disease based on an imbalance among the four humors. He laid out a daily regimen that relied on the curative powers of nature, cleanliness, a simple diet, and exercise. The medical perceptions of Galen of Pergamum (129–200 AD) set the pattern for the teaching and practice of Western medicine for over 1200 years. His views, especially on anatomy, influenced all instruction in medicine in the West until the late Middle Ages. Traditional Chinese medical teaching had much in common with the teachings of the Greek, Roman, Arab, and Hebrew physicians. The Chinese schools saw the importance of the ‘rhythm’ of life, the need to control and modify qi – human ‘life force’ or ‘vital energy’ – by manipulating the balance between yin and yang. Suitable balance, the curative powers of the natural world, and the workings of greater forces were essential in order to reduce impurities, eliminate symptoms, and increase ‘harmony.’ And although these tenets were perhaps excellent for improving human health, they were less relevant in controlling human illness.

With the flowering of Islamic civilization from the ninth century AD, both the Arab world and the West were strongly influenced by teachers such as al-Razi (865–925) and Avicenna (980–1037). For these teachers, spiritual integrity and physical health were strongly interrelated. Avicenna authored a number of books on medicine, including the Book of Healing and the five-book Canon of Medicine, which, along with the works of Galen, became the basis for medical teaching until well into the 1500s.

By the eleventh century – at the beginning of the so-called ‘High Middle Ages’ – there occurred a marked revitalization in Western civilization accompanied by an improvement in the living standards in Europe; certain, more formal Western schools were created and began to exert a powerful influence on medical education. These medical schools were closer in form to the institutions we know today. The Schola Medica Salernitana, based at Salerno in southern Italy, taught a fusion of Arabic, Greek, Roman, and Hebrew medical theories and practices; medical works from around the world were translated, and students flocked to Salerno for instruction. The famous compendium of medical knowledge – Regimen Sanitatis Salernitanum – became part of teaching programs throughout the Western world.

The works of Vesalius (1514–1564), a Belgian physician schooled at the universities of Paris and Padua, provided the first ‘modern’ approach to an understanding of the human body and its pathology. His anatomical studies, based on actual dissection of human cadavers (and perhaps influenced by Leonardo da Vinci’s anatomical drawings of around 1500), transformed the long-held views of Galen and the classical medical thinkers. Vesalius’ De humani corporis fabrica libri septem (1543) discarded most of Galen’s inaccuracies about the human body (including Adam’s missing rib) which had influenced Western medicine for centuries. His studies and subsequent teachings were the first step in challenging the mystical and unfounded medical beliefs of an earlier age; his observations created the early basis for an understanding of anatomy, the placement and possible functioning of human organs, and the foundations on which effective cures could be discovered, tested, and prescribed. By the end of the seventeenth century the stage was set for the transformation of medicine and medical teaching. Western man was absorbing and transmitting new ideas; individuals and societal institutions had begun to learn new ways of looking at the human body.

In reviewing the history of early medical education, it is difficult to accept that schools could exist based on teachings about supernatural forces, the malignancy of the gods, unproven remedies, suitable ‘balance,’ and without any experimental physiological or therapeutic basis. But a lack of understanding about specific causes or therapies does not automatically obviate an educational system; one teaches based on the knowledge available. Today, for instance, we have putative ‘causes’ for cancer; we teach and recommend a variety of therapies to ameliorate or cure malignancies, yet we have little, or conflicting, insight into the causes or remedies for the condition. Until the end of the Middle Ages, virtually all medicine was taught and practiced without any understanding of causation, progression, or therapy, and even a basic understanding of anatomy, chemistry, or biology was lacking.

Medical Education In Transition (1500–1800)

Toward the end of the Middle Ages in Europe (1500), medicine and medical teaching underwent a pronounced cultural and scientific transition. This transition, unlike the medical revolution of the 1800s, was not fueled by specific scientific advances, the introduction of new therapies, or by any dramatic revelations about the workings of the human body. It was part of a larger overall societal upheaval – The Renaissance – which saw an extraordinary flowering of all aspects of Western civilization. New initiatives in art, literature, warfare, trade, the economy, and virtually every human endeavor reflected a civilization-wide process which had its roots in altered perceptions of how individuals saw themselves, their institutions, and their relationship to nature, each other, and to God. In medical teaching, society began to embrace new theories about how the human body functioned, what disease processes could be observed, and which problems of health and illness could be approached rationally. New or rejuvenated universities – buoyed by the invention of the printing press – took on central roles as centers for academic inquiry where man and his works, though still subordinate to higher powers, were nevertheless considered to be appropriate subjects for discourse and investigation. Medical schools appeared at Montpellier, Bologna, Siena, and elsewhere. Formalized courses in anatomy, therapeutics, and hygiene became more common, and accurate anatomical drawings – made possible as the Church’s proscription of human dissection eased – became an important part of teaching. Rembrandt van Rijn’s painting – ‘‘The Anatomy Lesson of Dr Nicolaes Tulp’’ – commissioned by the Amsterdam Guild of Surgeons in 1632, depicts observers immersed in the new learning process.
The painting reveals the best of Renaissance philosophy and action; it is realistic, human-centered, and reflects the pragmatic transmission of new medical knowledge free from metaphysical abstraction and superstition.

The shift to a new way of approaching man and his potential did not occur rapidly. Although Renaissance theories and approaches touched many aspects of medical teaching, the majority of physicians were still schooled in the older apprentice system grounded in classical theories and practice. The understanding of human physiology, pathology, and most medical conditions was minuscule. Even within the best teaching institutions, unproven theories and the use of arcane remedies and other valueless therapeutic measures, such as bleeding, purging, and blistering, continued to be taught by scholars and used by the majority.

By the mid-1650s – as a subset of the societal forces unleashed by the Renaissance and its sister movement, the Reformation – medical education underwent a more radical, accelerated change. In part, this change accompanied new discoveries in astronomy, physics, and biology by Western investigators – Newton, Galileo, Copernicus, Tycho Brahe, and others. The influence of the church waned and there was a new freedom – especially expressed in medical teaching – from the long-held ‘moral view’ of the universe. Again, this acceleration in the form and content of medical education did not occur in isolation; new approaches touched all parts of Western civilization, and the later philosophies of the Age of Enlightenment reinforced the Renaissance conviction that investigative, human-initiated action could influence many aspects of world civilization. Systematic thinking – human reason reinforced by experimentation – determined the nature of humanity’s political institutions and the world’s ethical framework and could remold not only medical teaching but all aspects of the relationships between the individual, society, and the state.

In medical teaching, the study of human anatomy acquired major prominence. The capacity to accurately observe the structures of the human body through actual dissection was reinforced by Harvey’s investigations of the circulation of the blood in 1628 and van Leeuwenhoek’s observations with the microscope. Medical study moved beyond classical passive acceptance and observation of man’s body to the use of experimental and investigative techniques to determine the nature of both normal and pathological human function. The old medieval and classical order based on unproven hypotheses, mystical half-truths, an apprenticeship system, and often damaging therapies was under increasing attack. New knowledge demanded more time for instruction, more space to perform experimental and investigative work, and new liaisons among universities, clinical teaching and the rejuvenated hospitals. Other societal factors – increased wealth, population movements, and demands for improvements in military medicine – fostered the need for well-trained practitioners.

By the mid-1700s, despite the fact that there was as yet no firm therapeutic base for care and cure, formal medical teaching was incorporated into the structure of universities. Edinburgh and the German and French universities became centers of instruction for pupils throughout the Western world. Professional medical organizations took the lead in setting definable academic standards for the training of physicians. In 1760, the Company of Surgeons in England required a 4-year apprenticeship and 1 year of clinical study before certifying a practitioner. John Morgan, who created the first medical school in North America – The College of Philadelphia (now the University of Pennsylvania) Medical School – accepted that universities should offer formal courses in anatomy and physiology and, like many of his New World colleagues, sought instruction in Europe. In part, this incorporation of university-based instruction into the training of all physicians was a major step by which practitioners sought legitimacy for their profession; today’s credentialing and licensing has its roots in early medical participation in the offerings of the academic world. All human institutions, but especially those involved in teaching, underwent a major restructuring; new laws began to distinguish among physicians (a profession), surgeons (a craft), and apothecaries (a trade). Medical teaching was starting to define its place within the accepted institutions of civilization.

By 1783, Philippe Pinel, the father of modern psychiatry, introduced a course of medical instruction in Paris based on specific preclinical and clinical courses, the creation of teaching wards, the careful selection of patients for instructional purposes, and the assignment of professors elected by competition rather than through academic ‘intrigue’. At its base was Pinel’s idea that effective teaching was dependent on improving the student’s judgment and relying on an appreciation of perceivable evidence rather than simple rote memorization. The first year of Pinel’s teaching program offered courses in anatomy, physiology, and hygiene; the second year, attending rounds and the assignment of specific patients to pupils; and the third year, clinical lessons with the student charged with reporting on each patient’s illness. The institutional framework for modern medical teaching was beginning to fall into place.

The eighteenth century in Western Europe saw the coming together of two major forces which initiated substantial change in medicine itself and within the world of medical teaching. The new view of man and his potential, and civilization’s ability to change old and create new political institutions to support human aspirations, was complemented by the beginnings of a ‘scientific revolution’ through which man and his investigative spirit could observe, test, understand, and modify the health of the individual and the functioning of society. Western Europe began the process of changing its medical instruction process, and institutions designed to foster new ideas were created and sustained.

Outside of Western Europe and North America, these pioneering influences were more muted; individuals and governments by and large continued to function in age-old ways: traditional remedies, humanity’s relationship to authority figures and supernatural forces, and the lack of mobility restricted any significant questioning of the older order. Non-Western medicine continued in its role of ministering to the health of the people, but it was not a large part of the revolutionary societal changes taking place in the West.

Abraham Flexner And The Rise Of The Western Medical School

The nineteenth century saw massive changes in both the form and content of medical education, culminating in the ‘‘Flexner Report’’ of 1910. Although it is popular to attribute to Abraham Flexner the creation and consolidation of the medical teaching process into the medical school of today, he was no more than the messenger; changes in the education system had been percolating within the medical profession in Europe, North America, Australia, and centers in India, Japan, and elsewhere for years. Nevertheless, the publication of the Carnegie Foundation’s Bulletin Number 4, capitalizing on the scientific advances of the nineteenth century, was a major step in codifying the structure, curriculum, and licensing mechanisms of the educational system that serves the medical profession and society today.

Acceptance of the recommendations of the Flexner Report was not inevitable. In the early 1800s, and despite significant scientific and institutional advances, the profession of medicine and its system of instruction and licensing was weak, troubled, and generally ineffective. Saddled with what, at best, could be described as ‘therapeutic confusion,’ physicians, no matter how well trained, were neither effective arbiters of man’s temporal health nor successful intermediaries between humans and the older, virtually discredited metaphysical theories. The profession of medicine was a marginal one; physicians, usually randomly educated, had few tools to comfort and cure mankind and could no longer make authoritative judgments about the external forces which influenced the relationships between man, God, and the perceived world. The French Revolution, the beginnings of the Industrial Age, the rise of the nation-state, and new social theories culminating in the works of Charles Darwin had reinforced the changing nature of world civilization. Institutional advances – especially in Europe – were substantial, and the universities had pioneered academic changes in many disciplines, including the medical education curriculum. By 1858 England had created a single register for all medical practitioners and a professional council to coordinate medical education. Europe was moving toward an academic system in which a medical license was granted only to graduates of approved programs and only those who were licensed could practice medicine. The number of hospitals had increased, encouraging new teaching opportunities, and advances such as the stethoscope had helped delineate the signs and symptoms of specific diseases. The early 1800s saw a plethora of well-described disease entities – Pott’s disease (1779), Laennec’s cirrhosis (1819), Bright’s disease (1827) – but medical instruction in general continued on its descriptive and instructional pathway.

Despite significant academic and scientific advances, however, the major purpose of medicine – the amelioration or cure of human ills – remained locked in the theories and practice of the past. For the most part, rational therapeutic measures did not exist and, except for Jenner’s 1796 process of vaccination, there were few improvements in the physician’s capacity to heal. World explorations, population movements, and new and more destructive methods of warfare had brought increased threats to health; the life-ending effects of malaria, tuberculosis, leprosy, malnutrition, smallpox, and battlefield wounds were increasingly apparent. Industrialization, new work pressures, and a growing and mobile population had compounded the medical and public health problems of childbirth, nutrition, and the provision of safe food and water. Yet even the best-trained physicians – although they could observe and describe illness well – had few ways to improve human well-being. Theories and ‘schools’ (homeopathy, osteopathy, etc.) of cure abounded, but the prevailing therapeutic skepticism surrounding clinical medicine led some physicians to take a new interest in social medicine, public health, and the application of epidemiology to health problems. In a number of countries, government standards for the provision of good water, adequate housing, and improved conditions in the workplace were instituted. Yet for many – doctors and patients alike – medicine was a cipher; Edward Bates, attorney general of the United States under Abraham Lincoln, commented on the prevailing medical ignorance and noted that ‘‘no two of them [physicians] agree with each other and no one agrees with himself two weeks at a time’’ (Goodwin, 2005: 67). The view that ‘‘a random patient with a random disease consulting a doctor chosen at random stood only a 50:50 chance of benefiting from the encounter’’ accurately encompassed the public’s pre-twentieth-century view of medicine (Gregg, 1956).

It is impossible to divorce today’s medical education process from the scientific revolution of the 1800s. To be sure, larger cultural forces had created both the institutions and social climate in which medical schools (and other institutions) might play a major role in improving societal conditions. Nevertheless, medicine lacked the theories and practical methods to actually improve health. The medical discoveries of the nineteenth century altered the whole framework of healing and new advances were made by Chadwick (public health, 1834, 1842, 1848), Morton and Simpson (surgical anesthesia, 1847), Semmelweis (puerperal fever, 1847), Snow (transmission of cholera, 1853), Pasteur (germ theory of disease, 1857), Lister (sterile surgery, 1867), Koch (infectious diseases, 1876), Laveran (malaria, 1880), Roentgen (X-ray, 1895), Ehrlich (chemotherapy, 1897), and Landsteiner (blood groups, 1900). These and others transformed the philosophy and practice of medicine and medical education; they laid the cornerstone of today’s teaching and cure and created the scientific basis by which the well-trained physician could actually benefit the individual and society. Anesthesia and sterilization gave practitioners, for the first time, ‘‘the ability to enter the body safely’’; surgical practice expanded exponentially. New diagnostic methods complemented the therapeutic advances, and, in effect, nineteenth century physicians and scientists created, modified, and expanded the ‘‘scientific foundations of today’s clinical medicine’’ (Flexner, 1910).

The new discoveries were complemented in Europe by governmental measures to institute 'sickness' or social insurance. The academic and administrative changes defined by Flexner – specific courses required for entrance, a standardized curriculum, and institutional and governmental certification of proficiency – created the form and institutions of modern medical instruction and practice. Competency could be compared against known guidelines, many instructors were full time, examination results were verifiable and – most importantly – a physician's credentials could be reviewed and evaluated by the public. The German, French, and a few U.S. universities had already pioneered modern methods of teaching – especially in the clinical sciences – and there were standardized courses at both the basic and clinical levels. National physicians' organizations, especially in England and the United States, had advocated academic systems that required university-linked, science-based premedical requirements and a standard curriculum encompassing anatomy, physiology, pathology, chemistry, and bacteriology, to be followed by clinical instruction and licensure based on certification of competence by external reviewers. Adequate financial support for the school was essential, and the physician's responsibility to the community was defined. In essence, Flexner, the experience of the European universities, and the British and American medical societies introduced a standardized, systematic application of the scientific method through defined medical training and drastically improved the quality of medical education.

World Medical Education At The End Of The Twentieth Century

By the late twentieth century the influence of the ‘Flexnerian revolution’ on medical education had spread globally. Although traditional healers continued to teach and practice their skills in many countries such as China, the Western, allopathic medical system and its teaching programs became the premier approach to health worldwide. Traditional public health measures continued to play a major role in the improvement of human health and in the eradication (smallpox) or decline of many diseases, but the hegemony of the medical approach and medical schools was paramount. Reinforced by an expanding governmental role in health, closer links with private industry, a dramatic increase in research funding, the discovery of new therapies and techniques (insulin, antibiotics, the structure and purpose of DNA, surgical and transplantation methods, etc.) and a generalized public perception that science-based medicine held the answer to human health problems, medicine was king. Once again, as with earlier advances in the medical education process, the expansion of the Western medical school model reflected larger changes in world society: the decline of colonialism, the rise of nationalism, the improvement in living standards, the changed venues of many types of health care, new methods of communication, and particularly the overall globalization of the world economy.

Today, the nature of many of the schools outside Europe generally follows the patterns set by former colonial overseers (Britain, Spain, France, Germany/Netherlands, or the United States), and as of late 2006 there were reported to be 1931 medical schools worldwide (FAIMER, 2006). Over 50% of the schools began instruction after 1961, and 75% of the institutions in the World Health Organization's (WHO) Africa, Eastern Mediterranean, and South East Asian Regions came into being after 1980. The United States alone created over 40 new schools between 1960 and 1980. These schools are often linked to universities, and English is the predominant language of instruction. The duration of the medical education process falls into two categories. Under the European system, approximately 6.5 years of medical education is required and premedical (biology, chemistry, physics, etc.) training is incorporated into the course of instruction. Under the American system, most medical schools require a previous bachelor's degree; the student enrolls in medical school for 4 years, but premedical courses are completed prior to entrance. In most countries, an entrance test is required; these examinations can be extremely rigorous, and only a predetermined number of applicants, based on test results, are admitted. The curriculum follows the 'two year preclinical, two year clinical' model, although there are modified and overlapping courses within individual schools. Virtually all schools require some type of national or school-administered certifying examination before graduation, but there is no basic global medical yardstick to measure medical competence. Public health-oriented courses are minimal, and there is little clinical education outside tertiary hospitals.
Curricular differences between schools depend on local conditions, the relative wealth of the school, research priorities, and the manpower requirements of specific national health systems, as well as on the interests, availability, and competence of the faculty. The requirement for some form of scholarly dissertation is variable; community service is not necessarily encouraged or required.

However, although the form of the Western medical teaching institution has been exported relatively successfully to other countries, the ideal of a universal, rigorous, standardized curriculum that ensures competency on graduation has not always been realized. In some countries, criteria for admission may be questionable, the rigor of the courses does not always meet basic requirements, and objective certification of medical knowledge may be lacking. Laboratory facilities, libraries, and methods of instruction may be substandard; proprietary medical schools of questionable quality still exist; and methods to verify that all graduating physicians meet certain standards require significant attention.

In addition to the problems of quality control and the need to meet regional quotas for physicians, the schools themselves and the international bodies created to review medical instruction are still wrestling with many other educational problems. Should the medical curriculum use the problem-based (PBL) approach or is the lecture format a better instructional tool? Should courses teach in the traditional, discipline-based way or use the integrated method? How should clinical time be allocated between tertiary care hospitals and other locations? How do students deal with the overload of knowledge, the demands and opportunities presented by the computer age, and the increasing emphasis on preventive medicine? How much research? How much memorization? What about continuing education? Disease-based or person-based? Care or cure? Problems of future specialization, research careers, ethics, and the changing demands of the outside world are affecting medical schools everywhere. And perhaps overriding specific concerns about the structure, financing, and curriculum of today’s medical school is the question of how to introduce acceptable global standards so that the credentials of physicians moving between countries may be rigorously evaluated. There is currently no global mechanism by which society, governments, or the individual patient can be assured that any given graduate meets certain qualifications of medical competence.

There is little question that the Flexnerian model of medical education has served the world well. Piggybacked on changing social values and the scientific revolution of the nineteenth century, the medical school in its current form is a system of extraordinary political, economic, and social power. As countries around the world improve their economic, political, and social base, the medical school model would appear to answer the need for improved health care. The 'good' of the current education approach is obvious: the introduction of consistent, enforceable standards; an academic, university-linked enterprise which fulfills the public's demand for a high measure of individual care, cure, and new research initiatives; a usually well-funded institution which provides a focus for many health-related initiatives; and the production of a well-trained graduate who is professionally competent and whose credentials can be evaluated. The 'bad' of the system is increasingly apparent: a 'straitjacket' approach to education, a disregard for nonmedical factors which affect human health, a failure to adapt well to global changes and population movements, an inability to deal effectively with the hypertrophied knowledge base (the schools are finding that they can no longer just 'add one more course'), 'medieval' teaching venues, new government initiatives in health, the cost of the endeavor, and, perhaps most significant, the continued investment in and support of an educational system which may no longer serve the changed health needs of society well. In many schools, there is a growing dichotomy between research and patient care agendas, a disconnect between the school and the community and often between the academic teacher and the practicing physician, and an increasingly problematic relationship between the social, patient-centered role of the medical school and its position as a profit-making enterprise.
Large amounts of money from government, private industry, and patient care sources now flow through medical schools, and business and service functions are more intertwined. In some areas, the medical ‘center’, with its myriad of conflicting missions, has engulfed the medical school. And although a certain lip service is paid to community service, in most instances the schools are divorced from the community’s needs and, indeed, are barely a part of the larger university. The training process is extraordinarily expensive and places inordinate demands on students and instructors, and the increasing specialization of graduates tends to dilute the basic curriculum.

The problems of the world’s medical schools today are not dissimilar to the situation facing U.S. and, indeed, worldwide medical education at the turn of the last century when the Flexner Report set the stage for extensive changes in medical education. Global pressures, the varied needs of specific geographical areas, the correct delineation of medical standards, and other concerns noted above are prompting some in the medical establishment to ask for the creation of some kind of a modern Flexner Report. It is beyond the scope of this research paper to review objectively the necessity for such a report or to outline the specific dilemmas of individual schools. Suffice it to say that world medical education is in substantial need of reevaluation and significant change if it is going to continue to meet successfully the changing health needs of individuals and communities.

Behind these problems within our medical education system lies a major, though rarely voiced, question: is today’s medical school – heavily supported by society – really giving society sufficient value for money? Are the schools now training the best type of health practitioner? For the most part, these practitioners are educated for action against disease rather than action for health. Some type of individual-based, cure-oriented system of medical education is essential to humankind, but our current mechanism is expensive, academically cumbersome, inefficient, and relatively unresponsive to the needs of society. A great deal of time, energy, and money is currently poured into an academic enterprise which may not be providing the kind of ‘health’ that the world now needs.

‘‘There is a lessening of public confidence in the good of the enterprise’’ (Korn, 1996) and increasing concerns about the direction and effectiveness of contemporary medicine. Changed academic approaches to health-care education are required.

Physicians For The Twenty-First Century

What then should be the role of medical schools in the twenty-first century? The dilemma that confronts medical education today is based in large part on the substantial changes occurring within world society. The rising standard of living, globalization, new definitions of health, changed health systems, a market-based economy, and many other factors are placing unprecedented demands on our health infrastructure. The 'Flexnerian' model came into being over a century ago, when disease patterns, longevity, and threats to human well-being were different. How should we be educating physicians for today's world, and what paths are open for a reorientation of our medical schools? Two possible approaches to improving the educational process (not necessarily mutually exclusive) could be considered.

If one assumes that the current science-based model with strictly defined admission guidelines, a marked emphasis on proficiency in biomedical subjects, required preclinical and clinical courses, and licensing based on a standard certification process – a concern already addressed by the World Federation for Medical Education (WFME) and WHO – best serves the health needs of the public, then relatively moderate changes within the current teaching structure should produce more competent practitioners. This approach asks that the world agree on certain medical standards and respond to the current problems within our medical education system.

In general, this initiative speaks to changes in the form rather than in the content of medical education. It focuses the majority of its resources on improvements – academic and otherwise – in medical care. It would take the current biomedically oriented, individual-based school format; introduce new academic rigor to schools worldwide; stress more intellectual discovery and critical thinking; add new tools, including increased use of informatics and simulated medical situations; emphasize problem-based learning (PBL); institute clinical work earlier; and thereby create a more modern version of the old model. For some, medical schools require more leadership and a business approach to function better; for others, evidence-based practice, systems approaches, and quality improvement need to be pursued. The permutations are legion, but the intrinsic good of the basic enterprise is not questioned; a physician-based, individual, curative-care-centered model and 'action against disease' still hold the key to human health.

The critics of such change see this approach as maintaining medical schools and their curriculum very much ‘as is’ and claim that such a path would encourage the schools to continue to follow purely scientific and technological interests without significant attention to the urgent health needs of society.

The second approach is more radical and asks serious questions not only about the basic form of today’s medical school but also as to whether the medical school as presently constituted, or even modified, is still capable of adequately preparing graduates to be significant guardians of world health. This view takes a pro-health rather than an antidisease stand in reorienting medical teaching programs. Its advocates see a community-oriented medical school closely integrated with the health needs of the community and committed to teaching social, economic, and political, as well as biomedical, approaches to health. Better health, not just better medicine, is the goal, and social accountability is the key to a good medical education. These medical schools would have much closer links to each country’s health-care delivery system. They, too, would demand certain global standards, but these standards would require evidence of social responsiveness. Professionals from a variety of disciplines – medicine, nursing, public health, etc. – would be trained to work together as teams. Traditional medicine – now partially subsumed under the concepts surrounding the holistic approach to healing – would be included in teaching programs around the world.

Some amalgamation of the two views might see a two-tiered system of health education; ‘super docs’ would receive largely biomedical training – perhaps at specified medical schools – under a modified Western, science-oriented format. They would be educated to pursue a research agenda or to become skilled at the specialized, technological, medical services needed by society today. Other physicians would enter a more community-oriented track or school that would stress the ‘5-star’ aspects of health care. These population-based physicians – not unlike the best primary care physicians today – would be trained not only as care providers, but also as decision makers, communicators, health advisors, community leaders, and team members.

Because of the centrality of medical schools in the education of our professional caregivers, the future of medical education is of great concern to all interested in the overall health of individuals and communities. There is increasing malaise today that our medical education system has gone too far in creating and sustaining a 'sovereign profession,' divorced from the health needs of the public, rather than educating a cohort of compassionate healers to serve society. Over the centuries, the medical education system has shown itself to be extraordinarily adept at absorbing and transmitting new knowledge, modifying its format to follow cultural changes, and sustaining institutions which meet its professional responsibilities and society's goals. In the past, traditional, apprentice-based medicine supported human health needs; with scientific discoveries and suitable academic institutions, the Western model then took the central role in educating physicians throughout the world. Today, largely in response to changes in society, the medical school is seeking modified forms and new directions.

Bibliography:

  1. Flexner A (1910) Medical Education in the United States and Canada. Carnegie Foundation Bulletin no. 4. New York. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2567554/pdf/12163926.pdf
  2. Foundation for the Advancement of International Medical Education and Research (FAIMER) (2006) https://www.faimer.org/
  3. Goodwin DK (2005) Team of Rivals, The Political Genius of Abraham Lincoln. New York: Simon and Schuster.
  4. Gregg A (1956) Challenges to Contemporary Medicine. New York: Columbia University Press.
  5. Korn D (1996) Re-engineering academic medical centers: Re-organizing academic values. Academic Medicine 71: 1033–1043.
  6. Boelen C (2002) A new paradigm for medical schools a century after Flexner’s report. Bulletin of the World Health Organization 80: 523–524.
  7. Bonner TN (1995) Becoming a Physician. Medical Education in Britain, France, Germany and the United States 1750–1945. Baltimore, MD: Johns Hopkins University Press.
  8. Boelen C and Boyer M (n.d.) A View of the World’s Medical Schools – Defining New Roles. http://www.the-networktufh.org/publications resources/furtherreading.asp (accessed Sept. 2007).
  9. Cooke M, Irby D, Sullivan W, and Ludmerer K (2006) American medical education 100 years after the Flexner report. New England Journal of Medicine 355: 1339–1344.
  10. Eva KW (ed.) Medical Education (periodical). Edinburgh, UK: Blackwell Publishing.
  11. Garrison FH (1921) An Introduction to the History of Medicine. New York: W. B. Saunders.
  12. Greenspan RE (2006) Medicine. Perspectives in History and Art. Alexandria, VA: Ponteverde Press.
  13. Harden R (ed.) Medical Teacher (periodical): An International Journal of Education in the Health Sciences. Dundee, UK.
  14. Kanter SL (ed.) Academic Medicine (periodical): Journal of the Association of American Medical Colleges. Washington, DC: AAMC.
  15. Ludmerer K (1999) A Time to Heal. American Medical Education from the Turn of the Century to the Era of Managed Care. New York: Oxford University Press.
  16. Sajid AW (ed.) (1994) International Handbook of Medical Education. Westport, CT: Greenwood Press.
  17. Starr P (1982) The Social Transformation of American Medicine. New York: Basic Books.