Medical Student JAMA
JAMA. 2004;291(17):2139-2140. doi: 10.1001/jama.291.17.2139

The Flexner Report and the Standardization of American Medical Education

Andrew H. Beck
Brown Medical School, Providence, RI

If the sick are to reap the full benefit of recent progress in medicine, a more uniformly arduous and expensive medical education is demanded.—Abraham Flexner1

Medical education in the United States today is strikingly standardized and demanding. It was not always so. Prior to the widespread implementation of educational reforms, medical training was highly variable and frequently inadequate. It was not until the early decades of the 20th century that a "uniformly arduous and expensive" system of medical education was instituted nationally.

In the 19th century, most medical education in the United States was administered through 1 of 3 basic systems: an apprenticeship system, in which students received hands-on instruction from a local practitioner; a proprietary school system, in which groups of students attended a course of lectures from physicians who owned the medical college; or a university system, in which students received some combination of didactic and clinical training at university-affiliated lecture halls and hospitals. These medical schools taught diverse types of medicine, such as scientific, osteopathic, homeopathic, chiropractic, eclectic, physiomedical, botanical, and Thomsonian.2 In addition, wealthy and industrious medical students supplemented their education with clinical and laboratory training in the hospitals and universities of Europe, primarily in England, Scotland, France, and Germany. Because of the heterogeneity of educational experiences and the paucity of licensing examinations, physicians in America at the turn of the 20th century varied tremendously in their medical knowledge, therapeutic philosophies, and aptitudes for healing the sick.3-4

Throughout the second half of the 19th century, the American Medical Association (AMA) lobbied for the standardization of American medical education. These efforts were largely unsuccessful, both because political traditions in America dissuaded national regulation of professions and because the American public and much of the medical profession were not convinced that any particular brand of medical education was significantly superior to any other. "The great mass of the public," declared the medical educator John Shaw Billings in 1891, "know little and care less about the details of professional education . . . . The popular feeling is that in a free country every one should have the right to follow any occupation he likes, and employ for any purpose any one whom he selects, and that each party must take the consequences."5

However, by the turn of the 20th century, a series of scientific breakthroughs had altered the values held by the public and the medical profession: clinical and laboratory research had exposed the irrationality of "heroic" treatments (such as blistering, bleeding, and purging) and had proven the therapeutic efficacy and rational scientific basis of modern practices, such as antiseptic surgery, vaccination, and public sanitation. Most of the public and virtually all physicians now believed in the superiority of scientific medicine.2 Educators at leading US medical schools now contended that the path toward mastering the analytical skills required to practice scientific medicine lay not with the memorization of accepted truths but with the systematic application of the scientific method throughout medical training. They asserted that students should spend most of their time at medical school actively engaged in laboratory experimentation and hands-on care at the bedside.3

The AMA sought to eliminate schools that failed to adopt this rigorous brand of systematized, experiential medical education. "It is to be hoped that with higher standards universally applied their number will soon be adequately reduced, and that only the fittest will survive," the editors of JAMA declared in 1901.6 In 1904, the AMA created the Council on Medical Education (CME) to promote the restructuring of US medical education. At its first annual conference, the CME outlined its 2 major reform initiatives: standardization of preliminary education requirements for entry into medical school and national implementation of an "ideal" medical curriculum, consisting of 2 years of training in laboratory sciences followed by 2 years of clinical rotations in a teaching hospital.7 In 1908, the CME planned to undertake a survey of medical education in the United States to promote the organization's reformist agenda and to hasten the elimination of medical schools that failed to adopt the CME's standards. The CME requested that the Carnegie Foundation for the Advancement of Teaching lead the undertaking. Carnegie Foundation president Henry Pritchett, a staunch advocate of medical school reform, chose the schoolmaster and educational theorist Abraham Flexner to head the survey.8-9

Over the course of 18 months, Flexner visited all 155 US medical schools. He examined 5 principal areas at each school: entrance requirements, size and training of the faculty, size of endowment and tuition, quality of laboratories, and availability of a teaching hospital whose physicians and surgeons would serve as clinical teachers. Flexner's report showed that although most of the nation's medical schools claimed to adhere to progressive, scientific principles of medical education, only a very few had the financial resources, laboratory and hospital facilities, and highly skilled teaching staff necessary to apply this demanding form of education. Flexner noted, "We have indeed in America medical practitioners not inferior to the best elsewhere; but there is probably no other country in the world in which there is so great a distance and so fatal a difference between the best, the average, and the worst." He maintained that to standardize the quality of all medical schools to that of America's "best" schools, the nation must stop wasting its social and economic resources on financially strapped commercial schools that were unable to provide the costly, time-consuming, economically unprofitable ideal standard of medical education being offered at the leading US medical schools: "The point now to aim at is the development of the requisite number of properly supported institutions and the speedy demise of all others."1

For decades, physicians had promoted medical education reform as a means to increase professional status. Flexner's unique contribution was to promote educational reform as a public health measure. He argued that the business ethic that governed proprietary medical schools was incompatible with the progressive academic values necessary for socially useful medical education. "Such exploitation of medical education," Flexner declared, "is strangely inconsistent with the social aspects of medical practice. The overwhelming importance of preventive medicine, sanitation, and public health indicates that in modern life the medical profession is an organ differentiated by society for its highest purposes, not a business to be exploited."1 He maintained that the state government is the proper instrument for regulating medical education, because social welfare is inextricably linked to the quality of the nation's physicians: "The right of the state to deal with the entire subject in its own interest can assuredly not be gainsaid. The physician is a social instrument."1

In the 1910s, state licensing boards began to force medical schools across the United States to implement heightened admission standards and stricter curriculum requirements.10 In 1912, a group of licensing boards formed the Federation of State Medical Boards, which voluntarily agreed to base its accreditation policies on academic standards determined by the AMA's CME. Consequently, the CME's decisions "came to have the force of law."11 During these same years, philanthropic foundations began making large contributions to promote medical research and education at a select group of leading medical universities.12-13 By the 1930s, the combined efforts of state licensing boards, philanthropic foundations, and the AMA's CME resulted in the eradication of America's proprietary medical colleges and the standardization of the laboratory- and hospital-based research medical university model that Flexner advocated in his report.3

Although these reforms raised the quality of medical education in the United States, they concurrently caused a disproportionate reduction in the number of physicians serving disadvantaged communities: most small, rural medical colleges and all but 2 African American medical colleges were forced to close, leaving in their wake impoverished areas with far too few physicians.11,14 Furthermore, the increased entrance requirements and extended course of study now required to become a physician promoted "professional elitism" and inhibited the economically underprivileged from pursuing careers in medicine.15

Medical schools continue to struggle to overcome these untoward effects of the standardization of American medical education.16-17 To the present day, all accredited US medical schools strive to apply Flexner's "uniformly arduous and expensive" brand of medical education, though the rising costs of health care have forced many schools to make curricular compromises and to form corporate alliances as they attempt to balance academic ideals with economic and social responsibilities.18-21

References
