Students of history will find meager resemblance between the medical schools of the 1800s and their modern counterparts. Low standards were rampant in 19th-century medicine, and the term "quack doctor" gained currency as a pejorative during this period. Even highly ranked Harvard (founded in 1782) and Yale (chartered in 1810) were forced to reinvent themselves to survive the purge of sub-par medical schools, some of which an infamous study described as "indescribably foul." Medical education was about to experience a radical transformation.

The Proprietary and Apprenticeship System

The proprietary medical school lasted well into the early 1900s.

An education in Scotland or London -- the destinations of choice -- entailed an expensive overseas journey few aspiring doctors could afford. The market responded with the proprietary school, a churning system equivalent to the diploma mills of today. "A school that began in October would graduate a class the next spring," writes Abraham Flexner. Academic prerequisites (premed) were nonexistent; entry was determined by who could pay, and the fees went straight to the owner-physician. These for-profit proprietary medical schools flourished during the 1800s, while the apprenticeship model -- on-the-job training, in which students were paired with a practicing doctor -- remained the established route into the profession.

Medical School 1840s Style

Oberlin College and Vassar were among the first colleges to admit women.

The didactic method was the rage of the day. Instead of hands-on clinical training, students were taught by rote through a series of lectures -- four each day, often clocking in at eight hours total. The entire course, from admission to graduation, was two 16-week semesters, one in winter and the second in spring, without a shred of difference: both terms covered the same material. A peek at a sample syllabus reveals core work in biology, pathology, chemistry, and obstetrics, but these classes weren't even graded. A systematic course of study with nominal variation from college to college simply didn't exist, and there was no medical license issued or even an exit exam. During this educational blight the American Medical Association was founded at a doctors' congress in 1847.

The Johns Hopkins Model

As late as 1891 educator John Shaw Billings believed the public didn't care how their doctors were educated.

In 1893 the Johns Hopkins Medical School debuted a model that would fundamentally change the way doctors were educated. Admission now required a college degree. Rote learning was retired, replaced with laboratory work, clinical training, and core science courses. The curriculum expanded to four years. The research arm became prominent and integrated into the medical school's vision, and the model of a medical school aligned with a research department was replicated across the country.

Standardization: The Flexner Report

When Abraham Flexner issued his scathing report in 1910, Johns Hopkins was the ideal to emulate. Sponsored by the Carnegie Foundation and endorsed by the American Medical Association, the Flexner Report examined 155 medical schools -- the first survey of its kind. His exhaustive catalog included lurid descriptions of "a room containing a putrid corpse, several of the members of which had been hacked off," findings that sent shock waves through the medical profession. His recommendations -- uniform admissions, coursework, and graduation requirements, with licensing regulated by the state -- were broadly adopted. The demise of the proprietary medical school soon followed.