The Great Conversation:
Medicine

By Gabriel Blanchard

Medical science has changed immensely in the last few thousand years—but not always in the ways we might assume.

The word “doctor” was originally a synonym for “teacher.” It represented a level of mastery in a subject that would qualify the holder to teach that subject to others; in academic contexts this is still what it means, but of course, when we tell someone to go to the doctor, we do not mean someone with an advanced degree in archæology or linguistics—we mean a doctor of medicine.

Curiously enough, and rather like the term philosophy, the category of medicine also used to be defined much more broadly. After mastering the seven liberal arts (grammar, logic, rhetoric, arithmetic, geometry, astronomy, and music), university students in the Middle Ages and the Renaissance would typically turn to one of three specializations: theology, law, or medicine. Theology covers much the same ground now that it did then; but law embraced human activity in general, encompassing fields like economics and moral philosophy that we now consider related but distinct; and medicine, while it included “physic” (i.e., the subject matter of physicians), was almost a synonym for the natural sciences, especially, though not solely, the life sciences.

The study of medicine as we mean the term today is, however, exceedingly ancient. Hippocrates of Cos, who lived in the late fifth and early fourth centuries BC, was already heir to more than a thousand years of medical theory and practice, stretching at least as far back as the Egyptian Old Kingdom sage Imhotep. Hippocrates is best remembered for the Hippocratic Oath, a vow of ethical conduct still used by some medical schools to this day—typically in a modified form that does not invoke the train of health-related pagan divinities. Ideas like doctor-patient confidentiality and a categorical refusal to do harm are already embodied in the oath.

Treatments could also be surprisingly sophisticated in the ancient world, and even in prehistoric times: the earliest evidence of trepanation (removing part of the skull to expose and treat intracranial problems) is about seven thousand years old. Surgeries to treat kidney stones and even cataracts were common enough to merit mention in the Code of Hammurabi in the eighteenth century BC, and sources from ancient India and pre-colonial Africa indicate similar knowledge. Sanitary practices left something to be desired by modern standards; yet even here, the ancients had a rudimentary knowledge of antiseptics, and medieval medicine, contrary to popular belief, recommended frequent bathing for its health benefits.

              What, is Brutus sick,
And will he steal out of his wholesome bed,
To dare the vile contagion of the night
And tempt the rheumy and unpurged air
To add unto his sickness?
(Shakespeare, Julius Caesar, II.i)

But while medical ethics have remained fairly consistent across the centuries, and many practices have a long history behind them, medical theory has changed substantially. From antiquity until well into the modern era, most illnesses were thought to be caused by an imbalance of the “four humors,” a classification of bodily fluids (blood, yellow bile, black bile, and phlegm) that supposedly governed both physical and mental health. These imbalances were attributed to all sorts of causes—from toxic air emerging from rotting organic matter (“miasma”) to unfortunate astrological conjunctions of the planets. Attempts to rebalance the humors frequently led to useless or even disastrous practices, like bloodletting, which almost invariably harmed the patient.

Increasingly precise knowledge of the human body allowed slow but steady advances in medicine, particularly with the study of microorganisms following their discovery in the seventeenth century. Fr. Athanasius Kircher, a German Jesuit, wrote in 1646 of his study of various substances under a microscope, “Who would believe that vinegar and milk abound with an innumerable multitude of worms?”, evidently describing microorganisms of some kind; the Dutch scientist Antonie van Leeuwenhoek recorded similar observations in the 1670s. Over the next two hundred years, the germ theory of disease took shape.

One of the most famous weapons against infectious disease was, of course, vaccination—which in fact preceded germ theory, though the theory would later explain why it worked. Inoculating healthy people with material taken from smallpox victims was an older practice still, recorded in China from at least the sixteenth century, but it carried a serious risk of full-blown infection. Edward Jenner, an eighteenth-century English doctor, determined that it was possible to achieve equally good protection by inoculating patients with a related but milder disease—specifically, using cowpox to ward off smallpox (hence the term vaccine, from the Latin vacca, “cow”). Another major advance came from Louis Pasteur, who developed a method of using heat to kill most of the microorganisms in food and drink, a method we still refer to as pasteurization. Thanks to this process, food and drink which might otherwise spoil in a day or two remain safe to consume for weeks or even months at a time. Amid the ongoing challenges of the COVID-19 pandemic, we have good cause to celebrate men like Jenner and Pasteur.

Suggested reading:
Hippocrates, Ancient Medicine
Avicenna, The Canon of Medicine
William Harvey, On the Motion of the Heart and Blood in Animals
Daniel Defoe, A Journal of the Plague Year
Edward Jenner, An Inquiry into the Causes and Effects of the Variolae Vaccinae
C. S. Lewis, The Discarded Image

___________________________________________________________________________________

If you enjoyed this piece, try this author profile of Jorge Luis Borges, this “Great Conversation” post on the ideas of memory and imagination, or this student essay on sibling relationships in Pride and Prejudice. And be sure to check out our weekly podcast, Anchored, hosted by our CEO and founder Jeremy Tate.

Gabriel Blanchard is a proud uncle of seven nephews, a freelance author, and an editor and staff writer for CLT. He lives in Baltimore.
