Why the ACA Won’t Do Squat: Part II — The History of American Physical Medical Practices
For Part I of this series, go here.

 

In Part I, we discussed who the players are in the modern American healthcare system and a little of the history behind them. In this part, we’re going to dive deep into the history of American medical practices, starting with the state of affairs during the pre-Revolutionary era and ending with things as they stood just prior to the 2008 Presidential election and the subsequent passage of the ACA. I do this so that we can be certain everyone is on the same page. Again, in this part, I’m making no judgment calls on whether something was “good” or “bad,” nor am I suggesting an ideal way for things to be. Once again, I’m not an ideological purist. I’m a pragmatist at my core, more interested in understanding how things really work (not always how they are “supposed” to work) and in finding a method that brings about an optimal result even when, at certain margins, that result might seem “unfair.”

 

So, let’s hop in this TARDIS I nicked from the Doctor when he wasn’t looking (silly Time Lord) and set the coordinates for the British American colonies in the mid-eighteenth century (the 1700s).

 

Medicine in the American colonies (and in most of the rest of the Western world) in the 1700s was, to our modern eyes, little different from superstition. Germ theory was not even a dream in the mind of the most forward-thinking doctor. Sterilization of instruments simply did not exist. Illnesses were blamed on bad air, bad humors in the blood, or even witchcraft. The only medical treatments from this era we would recognize as useful were wound treatment (binding and stitching), amputation (not always conducted under anesthesia), and childbirth (which was done in a most barbaric and traumatizing manner!1). During the pre-Industrial era, if a person fell ill with strep throat, typhoid fever, scarlet fever, chicken pox, smallpox, measles, mumps, rubella, tetanus, polio, or pretty much any illness that involved a fever, they were believed to have taken “bad air.” The normal course of treatment was for a doctor to bleed them by opening a vein and letting out the “bad blood” that had been created by the “bad air” or “bad humor.” Leeches were also employed to help rid the sick person of these bad humors in the blood. When that treatment didn’t work — and especially in cases of consumption (what we now call tuberculosis) — medical professionals suggested that a change in climate was necessary2. This is why so many people would move from a colder climate to a warmer and wetter one. The change might prolong their life for a few months or years, since the body no longer had to battle the chill as well as the disease, but it brought about no cure. The ordained clergy of the church (Catholic priests or Protestant ministers) were often called upon to attend to the sick or dying in hopes that a benevolent God would show mercy through their prayers and petitions and restore the sickly to health.
In other cases, trying to “cleanse” the air before it entered the body using perfumed handkerchiefs tied around the mouth and nose was considered a good form of treatment. And when these handkerchiefs were soaked in brandy or another alcoholic substance3, this might actually have helped — albeit more by accident than by knowledgeable design.

 

Doctors and their apprentices during this era also depended heavily on “illicit knowledge” in order to advance their art. Autopsies and the dissection of corpses were all but completely forbidden by Christian institutions. However, some men, desiring to learn more and to make medicine into a science, dug up cadavers and dissected them in an attempt to learn more about the inner workings of the human body4. Their teachings were handed down through the universities and the master-apprentice system. It was from here that we see the beginnings of an understanding of the human body and its organs that would later play a vital part in surgical procedures. Also, in the latter part of this era and into the Industrial Revolution, doctors were willing to experiment a bit (not always ethically, though). Treatments and surgeries for conditions like clubfoot were tried until someone hit on something that seemed to work more often than not. Bear in mind, again, that sterilization and germ theory did not exist. Many good doctors were discouraged when their procedures wound up resulting in a full amputation or death from infection, because they did not understand that they or their instruments had contaminated the site! Many doctors also felt at a loss to explain the deaths of laboring women or their children even though the doctors had done everything “right.” Often, the doctor had been visiting or working with an infected patient or corpse and went immediately to the child-bed without washing his hands. Medicine in this era was primitive by our modern standards. Remember that before you judge!

 

The Industrial Revolution brought with it not just a tendency of people to flock to a city or factory area for work but also a slightly better understanding of sanitation and sewage/water treatment5. Once again, these standards were barbaric compared to our own, and, in some ways, epidemiology6 had been known and studied prior to this era, but the Industrial Revolution did put doctors and universities together with a lot of people and contagion, planting the seeds for later understanding of disease, germ theory, and the other bases of modern medicine. During the latter part of the Industrial Revolution, especially during and after the American Civil War, doctors became more adept at performing amputations and understood a bit better the stress that surgery inflicted on the human body. Though this was mostly hard-won knowledge gained by Southern doctors forced to perform amputations without opium or any other painkillers, the understanding that pain played a role in survival and recovery was part of this era. Another area of hard-won knowledge from the late 1800s was childbirth. Forceps and an understanding of a woman’s hip width (and, thus, the practice of advising women with narrow hips not to have children) contributed to a slightly higher rate of survival without damage in childbirth. As the Industrial Era progressed into the twentieth century and germ theory, pasteurization, and other sterilization practices became more widely adopted7, medicine began to more closely resemble what we know today. Vaccines became more widespread as well. Penicillin also became better understood and its usage more widespread8 during this era.

 

World War I was the first modern war in which more casualties were inflicted by combat than by disease9. This was in part due to the better understanding of germ theory, better use of quarantine, better design of camps and sanitation, and more effective treatments. After World War I, development of vaccines continued and eventually exploded in the post-World War II era, resulting in the eradication of smallpox, the near eradication of polio, and the removal of measles, mumps, rubella, whooping cough, tetanus, and other communicable diseases as deadly killers. By the latter part of the twentieth century, these diseases — once considered a common part of childhood — had become so rare that vaccination side effects were viewed as more deadly than the threat of these horrific diseases themselves10. Indeed, in some parts of the world, vaccination rates have fallen enough that herd immunity11 no longer functions, and innocent, un-inoculated children die due to misinformed fears that vaccines cause autism12.

 

Several other major changes in medical practices took place in the twentieth century. The first set of changes involved sterilization of both medical instruments and care-givers (via heating, boiling, or the use of isopropyl alcohol and other cleansing and antibacterial agents), more effective antibiotics, better hospitalization and quarantine practices for outbreaks, germ theory, and the ability of doctors to dissect human corpses in order to better understand the human body itself. Another major change was the introduction of medical “insurance.” This privilege was first available to the rich and was more akin to an understanding between the patient and the doctors, hospitals, or other providers that the patient or his estate would provide recompense for treatments given. As medicine advanced as a science during the 1900s, resulting in the development of laboratories for testing and identification, insurance moved to cover these services. When the Second World War broke out, women flocked to the factories, and companies were forced to find non-financial ways to attract workers — such as medical insurance or pension plans — medical insurance became more widespread13 in the United States.

 

During the latter half of the 1900s, insurance companies were forced to explore ways to reduce costs. With the introduction of the government into the medical market through Medicaid and Medicare (which resulted in its own set of problems14), insurers who had relied on group policies needed to find ways to cut costs. This, broadly speaking, resulted in the creation of “networks” for doctors, Health Maintenance Organizations, and Preferred Provider Organizations15. Costs to the patient were masked by insurance agencies, the government, and the doctors’ long-established practice of not posting prices. As the 1900s came to a close and the 2000s began, some aspects of modern medicine that were not covered by insurance, such as LASIK and similar procedures, were forced to compete on price as well as on satisfaction with outcomes16. Some private hospitals have even begun to post their procedural costs, such as one Oklahoma City surgery center17.

 

So, with all these changes, what impact has the ACA actually had? We’ll explore that a bit in the next part. Do bear in mind, however, that the ACA is a very new law and that there are several controversies over it above and beyond the partisan politics. The next installment will deal with those as well as the history behind the expansions of power that let the current administration think and believe the way it does regarding law and legislative process.

 

— G.K. Masterson

 


1 Childbirth in Early America. Additionally, many midwives of this era left the mother alone after the baby was out of the birth canal. The mothers were forced to bear their own placentas and dispose of them without any assistance (Lying In: A History of Childbirth in America).

 

2 Consumption, the great killer

 

3 Cholera

 

4 Dissection — History

 

5 Epidemiology — History

 

6 Epidemiology

 

7 Germ Theory of Disease: Louis Pasteur

 

8 History of Penicillin

 

9 World War I Casualties

 

10 Anti-Vaccination Movement

 

11 Vaccine Opt-Outs Causing Breaks in “Herd Immunity,” LA Times

 

12 Autism and Andrew Wakefield

 

13 Why the ACA Won’t Do Squat: Part I — Learning the Players

 

14 Why the ACA Won’t Do Squat: Part I — Learning the Players

 

15 HMOs vs. PPOs – What Are the Differences Between HMOs and PPOs?

 

16 Why the ACA Won’t Do Squat: Part I — Learning the Players

 

17 Oklahoma City hospital posts surgery prices online; creates bidding war