Medical Humanities Newsletter
The Bioethics Center, University Health Systems of Eastern Carolina
Department of Medical Humanities, The Brody School of Medicine at East Carolina University
 
 
 
The Historical Origins of Medical Malpractice Litigation
Kenneth De Ville, Ph.D., J.D.

Before the 1830s, medical malpractice suits were rare.  That had all changed by mid-century.  A flood of malpractice suits appeared in the late 1830s.  By 1860 John Elwell, a lawyer-physician who authored a book on malpractice, could claim that "There can hardly be found a place in the country where the oldest physicians in it have not been actually sued or annoyingly threatened."  Many of the factors that underlay America's first malpractice crisis persist into the present.  Historically, medical malpractice suits have resulted from a combination of short-term, topical causes and long-term cultural preconditions.  Just as importantly, the introduction of new technologies and procedures has played a consistent and central role throughout the history of the litigation.

Contemporary observers accurately identified the short-term, topical causes of this sudden outburst of litigation.  The profession, for example, had suffered a clear loss of status since the latter half of the 18th century, and physicians speculated that this decline contributed to the unprecedented spate of suits.  Moreover, profoundly degraded educational standards in the first half of the 19th century debased the profession in the eyes of the public and left practitioners ill-prepared to deal with the complexities of the human body.  An increased number of medical schools produced a flood of physicians.  When physicians were scarce, suits were unlikely, since an accusation of wrongdoing might drive the community's only educated physician to another town.  By 1850, when one physician left a community, another was invariably waiting to take his place.  Finally, physicians plausibly believed that the virulent anti-professionalism of the period also contributed to their legal woes.

Long-term developments, however, played a more important role in the rise of litigation.  The first of these was a transformation in Americans' belief in divine providence.  Most Americans in the 17th and 18th centuries believed that physical misfortune was an explicit expression of divine will, inflicted either to test or to punish.  Humble acceptance was the proper response to misfortune, not a lawsuit.  This attitude affected both potential plaintiffs and the juries that would weigh their claims.  The first half of the 19th century was a period of dramatic and rapid religious transformation.  A variety of religious reform movements stressed human perfectibility over human depravity.  A greater portion of society began to believe that God observed, but did not intrude on, the ordinary affairs of day-to-day life.  At the same time, scientific progress strengthened the growing belief that physical ills could and should be remedied on earth.  These changes allowed, and even led, individuals to look for earthly causes and human culpability wherever they saw human suffering.  Without this religious transformation, widespread suits would have been unthinkable.

The second cultural transformation that allowed medical malpractice claims to flourish was the dissipation of a community ethos that tended to suppress lawsuits.  Cultural and community attitudes define socially acceptable ways to deal with grievances.  Legal anthropologists have suggested that communities characterized by face-to-face relationships and populated by economically self-sufficient farmers and merchants, such as those of 18th century America, considered it inappropriate to demand compensation for misfortune.  Lawsuits disturbed the peace of the community, violated religious-based community strictures against suing for misfortune, and threatened to rob the community of a valuable social resource, its physician.

Although some changes occurred as early as the 18th century, the most profound disruption of the traditional community took place in the early 19th century, the same period in which the first malpractice crisis arose.  There was a clear movement from the corporate communalism of colonial America to the more anonymous individualism of 19th- and 20th-century America.  As individualism became a greater feature of American life, the community stigma against suits had less influence.  Without this change, individuals would not have felt free to sue on a wide scale.  Even today, lawsuits tend to be more numerous in urban communities and less numerous in rural ones.

The paradoxical role of technology in generating medical malpractice litigation illustrates a cycle that has recurred with other medical procedures in the late 19th and early 20th centuries.  Three-quarters of the suits at mid-century resulted from fracture and dislocation cases.  In 1800, the standard of care for severe fractures and dislocations had been amputation.  Amputations, however, did not generate a large number of suits.  Potential claimants typically had no limb left to present as evidence to experts and, although many amputees died, legal doctrine of the period significantly restricted wrongful death actions.  Most importantly, expectations for severe orthopedic injuries were low.  Amputation, and often death, was the norm.

By the late 1830s, the profession had developed a dazzling array of new orthopedic techniques that allowed physicians to save rather than amputate limbs.  Amputation suddenly became less acceptable, and saving limbs in severe orthopedic injuries had become the standard of care by 1850.  Medical treatises described fracture treatment as a relatively mechanical procedure in which physicians and patients could expect perfect cures.  This orthopedic revolution fostered inflated expectations in both the profession and the lay public.  The new orthopedic treatments, however, required more knowledge, skill, and care than the old ones.  And while physicians could now more frequently save limbs, the treated fractures and dislocations usually left visible, permanent injuries, such as shortened or deformed limbs and frozen joints.

By the late 19th century, many of the factors that contributed to the malpractice crisis of the 1830s and 1840s had dissipated.  In the first half of the 20th century, anti-professionalism diminished, medical education improved, and the status of physicians increased dramatically.  Despite the disappearance of most of the topical causes of the early litigation, suits continued as new inciting factors arose to take the place of the old.  Status-based resentment, for example, was gradually replaced by a species of class-based resentment as physicians' income slowly increased.  Similarly, a more sophisticated plaintiffs' bar, media coverage, insurance issues, specific legal changes, and attorney advertising have all, at various historical moments, affected the tendency to sue physicians.

Moreover, the cultural preconditions for suits did not abate, but matured.  As the 20th century unfolded, communities became more heterogeneous, individualism flourished, and communal restraints against suing weakened.  These transformed communities are less likely to suppress litigation.  Over the last 150 years, Americans have become more secularized, more convinced that humans can improve their lives, and increasingly convinced that there must be a remedy or a solution when something goes wrong.

Just as importantly, medical progress has continued to play a central role in malpractice litigation.  Physicians' experience with orthopedic treatment in the mid-19th century illustrates a cycle that recurs as medicine advances in other areas of care.  According to Mark Grady, medical innovation “captures” what was previously natural risk and transforms it into medical risk.  Typically there are few suits until a particular technology is performed frequently and both the profession and the public believe that it generates predictable results and substantial benefit.  Dramatic medical advances are invariably followed by heightened, and frequently excessive, professional and lay expectations [1].  As Grady explains, improved procedures typically require greater learning, skill, and care.  As a result, technological advance carries with it a greater opportunity for error or accident.  The cost of error associated with a new treatment is also higher because its potential benefit exceeds that of the treatment it replaced.  Suits arise, in part, when those newly acquired expectations are frustrated.

For example, by the early 20th century body cavity surgery was an increasingly common, though unevenly successful, procedure.  Despite the growing number of procedures and mixed results, surgical malpractice suits did not increase dramatically.  Surgeons could boast of more numerous and noteworthy successes only after the advent of sulfa drugs to fight deadly infections in the 1930s, transfusions to counter the effects of surgical shock, the development of residency programs to train surgeons, the refinement of aseptic practices, and the development of more reliable and appropriate instruments.  After that time, suits arising out of surgical treatment increased precipitously, overtaking orthopedics as the most common source of medical malpractice suits by the 1940s.

Likewise, obstetrical care was the source of relatively few medical malpractice claims until the 1970s and 1980s.  Maternal and fetal risk remained considerable until mid-century but declined rapidly thereafter.  By the 1970s, new drugs, medical regimens, and technologies had generated dramatic improvements in the safety of pregnancy, labor, and delivery.  Expectations rose, which contributed to disappointment and resentment over tragic outcomes.  By 1985, obstetric claims represented 10% of all malpractice suits.

Consider, finally, the diagnostic advances of the last half of the 20th century, which have improved care but also engendered the expectation that life-threatening and debilitating illnesses can be foreseen and thwarted by early intervention.  These heightened, and sometimes ill-informed, expectations are analogous to those generated by innovations in fracture treatment in the mid-19th century, surgery after the second third of the 20th century, and obstetrics after the 1970s.  As with those other therapies, a dramatic and sudden increase in malpractice suits has followed a period of dynamic improvement in a particular medical modality, in this case diagnosis.  Currently, the fastest growing medical malpractice allegation is the failure to diagnose an existing illness or injury.

The basic legal elements of medical malpractice actions have not changed appreciably in over 100 years.  Physicians are expected to possess and use the degree of knowledge, skill, and care exhibited by a reasonably prudent physician in the same or similar circumstances.  But while the wording of the physician's legal duty has not changed significantly, technological development has dramatically altered the essential content of that standard.  The knowledge, skill, and care required to practice responsibly in a late-20th-century medical environment far outstrip what would have been required of a physician at the turn of the last century.

Overall malpractice rates increase because a greater number of individual patients sue physicians for the use of, or failure to use, a particular technology, procedure, or therapeutic approach.  The most visible increase in 20th-century malpractice rates has occurred in the last thirty years.  This increase may be attributed in part to specific factors such as the demise of the locality rule, attorney advertising, and a more specialized and organized plaintiffs' bar.  But it should also be traced to the daunting number of medical technologies that reached widespread and mature use during this period, generating both a public and a professional belief in predictable results and substantial benefit.  While only a handful of such technologies were available to physicians in the mid-19th century, dozens more have been incorporated into near-routine use in the last thirty years.

(This article is based on K.A. De Ville, Medical Malpractice in Nineteenth-Century America: Origins and Legacy, NYU Press, 1990; and K.A. De Ville, "Medical malpractice in twentieth century U.S.: the interaction of technology, law and culture," International Journal of Technology Assessment and Health Care 1998; 98, 1: 99-103.)

REFERENCES

1.  Mark Grady, "Why are people negligent?," Northwestern University Law Review 1988; 82: 293-334.