Exploring Illness Across Time and Place


The Late 20th Century American Medical Worldview

The maintenance and preservation of health was central to daily life in the United States in the late twentieth century. Concerns about diet and physical fitness were regular topics of media discourse, as were stories about medical studies and revolutionary new treatments for disease. Roughly five percent of the labor force worked in the health care industry (1). Urban residents heard ambulance sirens on the streets and morning radio reports of "injury accidents" causing traffic snarls. The nightly news featured political figures debating health insurance crises, followed by prime-time dramas set in emergency rooms and sponsored by multi-national pharmaceutical companies advertising expensive new drugs. While more Americans had access to scientifically effective medical treatment than ever before, tens of millions also utilized herbal and alternative therapies not sanctioned by medical professionals.

A reductionistic and mechanistic conception of the body dominated within the medical community. In ascending order of complexity, bodies were biological entities composed of cells, tissues, organs, and organ systems. The invasion of external pathogens was believed to cause many illnesses. These pathogens included bacteria, viruses, environmental chemicals, and radiation. Pathogens damaged tissues, thus preventing organs from functioning correctly, resulting in the patient's experience of symptoms of disease. Other illnesses were believed to be caused by one part of the body attacking another part for unknown reasons (auto-immune diseases), genetic mutations (various cancers), or chemical imbalances (depression, traumatic shock). Chemical imbalances were increasingly invoked to explain mental and psychological illnesses in the last two decades of the twentieth century. Emotional trauma was also cited as a cause of mental illnesses, sometimes complementing, sometimes contradicting, chemical explanations.

Medicine's mechanistic understanding of the body and scientistic ideology generally excluded moral judgments from the official domain of medical practice. A doctor's duty was to heal the sick, not judge their character. However, moral judgments remained a powerful part of the experience of certain illnesses and treatments. Doctors often admonished patients regarding particular behaviors or physical characteristics, such as obesity, smoking, or unprescribed drug use. And as Susan Sontag described in various essays, society continued to use moral metaphors to evaluate people living with cancer and AIDS (2). These moral judgments were particularly powerful in shaping the treatment of marginalized individuals (especially Haitian immigrants, homosexuals, and IV-drug users) suffering from AIDS in the 1980s.

Medicine's reductionist understanding of the body also contributed to the development of highly efficacious treatments for some illnesses. Antibiotic drugs, organ transplantation, and polio vaccination emerged from notions of removing diseased parts or isolating and targeting harmful invaders. While advances in nutrition, clean water, and more sanitary living conditions were primarily responsible for the long life spans experienced by Americans, medical practice provided dramatic interventions that saved patients from formerly deadly infections and epidemics.

The dramatic efficacy of medical treatment contributed to the considerable prestige medical doctors and biomedical researchers had in American society. Medical schools were attractive options for successful students, and many schools were able to charge annual tuition well in excess of median household income levels. Unlike professors or lawyers, medical doctors were specially designated in residential phone book listings. Doctoring was a highly educated (and highly paid) profession, usually requiring eight to twelve years of specialized training and education beyond the secondary level. Extensive specialization within medicine contributed to doctors' prestige. Doctors were called upon to give privileged testimony in court cases. Likewise, doctors often had privileged access to media and information distribution resources, offering comments on medical advances, ethical dilemmas raised by new technologies, and issues regarding public health and policy. Doctors were also considered to be desirable as mates and spouses, despite the long and irregular hours that hospitals and clinics often required of them.

Other health care workers occupied a range of social statuses. Hospital administrators were well-paid, distinguished white-collar professionals, but were rarely accorded the social deference given to doctors. Nurses, while often honored for their empathy and care, were accorded significantly less pay and status than doctors. As other professions opened to women in the late twentieth century, nursing shortages became increasingly common. Male nurses remained an acceptable target of humor and mockery (3). Nurse aides, orderlies, receptionists, and janitorial staff all ranked lower in the medical hierarchy, even though their labor was essential to increasingly complicated and ever larger medical institutions.

As evidenced by the differences in status between doctors and nurses, economic autonomy and workplace authority were important components of social prestige. However, many doctors felt these values were being eroded by the increasing role of the state and of large insurance and health maintenance companies from the 1960s onward. The emergence of new economic and regulatory structures was a crucial aspect of medical practice in late 20th century America. Health care was an enormous part of the national economy, accounting for 15% of the US gross national product (4). Patients, especially in the non-elite classes, found encounters with medicine to be quite bureaucratic and rule-governed. Indeed, late 20th century discourse increasingly focused not upon individual interactions between healers and patients, but upon the regulation and control of the "health care system."

The "health care system" was responsible both for treating ill individuals and for maintaining a healthy population. Governments became involved in creating, regulating, and operating the health care system. The United States federal government paid a substantial share of the costs of running the structures involved in health maintenance. Government insurance programs reimbursed hospitals, doctors, and clinics for some expenses of treating elderly and poor patients. Government subsidies also supported hospitals, nursing and medical schools, and rural clinics. Semi-autonomous agencies like the National Institutes of Health (NIH) granted billions of dollars a year for health-related scientific research, including research into nutrition, aging, epidemiology, trauma care, biochemistry, physiology, viral reproduction, and even complementary and alternative medicine.

Health care was dispensed through a wide variety of institutions. Hospitals, and especially emergency rooms, became icons of medicine's heroic ability to alter the course of severe illness. Some hospitals developed national or regional reputations for treating particular illnesses, and drew patients from long distances. More routine sicknesses were treated in local urban or rural clinics, or a doctor's private practice, often located in a medical office building or storefront. Medical care for the aged and those considered physically or mentally handicapped was often provided through residential nursing facilities; otherwise, little health care was provided in homes. A substantial amount of health advice was dispensed over the telephone, usually by nurses, occasionally by doctors. The Internet also became an important source of medical information; here authorship was generally attributed to licensed doctors but was not offered as a substitute for professional consultations.

Notes

1. Porter, Roy. Blood and Guts: A Short History of Medicine. London: Penguin Books, 2002. p. 155

2. Sontag, Susan. Illness as Metaphor, and AIDS and its Metaphors. New York: Anchor Books, 1990. [Illness as Metaphor first published in 1978. AIDS and its Metaphors first published in 1989.]

3. See, for instance, the 2000 film Meet the Parents starring Ben Stiller as a male nurse.

4. Porter, 2002. p. 153

 

Bibliography for Further Reading

Many, many books describe aspects of medicine and treatment in the late 20th century United States. The four below provide useful insight into the meanings of illness and medicine.

Hahn, Robert A. Sickness and Healing: An Anthropological Perspective. New Haven: Yale University Press, 1995.
A good introduction to the study of contemporary medicine, both western and non-western, through a cultural lens. Deals with clinical and lived patient experiences, texts, and the broader impacts of biomedicine.

Morris, David B. Illness and Culture in the Postmodern Age. Berkeley: University of California Press, 1998.
Beginning by separating disease (objectively verifiable disorder) from illness (the subjective experience of being sick), Morris explores the ways that both illness and disease changed in the latter half of the 20th century. Also incorporates literary responses to this change.

Porter, Roy. Blood and Guts: A Short History of Medicine. London: Penguin Books, 2002.
This brief, eminently readable history isolates themes (doctors, disease, the body) in separate chapters until the concluding chapter. This last section, "Medicine in Modern Society," integrates medical activities to reveal a 20th century medicine dominated by economic forces and government influence.

Sontag, Susan. Illness as Metaphor, and AIDS and its Metaphors. New York: Anchor Books, 1990. [Illness as Metaphor first published in 1978. AIDS and its Metaphors first published in 1989.]
Two compact essays that have become classics of the "analysis as liberation" literature. Brings out the moral meanings that continue to inhere in medical diagnosis.