Into the Unknown

Addressing uncertainty in medical education

The human mind has evolved a tendency to look for concrete facts and sharply delineated categories, such as true and false, or black and white. But the world around us is more often awash in nuance, ambiguity and gray areas. Understanding this is particularly important for physicians and patients facing the complexity and ambiguity of human health.

Richard Schwartzstein, the Ellen and Melvin Gordon Professor of Medical Education and director of The Academy at Harvard Medical School, addressed the theme of uncertainty as he kicked off the Academy’s 2017 Medical Education Day program of workshops and speakers on Oct. 24.

“As far as the laws of mathematics refer to reality, they are not certain, and as far as they are certain, they do not refer to reality,” Schwartzstein began, quoting Albert Einstein.

“If that’s true for math, it’s really true for medicine,” Schwartzstein said.

This year’s Med Ed Day program offered workshops on the basic science of uncertainty and the challenges of communicating ambiguity to patients.

The breadth of workshop categories reflected the importance of examining uncertainty from many different angles in medicine, the organizers and speakers said. The offerings ranged from the way students are admitted to medical school based on their ability to choose correct answers on multiple-choice tests or memorize facts, to the way clinicians and patients sometimes overinterpret relatively small but statistically significant effects, to the unwarranted faith that many in the health care field place in results from imaging tools and genetic screenings.

Schwartzstein noted that in his own clinical work he has tried to move toward asking trainees for their hypothesis instead of their diagnosis.

“I ask, ‘What do you think is going on?’” he said. “The term ‘diagnosis’ connotes a certainty that is not always warranted.”

One of the biggest challenges for physicians—and patients—comes in interpreting complex statistical information, Schwartzstein said, noting that simple differences in the way information is communicated can change the way patients react.

He cited a study showing that patients who were told they had a 10 percent chance of dying after a surgical procedure made different choices about their treatment than patients who were told they had a 90 percent chance of surviving, even though both statements reflect identical risks.

The Med Ed Day featured speaker, Steven Hatch, an associate professor and infectious disease doctor at the University of Massachusetts Medical School and author of Snowball in a Blizzard: A Physician’s Notes on Uncertainty in Medicine, discussed examples of the challenges that physicians face when presented with ambiguity and complex data.

He said that he began exploring the question of uncertainty in medicine deeply when controversy erupted over changes in guidelines on the frequency of mammography screening in women.

Because of the relative rarity of breast cancer, even a fairly reliable screening technique like mammography can produce many more false positive results than true positives. For example, when 21.5 million women are screened, there will be roughly 215,000 false positives but only 36,000 actual cases of breast cancer. The screening will also miss 360 women who do have the disease.
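
The arithmetic behind those figures is a classic base rate problem. Here is a minimal Python sketch that reproduces them; the 99 percent sensitivity and 99 percent specificity it uses are assumptions inferred from the numbers above, not values stated in the talk.

```python
# Back-of-the-envelope check of the screening numbers above.
screened = 21_500_000     # women screened
actual_cases = 36_000     # women who truly have breast cancer
sensitivity = 0.99        # assumed: the test catches ~99% of real cases
specificity = 0.99        # assumed: the test correctly clears ~99% of healthy women

true_positives = actual_cases * sensitivity        # ~35,640 cases caught
false_negatives = actual_cases - true_positives    # ~360 cases missed
healthy = screened - actual_cases                  # 21,464,000 women without cancer
false_positives = healthy * (1 - specificity)      # ~215,000 false alarms

# Positive predictive value: the chance a positive screen is a real case.
ppv = true_positives / (true_positives + false_positives)

print(f"False positives: {false_positives:,.0f}")       # ~214,640
print(f"Missed cases:    {false_negatives:,.0f}")       # ~360
print(f"Chance a positive screen is real: {ppv:.1%}")   # ~14%
```

Even with a test that is accurate 99 percent of the time in both directions, the rarity of the disease means only about one in seven positive screens reflects an actual cancer.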

Early guidelines suggested frequent testing over many years of a woman’s life, but the large number of false positives was overlooked. As a result, many women suffered the psychological stress of receiving a cancer diagnosis when they did not have cancer, in addition to the physical harm of undergoing biopsies and other unnecessary procedures, Hatch said.

Whether it is due to the culture of medicine or to shared, evolved errors of human cognition, Hatch said, “we place a high value on certainty without also recognizing that uncertainty is ubiquitous.”

Hatch said that some evolutionary psychologists have hypothesized that humans may have evolved not in spite of their cognitive errors but because of them.

He cited error management theory, which suggests that those of our ancestors who were more likely to “overdiagnose” risks in their environment, by running away from a stick that looked like a snake, for example, were more likely to survive and reproduce than those who made the fatal error of “underdiagnosing” the potential risk of a snake “disguised” as a stick.

These cognitive biases may no longer make sense in the context of complex genetic screenings or imaging diagnostics that provide more data than current science is able to parse.

“Uncertainty is everywhere, and nobody in this room is immune,” Hatch said. “My hope for my students is that they become less certain without becoming less knowledgeable.”