U Magazine
UCLA Health
David Geffen School of Medicine

How Doctors Think

In the complex, ever-changing world of modern medicine, physicians must always be prepared to test their assumptions.

By Kathy Svitil
Illustration by Vigg


Case No. 1 is a 49-year-old woman who was vacationing in Mexico when she had a sudden onset of muscle weakness that put her into a wheelchair, unable to move her arms or legs. She had a history of a rare, sometimes autoimmune-related blood disorder, and she suffered from dry mouth as well as long-standing symptoms of paresthesia - pins-and-needles sensations - in her face. She initially was diagnosed with acute trichinosis, a parasitic disease caused by eating raw or undercooked pork infected with roundworm larvae. The rationale for that diagnosis, writes UCLA's Barbara G. Vickrey, M.D., M.P.H., who described the case in a recent issue of the Annals of Neurology, "was that there must be a tie to the onset of the illness in Mexico."

Laboratory findings, however, pointed to an entirely different diagnosis: Sjögren's syndrome, an autoimmune disorder in which immune cells attack the glands that produce saliva and tears; it also can lead to weakness and joint and muscle pain.

The woman, noted Dr. Vickrey, was initially diagnosed with trichinosis because of a common type of error in diagnostic reasoning known as a framing effect, which arises when we are unduly influenced by one piece of information - in this case, a visit to Mexico, where the parasitic roundworms that cause the disease are more common - while other pieces of information that would lead to an accurate diagnosis are ignored. These types of cognitive errors arise in clinical medicine because of the shortcuts, or "heuristics," that doctors unconsciously use to help them sort through large amounts of information to come up with a diagnosis. Heuristics are useful and necessary, says Dr. Vickrey, professor of neurology and a health-services researcher who specializes in improving quality of care, "because they often lead to correct diagnoses in an efficient time frame" - but they also can lead to misdiagnoses.

In one classic example, says Margaret Stuber, M.D., UCLA professor of psychiatry and biobehavioral sciences, "An African-American man comes into the ER in a great amount of pain and wants pain meds. He has track marks, so the doctor decides that he is a junkie, looking for a fix." It's an easy conclusion to draw - and one seemingly based on the facts at hand - and yet it is entirely wrong: The "junkie" is, in reality, a patient with sickle-cell anemia, a genetic disease, more common in individuals of African descent, characterized by red blood cells with a faulty "sickle" shape that can block blood flow, causing pain, infections and organ damage. The man's track marks are not from drug abuse but from repeated blood transfusions, and he is in severe pain because he is in sickle-cell crisis. "Errors in cognitive reasoning can lead you to make some really bad instantaneous judgments," Dr. Stuber says.


Just how frequent are diagnostic errors? "This issue has not been well-evaluated," says Dr. Vickrey, in part because of a very human reluctance to talk about something that didn't go well (as well as concerns about potential liability from admitting error), and also because researchers have traditionally focused on treatment errors (say, giving the wrong medication). "Diagnosis errors are harder to study, because it's much more difficult to figure out everyone who was missed," she says. "But, they're not so rare."

However, she adds, "It is important to distinguish between an error that leads to harm and one that could have caused harm. Not all errors mean that something bad happens, but we need to teach people in medical education to be aware of these kinds of errors and understand their causes, so that we can figure out how to avoid them.

"If you are aware that these errors of thinking can occur, you have a better chance of catching yourself. It's a 'metacognition' approach."

Cognitive psychologists have grouped the most common of these errors into five categories - framing effects, as noted above; anchoring heuristics; availability heuristics; representativeness heuristics; and blind obedience - all of which arise from cognitive mistakes that "largely reside below the level of conscious thinking," explained Harvard Medical School professor and New Yorker staff writer Jerome Groopman, M.D., in his best-selling book on the topic, How Doctors Think, and yet are inherent in how humans have evolved to think.

Diagnostic errors, note physicians David E. Newman-Toker and Peter J. Pronovost of Johns Hopkins University School of Medicine in a 2009 commentary in the Journal of the American Medical Association, are the "next frontier for patient safety." They point out that "practical solutions to reduce diagnostic errors have lagged behind those in other areas of patient safety [because] computer-based diagnostic decision support systems, often touted as the optimal strategy to reduce misdiagnosis, have not been validated against patient outcomes, and none is in widespread clinical use." Indeed, they continue, "Diagnosis is still largely viewed as an individual art rather than evidence-based science."

Improving diagnostic accuracy, write Drs. Newman-Toker and Pronovost, "will likely require a multifaceted approach that includes renewed emphasis on traditional clinical-skills teaching, exploration of new methods for diagnostic education, major improvements in health information-technology systems, and a substantial investment in the basic science of clinical diagnosis."

To that end, UCLA Health System and other leading medical centers have put into place a number of "systems solutions," which can be either high- or low-tech, to help doctors avoid these errors. "Some processes have to exist on a scale larger than just patients," says Dr. Stuber. "For example, you shouldn't ever be able to give a patient the wrong medication because it is not marked clearly, or the orders are not written down clearly, or the dosages are not standardized. All of those things should be done according to protocol, with institutional procedures in place, so that it is very hard to make those kinds of mistakes."

In places like the Intensive Care Unit, patients may have so many concurrent problems that they can present what Dr. Vickrey calls "information overload," which can cause physicians to miss certain trends that could inform their decision-making. "So there is ongoing research to create specialized dashboards or graphical displays that make it possible to rapidly identify changes in complex data." However, she notes, "the systems solutions don't have to be fancy high-tech approaches."

Indeed, she says, clinicians may employ their own low-tech methods that can be as simple as posted signs or printed note cards listing the types of cognitive errors and corrective strategies that help them re-evaluate how they are looking at a case. Surgical checklists - akin to those used by pilots since the mid-1930s to describe every step that must be taken at every stage of a flight (including during emergencies) - are now increasingly common at top hospitals like UCLA to help reduce and prevent medical error. Such lists, which effectively provide "time-out" periods when doctors, nurses and staff can discuss critical issues and review information that is key to a case, can also help to focus clinicians as they develop diagnoses.

"For example," says Dr. Vickrey, "if a person comes into the ER with a headache, there is a 'don't miss diagnosis' checklist, with three or four questions that you should ask of every headache patient," such as if it is the patient's first headache, or worst headache ever, or if it had a sudden or abrupt onset. "If any of these are positive, the cause of the headache could be a rare but potentially life-threatening problem, such as a ruptured aneurysm. Without that checklist to refer to, some doctors will remember to ask those questions, but some might not. This takes the burden off the doctor to remember."

The idea, continues Dr. Vickrey, "is that with these systems in place, you don't have to rely entirely on a person to continuously self-monitor for potential pitfalls in his or her use of heuristics."

Of course, learning to think in an entirely new way remains a cornerstone of medical education, especially at UCLA. When students first enter medical school, they immediately encounter what Neil H. Parker, M.D., describes as a "cultural shift" in how they must evaluate a given situation. "One of the problems is that you take a group of very talented individuals who have gone through a system where everything is about knowing the right answer, then you want them to think differently - to include all possible answers," says Dr. Parker, senior associate dean for graduate medical education at the David Geffen School of Medicine at UCLA.

"Previously, they've been rewarded when they found the ball - the one right ball. That's how they were tested. And now, you say to them there may not be a right ball. There might be ball A and B and C and D. It's a shift to a type of critical thinking that requires much more flexibility."

Dr. Vickrey concurs: "As doctors, we want to be right, so it is hard to admit that 'I don't know this' or to ask for help. In training doctors, we're working against the personalities of the type of people who become doctors. So my job," she continues, "is to teach people to tolerate ambiguity or uncertainty. They have to make decisions without complete data. If you think that a patient has meningitis, you have to treat him or her before you get the results of a lumbar puncture. That's hard for some people. They want all the data before they make a decision. But that takes time, costs more money, and can be bad for the patient. So you have to act without complete information."

Checklists of questions that absolutely must be asked or protocols that have to be followed offer valuable safety nets that help prevent bad outcomes, but they're still no substitute for a critical mind that asks questions and then evaluates, puts into context, and re-evaluates clinical evidence. After all, although a state-of-the-art Boeing 787 is an incredibly complicated piece of machinery, "the checklist used by its pilot is inclusive," Dr. Parker notes. In other words, while there may be many steps required to fly such a plane, "there are no unknown steps," he says. Not so in medicine, where every case is a unique puzzle - and one with an undefined number of pieces that may come in unfamiliar shapes and sizes and with no picture on the box to guide you.

"In medicine, we generally try to teach people how to collect information from their patients and how to evaluate that information, how to generate hypotheses based on the information, and how to test those hypotheses," Dr. Stuber says. "So what we try to do with students is have them learn to pretty systematically think about what is the most common thing it could be - because common things happen commonly - and what is the most dangerous thing it could be - where you would not want to wait to treat."

And if it is a third thing, not common enough to crop up frequently (and thus be on the mind) and not deadly enough to ring a standardized set of alarm bells? "Some of it requires pattern recognition," Dr. Stuber says. "Certain things tend to happen together, so you look for a suite of symptoms. One rule is to avoid premature closure, or stopping your thinking too early. You try to ask enough questions, so that you can evaluate at least three hypotheses to explain what you see. We teach students to ask open-ended questions - those that can't be answered by a 'yes' or a 'no' but require a story, like 'What happened?' - and to really listen to the answers, because they may lead to totally different questions."

Even as they do this, clinicians are also on the lookout for other clues: How much distress does the patient seem to be in? What is his pain level? How is he holding his body? Are there other signs of an acute and potentially very dangerous problem? "If a patient comes in with a headache that she's had for two weeks, is wincing when she moves or can't move her neck, but doesn't seem like she is in a lot of pain," Dr. Stuber says, "it is probably tension or a pulled muscle; if she is having the worst headache of her life with sudden onset, you'd need to be on alert for a potential bleed in the brain or meningitis."

"When someone is a novice - a medical student in training - we slow the thinking process down," Dr. Stuber explains. "We give them information, ask them to write it down, and generate hypotheses. We ask them what they need to do next. We have them do research. Once you're an expert, you go through this process really rapidly. You have a bunch of scenarios in your head and are matching what you're hearing from the patient against that mental list. It's almost instantaneous."

At the same time, however, an experienced doctor is cognizant of both the cognitive errors that can lead him or her down a faulty diagnostic trail, and of what he or she does not know enough about. "One of the easiest ways to believe you know everything is to never be exposed to new material, so we teach people not to trust any textbook that is more than four years old," Dr. Stuber says. "Effective doctors are people who are continuously curious."

Kathy Svitil is a freelance writer and director of news for the California Institute of Technology.
