The Guardian Australia

If a doctor barely knows who a patient is, the consequences can be profound

- AK Benjamin

As a clinical neuropsychologist I make mistakes, and I am not alone. Researchers interested in clinical decision-making estimate that across all medical fields diagnosis is wrong 10–15% of the time.

In many instances clinical errors are underpinned by one of a number of cognitive biases. For example, the “availability bias” favours more recent, readily available answers, irrespective of their accuracy; the “confirmation bias” fits information to a preconceived diagnosis rather than the converse. In the time-restricted milieu of emergency medicine, where I work on occasion, particular biases compound: the “commission bias”, a proclivity for action over inaction, increases the likelihood of “search satisfying” – ceasing to look for further information once the first plausible solution is found – which itself might be propelled by “diagnostic momentum”, where clinicians blindly continue existing courses of action instigated by (more “powerful”) others.

I can identify such failures in my own work, which may, counterintuitively, guarantee its relative quality: research on the “blind spot bias” indicates that doctors who describe themselves as excellent decision-makers perform relatively poorly on tests of diagnostic accuracy.

Intellectually, I understand that biases are part-bug, built into the brain’s learning preferences to short-circuit complexity in the face of rapid, evolutionarily advantageous decision-making. Equally, they can be caused by different types of contextual imperfections: the lack of statistics and mathematical reasoning in medical epistemology; the absence of heuristics to identify how sociocultural norms – ethnicity, gender, wealth, mental health – are integrated into or excluded from decision-making. But that intellectual knowledge does not translate to understanding what it’s like to be on the receiving end of error. And that lack of understanding might be the most profound bias of all.

When my daughter was a toddler she keeled over face first into her Rice Krispies one morning. A terrible dash to the nearest hospital followed, only for her to revive – wondering what had happened to her breakfast – in the triage queue at A&E. Having checked her over, the paediatrician decided she had probably experienced a one-off seizure caused by a lingering cold. It wasn’t unusual in his experience.

I went to my work on a neurosurgical ward at a national children’s hospital. That afternoon, when I described the morning’s events to a surgical colleague, he insisted I have my daughter reassessed, fearing the possibility that the seizure had been caused by an undiagnosed tumour. It wasn’t unusual in his experience. I rushed my daughter back to the original hospital, where the paediatrician refused to scan for a tumour.

We were caught between two radically different diagnoses. Both doctors spoke with utter conviction about what was commonplace for them. Perhaps the difference had been caused by a “framing effect”: the different emphasis I placed while retelling the story had helped to create the discrepant diagnoses. More likely, it was caused by “base rate neglect” – where the underlying incidence rates in the relevant population are ignored: the surgeon moved in the rarefied waters of a national hospital where tumours were run-of-the-mill; he never saw febrile seizures, which were relatively common in a local setting. Thankfully, the paediatrician turned out to be right – it was a one-off event, an unforgettably frightening day for our family but nothing more.

Specific cognitive biases can be more or less corrected for by retraining and environmental support, or in more wholesale fashion by replacement with AI systems that use machine learning to improve diagnostic accuracy. But the bias that neglects or foreshortens the experience of the patient is part of what Wittgenstein would call the background “picture” of medicine itself. The picture paints expert, highly specialist clinicians capable of making disengaged, illusion-free decisions about something, even when aspects of it may be fundamentally mysterious to them. In other words, the picture creates perspectival distortions of its own, which can have catastrophic consequences.

Some years ago, a 75-year-old lady, who had lost her husband nine months before, came to my clinic reporting minor episodes of forgetfulness. After my formal memory assessment, the findings were inconclusive. But considering my report alongside her MRI – which showed hypoperfusion (subtle reductions in the blood supply) of the frontal poles of her brain – her neurologist decided she had the early stages of Alzheimer’s.

At our next consultation six months later, the episodes she reported were no longer minor: she’d flooded the kitchen three times in a fortnight; she got lost in a neighbourhood she’d lived in for 30 years; when her phone rang she tried to answer the television remote. But the profile of her memory showed no signs of deterioration, and this time the MRI indicated that the hypoperfusion had disappeared altogether, leaving her with a typical-looking brain for someone halfway through their eighth decade. There was no indication of Alzheimer’s.

The neurologist’s original diagnosis was clearly wrong: transient changes in blood flow – probably relating to grief at the loss of her husband – had been mistaken for a neurodegenerative process. The neurologist admitted the error and corrected the diagnosis that same afternoon. But the woman’s condition continued to deteriorate over the coming months despite a clean bill of neurological health. She was passed on to psychiatry with no sign of a solution. Something about not being “seen” properly in the first instance, compounded by a gross diagnostic error, had intractable consequences for her mental health.

Neurological trauma can change everything for a patient in a moment. And yet as acute clinicians we never see patients either side of a small window of care, neglecting who they might have been before it and giving only short shrift to who they would become afterwards. Years after the fact, I found myself profoundly disturbed by how little I knew about some of my patients.

The framing effect of the medical picture – a failure to consider what it is like to be the person in front of us – means that clinical encounters are doomed to remain between strangers. Cognitive biases inevitably give way to emotional ones, restricting the possibility of empathy. The whole thing is somewhat preordained: clinicians are selected for their knowledge and problem-solving skills, not for their loving kindness. This is the starting point for my book The Case for Love, a series of case studies built around failures of imagination.

Imagination, a skill considered the province of storytelling, can broaden the perspective and enhance the picture by deepening our humanity. Properly applied by clinicians, it may help correct certain biases, changing our patients’ lives for the better.

AK Benjamin is the pseudonym of a clinical neuropsychologist, the author of The Case for Love: My Adventures in Other Minds

‘Imagination may help correct certain biases, changing our patients’ lives for the better.’ A waiting room at a surgery in Derbyshire. Photograph: Christopher Thomond/The Guardian
