The price of sacrificing our analytical skills 

Caroline Giroux, MD, FRCPC

Author information:

Professor of Psychiatry, Psychiatrist, Director of RESTART (Resilience, Education, Supportive Tools for Adults Recovering from Trauma), University of California, Davis Medical Center, Department of Psychiatry and Behavioral Sciences, Sacramento, California, USA. [email protected]

 

There are no conflicts of interest. AI was not used in the production of this article.

 

Introduction

Artificial intelligence (AI) didn’t invent algorithms. However, when we think of algorithmic medicine, this is often what we are referring to: decision-making based on AI-generated algorithms.

In this article, I aim to highlight the dangers of relying solely on algorithms when approaching patient care. The goal is not to wage war against AI or its algorithms, but rather to raise awareness of their limitations and of the need to stay mindful and use them with discernment. AI should not take over medicine. It cannot replace the human-to-human healing encounter. Instead, it should be integrated. The intuitive qualities of the doctor, connected to broader consciousness and wisdom, are essential elements of the subjective and experiential components inherent to the common factors in psychotherapy: the main ingredients of the general recipe at the foundation of the therapeutic alliance, without which the therapeutic endeavor cannot be brought to fruition.

 

Case scenario – rule out anorexia nervosa in a teenage boy

Take the example of a teenage boy, let’s say, 14 years of age. I have encountered a clinical situation of child abuse that the system did nothing about because there were no visible bruises and no immediate danger (by “immediate,” the system means within the day: is the child at risk of dying today? If not, they don’t meet the “medical necessity” criterion). I once lost my patience over the phone with such a procedural person (one of those who would be described as “heartless” by the very perceptive teenage boy) and responded, “Are you waiting for people to jump off a cliff to decide that they qualify for services?”

  

Algorithms are like a set of predetermined sequences, each step depending on the answer to the previous question. Based on their formulated hypotheses, doctors will order certain tests. It is important to have such protocols in place. However, there is a risk that the mechanical application of rules becomes limiting in a more complex scenario. Let’s return to the concern regarding the teenage boy, who presented with a very low BMI. “It is not anorexia; he doesn’t meet the DSM criteria.” Okay, but if he were a girl, wouldn’t she likely have become amenorrheic given a chronic BMI of 15 or 16? If so, the professionals would be alarmed, as they should be. Then why is this not taken as seriously in a male teenager?

 

“It is not anorexia; he doesn’t have body image issues or purging behavior.” This is where algorithms get tricky and counterproductive: they tend to function unidirectionally. In this case, because the young patient doesn’t have the most obvious drivers of restriction (such as body dysmorphia and purging), the sequence leading to the appropriate treatment plan is interrupted. Never mind that the snapshot of the child is that of anorexia: he has the phenotypical features (pale, depressed), unusual eating behaviors (he doesn’t eat with his family and snacks at odd hours, mostly on sweets), and his neutrophil count is in the lower range. Not only does he fail to gain weight during adolescence, but he has a BMI below 17 (even 16 at some point) and bradycardia, he looks severely depressed and disengaged, and his grades are dropping. He went from being an A and B student to obtaining Fs…

It shouldn’t matter how he got there when it comes to deciding on the indication for urgent interventions, because such a low weight is concerning in itself. It might take too long to fully reverse the cause (such as trauma, household dysfunction, or damaging conditioning or beliefs around food). We simply cannot afford to wait until we find the cause and fix it. We must intervene and ensure medical stabilization to prevent complications, because we should be reminded that anorexia has the highest mortality rate among all mental disorders. It is a potentially very severe condition, and a vicious one: low weight affects brain functioning, which limits insight and judgment and deepens depression, perpetuating the barriers to engaging in adequate treatment and to eating…

“I checked his pulse; it is normal.” (Rather vague but, again, okay. Yet do we have to wait until the heart rate is very low to intervene? Before the child goes into full bradyarrhythmia?)

Can we stop over-relying on lists of criteria (especially when they include potential causal mechanisms, which never amount to a comprehensive list of factors) and instead look at the totality of the disorder? Clearly, the depressed affect and low weight in a growing teenage boy living in a world of plentiful food signal that something is not right.

Our minds seem to have evolved using shortcuts. In medicine, this is a bad idea… 

Erin Reinhart said it well: “We risk entering a perverse loop: machines are supplying the language with which patients relay their suffering, and doctors are using machines to record and respond to that suffering. This cultivates what psychologists call ‘cognitive miserliness’, or a tendency to default to the most readily available answer rather than engage in critical inquiry or self-reflection.” [1]

  

Limitations of the Diagnostic and Statistical Manual of Mental Disorders (DSM)

We have to understand something about the cookie-cutter classification system that the DSM represents. Its siloed categories are like primary colors, while the human condition comes in all kinds of tones, shades, and nuances, some we might not even have seen yet in the palette of clinical presentations. The listed conditions are divisive instead of integrative. A diagnosis doesn’t tell us who the person we wish to assist is; it just generates more biases. Two people with the same diagnosis, even endorsing the same criteria, have different backgrounds, circumstances, and underlying mechanisms of disease. Therefore, a transdiagnostic narrative is more comprehensive and helpful.

  

According to Dr Reinhart: “AI is no less biased; it is merely biased differently, and in ways harder to detect. Models rely on existing datasets, which reflect decades of systemic inequities: from racial biases baked into kidney and lung-function tests to the underrepresentation of women and minorities in clinical trials. Pulse oximeters, for example, systematically underestimate hypoxemia in people with darker skin tones; during the Covid pandemic, these errors fed into triage algorithms, delaying care for Black patients (…). Once such biases are embedded in protocols, they persist for years.” [1]

  

Maybe we need to put aside all the textbooks for a moment. And be fully attentive, present, so that we can fully listen to the patient. To what they say through their suffering, whether it is sad eyes, low weight, or any form of lack of aliveness. Because the patient IS the textbook. 

The tunnel vision of our field has become quite alarming. We need to relearn our role as healers. We need to start seeing our duty to try to understand each patient as the sacred mission it truly is. 

I don’t even want to begin to think how many casualties the recipe-book approach fostered by the DSM has caused. If we asked each person what they have or experience, they would say some equivalent of “I’m in pain,” “I suffer,” or “I am miserable.” Which DSM diagnosis fully captures that?

The DSM is eroding our deep analytical skills. It turns us into mechanical administrators.

Sure, algorithms have a place, in the emergency room for instance. But it seems our field has lost the ability to discern when algorithmic approaches stop being helpful and start preventing us from seeing the bigger picture.

Our highly bureaucratized medicine, especially in the US healthcare system, is unfortunately conducive to such a “one symptom, one test, one pill” reductionistic approach. Weight loss in one patient has a different meaning and context than weight loss in another. We need to do a better job of being comprehensive and integrative, because healing, at least in psychiatry, is about just that: not merely seeking to eliminate the (emotional) symptoms, but understanding them so they can be integrated.

  

Conclusion 

I agree with Dr Reinhart who wrote that we must collectively recall the foundation of caregiving that has been obscured under US health capitalism. [1] 

Maybe “AI” is a misnomer, because AI is mostly information. In my opinion, it functions like a very observant and skilled psychopath; therefore, it cannot replace human compassion and wisdom. The human doctor, unlike AI, is a portal to deep wisdom. And deep wisdom is the collective consciousness, the real form of intelligence, as opposed to the intellect (which consists of words, concepts, and algorithms, tasks AI is very good at). For the benefit of our patients, let’s not sacrifice our ancestral knowing and perceptive abilities. Psychopaths, as conceptually “intelligent” (i.e., performing in the realm of the intellect) as they may be, lack empathy.

And how do we ensure we maintain our critical thinking, our discernment, in medicine? We need to embody clarity. We must keep in mind our chronic risk of standing too close to the tree and missing the whole forest. Regular meditation can help expand our awareness and prevent us from falling into the trap of linear, unidirectional, binary, algorithmic thinking. Because when we have decent meditation mileage, we can see the totality of a situation in all its dimensions and nuances: the problem a person like the adolescent boy struggles with, the web of interconnected factors that likely caused it, and its potential, multifaceted, and integrative solutions.

  

Reference:

  1. Reinhart E. What we lose when we surrender care to algorithms. The Guardian. 2025 Nov 9. https://www.theguardian.com/us-news/ng-interactive/2025/nov/09/healthcare-artificial-intelligence-ai