The Diagnosis

“Don’t you understand? I am telling you. You have stage five cancer. Who cares if your students have an exam? Get yourself a biopsy!”

Shaken, she stumbled out of the radiologist’s office. He was so certain, she thought. Stage five, he said. It must be so clear to him. It must be huge. Even when he showed her the screen, it said the same thing. Stage five.

This is the experience my mother recounted to me when she came back from the doctor’s office that night. “I thought you were checking your wrists?” I said, a little ashen. And she was. Except the radiologist noticed her throat and, I am sure, with all the good intentions an all-too-eager interloper could muster, simply took it upon himself to help. Such confidence in a misdiagnosis would be laughable if it weren’t so personal.

The depth of training and experience required to become an expert remains as tumultuous and necessary as ever. It is especially so given that we face an unprecedented crisis in epistemic authority. We have heard, time and again, about losses in cognitive ability due to over-reliance on AI. Cognitive atrophy, they call it. But that struggle is a personal one. Between you, yourself, and sometimes your educators.

Yet when a doctor exercises judgment through an algorithm, all the while inflating it with an air of unearned authority, the chain of human reasoning on which the mirage of medical authority rests becomes obscured. Such authority, we believe, is earned, not outsourced. What keeps the mirage sustainable (that is, the healthcare system in Iraq and anywhere else) is the trust that there is human reasoning we can examine and hold accountable. It is that trust which induces the thousands of patients who flock to it to suspend their own judgment and defer to an expert. An expert who, and this is important, has a human face and a human conscience.

The expert is the person who ought to take the evidence presented, interpret it in all the context and nuance the conversation can provide, and then answer for the results. This is how we have preserved a chain of human accountability. It is how the mirage remained sustainable. But who exactly answers for the black box?

A 2026 Lancet Digital Health report found that medical LLMs are more likely than human doctors to accept and regurgitate false information if it looks authoritative enough. In the study cited, when false information was planted in falsified discharge documents, the models repeated it nearly 47% of the time. In July 2025, Fortune reported that a patient was misdiagnosed with diabetes by Anima Health’s “Annie,” which then invited the patient to a fabricated hospital address for diabetes screening.

Consumer-facing AI applications have already generated misleading or harmful advice, serious enough to alarm physicians. The local Iraqi flashpoint for such anxieties came a month before the Annie incident: a young Kurdish man named Zakaria Suleiman passed away, and rumors that he had turned to ChatGPT for medical advice spread across the news and social media. What makes the crisis worth re-examining is the speed with which such news circulates, and the fact that it no longer gives us pause before we believe it to be true.

Turning to technology for medical advice is nothing new. The commercial tagline of the dot-com era was “Please Do Not Confuse Your Google Search With My Medical Degree.” It is, however, unprecedented territory when those to whom we defer judgment begin deferring to a faceless authority themselves. With agentic AI on the rise, the chain of accountability grows murkier still. Here, the crisis stops being a purely epistemological one. It is no longer a matter of borrowed authority from resolute-sounding AI. It is a crisis in which responsibility itself is diffused across a healthcare environment already moving towards “proactive, multi-agent collaborative systems,” so that the headline is no longer “AI makes mistakes” but that AI is an authority in practice. An authority we have no way of holding accountable.

- Zheen Salih Abdullah


