ChatGPT for Self-Diagnosis: AI Is Changing the Way We Answer Our Own Health Questions


For better and for worse, AI gives us a new way to take symptom-checking into our own hands. For people with chronic health conditions, it offers a new kind of tool.

Katie Sarvela was sitting in her bedroom in Nikiski, Alaska, on top of a moose-and-bear-themed bedspread, when she entered some of her earliest symptoms into ChatGPT.

The ones she remembers describing to the chatbot include half of her face feeling as if it were on fire, then sometimes going numb; her skin feeling wet when it wasn't; and night blindness.

ChatGPT's synopsis?

"Of course it gave me the 'I'm not a doctor, I can't diagnose you,'" Sarvela said. But then: multiple sclerosis. An autoimmune disease that attacks the central nervous system.

Now 32, Sarvela started experiencing MS symptoms in her early 20s. She gradually came to suspect the disease herself, but confirming it still required another MRI and a lumbar puncture. While the chatbot's answer wasn't a diagnosis, the way ChatGPT jumped to the right conclusion amazed both her and her neurologist, according to Sarvela.

ChatGPT is an AI-powered chatbot built on a large language model trained on vast amounts of internet text; rather than searching the web in real time, it generates answers to your questions in a conversational tone. It set off a profusion of generative AI tools throughout 2023, and the version based on the GPT-3.5 large language model is available to everyone for free. The way it can quickly synthesize information and personalize its results builds on the precedent set by "Dr. Google," researchers' term for the practice of looking up your symptoms online before seeing a doctor. More often we call it "self-diagnosing."

For people like Sarvela, who've lived for years with mysterious symptoms before getting a proper diagnosis, having a more personalized search to bounce ideas off of may help save precious time in a health care system where long wait times, medical gaslighting, potential biases in care, and communication gaps between doctor and patient lead to years of frustration.

But giving any tool or new technology (like the magic mirror and other AI gadgets that came out of this year's CES) a degree of power over your health carries risks. A big limitation of ChatGPT in particular is the chance that the information it presents is made up (the term used in AI circles is a "hallucination"), which could have dangerous consequences if you take it as medical advice without consulting a doctor. But according to Dr. Karim Hanna, chief of family medicine at Tampa General Hospital and program director of the family medicine residency program at the University of South Florida, there's no contest between ChatGPT and Google search when it comes to diagnostic power. He's teaching residents how to use ChatGPT as a tool, and though it won't replace the need for doctors, he thinks chatbots are something patients could be using too.

"Patients have been using Google for a long time," Hanna said. "Google is a search."

"This," he said, meaning ChatGPT, "is so much more than a search."

Is 'self-diagnosing' actually bad?
There's a list of caveats to keep in mind when you go down the rabbit hole of Googling a new pain, rash, symptom or condition you saw in a social media video. Or, now, popping symptoms into ChatGPT.

The first is that not all health information is created equal — there's a difference between information published by a primary medical source like Johns Hopkins and someone's YouTube channel, for example. Another is the possibility of developing "cyberchondria," anxiety fueled by unhelpful search results, like diagnosing yourself with a brain tumor when your head pain is more likely from dehydration or a cluster headache.

Arguably the biggest caveat is the risk of false reassurance or outright wrong information. You might overlook something serious because you searched online and concluded it's no big deal, without ever consulting a real doctor. Self-diagnosing a mental health condition carries even more limitations, given the inherent difficulty of translating mental processes or subjective experiences into a treatable health condition. And taking something as sensitive as medication information from ChatGPT, given that chatbots can hallucinate, could be particularly dangerous.

But all that being said, consulting Dr. Google (or ChatGPT) for general information isn't necessarily a bad thing, especially when you consider that being better informed about your health is largely a good thing — as long as you don't stop at a simple internet search. In fact, researchers in Europe found in 2017 that among people who reported searching online before a doctor's appointment, about half still went to see the doctor. And the more frequently people consulted the internet for specific complaints, the more likely they were to report feeling reassured.

A 2022 survey from PocketHealth, a medical imaging sharing platform, found that the people it calls "informed patients" get their health information from a variety of sources: doctors, the internet, articles and online communities. About 83% of these patients reported relying on their doctor, and roughly 74% reported relying on internet research. The survey was small and limited to PocketHealth customers, but it suggests multiple streams of information can coexist.

Lindsay Allen, a health economist and health services researcher with Northwestern University, said in an email that the internet "democratizes" medical information, but that it can also lead to anxiety and misinformation.

"Patients often decide whether to visit urgent care, the ER, or wait for a doctor based on online information," Allen said. "This self-triage can save time and reduce ER visits but risks misdiagnosis and underestimating serious conditions."

