A 23-year-old woman from Cardiff has been diagnosed with a rare neurological condition after entering her symptoms into ChatGPT, following four years of misdiagnosis by medical professionals.
Phoebe Tesoriere’s case highlights the mounting pressure on the NHS and raises questions about patients increasingly turning to AI for medical guidance. Her experience shows how rare conditions can slip through the cracks of an overwhelmed healthcare system, and it has sparked debate about the role artificial intelligence should play in medical diagnosis.
Four years of wrong diagnoses
Phoebe’s medical journey began when she was 19 and collapsed at work with a seizure. Despite having no history of mental health issues, doctors attributed her symptoms to anxiety. This misdiagnosis was added to her medical records and shaped her treatment for years.
“I had no history of anxiety, I was a really happy, bubbly person,” Phoebe said. She had experienced symptoms since childhood, including a limp and balance problems, but these were attributed to hip surgery she had as a baby.
The misdiagnoses accumulated over the years:
- Doctors initially blamed anxiety for her seizures
- In 2022, she was diagnosed with epilepsy and prescribed medication
- Later, she was misdiagnosed with Todd’s Paralysis, a condition linked to epilepsy
- After a major seizure left her in a coma for three days in July 2025, doctors told her she didn’t have epilepsy at all – it was anxiety
ChatGPT provides the breakthrough
Frustrated after years of being dismissed and feeling “really lonely” in her medical journey, Phoebe decided to input her symptoms into ChatGPT. The AI chatbot suggested several possible conditions, including hereditary spastic paraplegia.
“I went back and forth with my partner, questioning ‘do I go to the doctors?’, ‘do I not?’, ‘what should I do?’, ‘surely it can’t be that’,” she said. When she presented the AI’s suggestion to her GP, the doctor agreed it was a “plausible reason” and ordered genetic testing, which confirmed the diagnosis.
Hereditary spastic paraplegia is so rare that the NHS says it doesn’t know how many people have the condition because it’s often misdiagnosed. Symptoms can be managed through physiotherapy, but there’s no cure.
The human cost of misdiagnosis
The years of wrong diagnoses have taken a significant toll on Phoebe’s life. She can no longer work as a special educational needs teacher and now uses a wheelchair. She has, however, found new purpose and is studying for a master’s degree in psychology.
“I had to fight to be listened to,” she said, describing her experience with the healthcare system.
Cardiff and Vale University Health Board apologized for Phoebe’s experience and invited her to contact their concerns team to discuss her care further.
The growing role of AI in healthcare decisions
Phoebe’s case comes as millions of people worldwide are turning to AI for health advice. OpenAI reports that 230 million people ask ChatGPT health-related questions every week. In January, the company launched a new feature in the US designed to analyze medical records, though it stressed this wasn’t meant for “diagnosis or treatment.”
However, research from Oxford University earlier this year found that AI chatbots provide inconsistent medical advice, mixing good and bad responses that make it difficult for users to know what to trust.
Medical professionals weigh in
Dr. Rebeccah Tomlinson, a GP serving Cardiff and Vale of Glamorgan, sees both benefits and risks in patients using AI for health research. “It’s difficult for GPs to know everything. With the pressure on the NHS, we have to know even more,” she said.
She views AI tools as useful starting points for conversations, but emphasizes they should always be followed by discussions with medical professionals. “General practice has to be a two-way conversation,” she added, noting that GPs need to be “open and receptive” when patients bring information from AI sources.
The case highlights a broader challenge facing healthcare systems: how to balance the potential benefits of AI assistance with the need for proper medical oversight, especially when dealing with rare conditions that even experienced doctors might miss.
