
More Women Are Turning To ChatGPT For Medical Advice, But Is It A Dangerous Rabbit Hole?

"ChatGPT often produces nonsense, but it presents it so well it looks plausibleā€

When a hard mass appeared in her abdomen a few months before her 40th birthday, Flic Manning knew something was seriously wrong. What began as occasional pain and bloating had become more persistent, and she had difficulty eating.


After two decades managing the inflammatory condition Crohn’s disease, the author and radio host was used to gastrointestinal symptoms, but this felt different.

Her GP told her it was probably just Crohn’s but referred her to a gastroenterologist to be safe. That specialist dismissed her concerns. “He didn’t even bother to touch my abdomen,” says Manning. “He said it was nothing to worry about.”

Still worried, Manning turned to ChatGPT next. The chatbot suggested a possible twisted bowel and intussusception (where one part of the intestine slides into another), conditions that can be life-threatening. Concerned, she went in search of a third opinion, this time from her human gynaecologist.

“As soon as my gyno touched the hard area she was alarmed,” says Manning. “She said I need a CT, like, today!” Sure enough, the scan confirmed the chatbot’s diagnosis. Manning had both intussusception and a twisted bowel, requiring multiple surgeries.


“[ChatGPT] even told me the most likely course of treatment, which also turned out to be correct,” she says. Manning is one of 400 million weekly users who have embraced ChatGPT, OpenAI’s generative artificial intelligence model.

Launched in 2022, it’s used globally to write emails and create presentations, résumés and letters. But it also answers questions, compiles research and creates art – and, increasingly, people are relying on it for medical advice.

It’s not surprising; after all, we’ve been turning to Dr Google for decades. The difference is that with AI models like ChatGPT, the experience more closely resembles what you might get from a caring doctor, complete with sympathetic preamble (“I’m sorry you’re not feeling well”) before suggested solutions – though you may also get the disclaimer: “I’m not a doctor so I can’t give medical advice.”

There are many reasons people are going online to seek prompt medical advice, including delays getting to see a GP, the increasing cost when you do (up by just over 4 per cent in the past year), fewer doctors who bulk-bill, and years-long waitlists for specialists.


Skyrocketing living costs, an ageing population and rising chronic disease rates are overwhelming an already strained system. On top of that, there’s a dire shortage of full-time GPs – a shortfall predicted to double by 2033 – and surveys show more than 70 per cent of GPs are experiencing burnout.

Add the fact that women bear the brunt of healthcare inequities – from medical gaslighting to underfunded research – and it’s easy to see the appeal of opening a new Chrome window in your own home.

“Globally, women spend 25 per cent more of their life in poor health compared to men,” says Dr Ariella Heffernan-Marks, CEO and founder of Ovum AI, who views AI as a promising remedy to the diagnostic and treatment gaps women face.


“It takes five years on average for any Australian woman to be diagnosed with a general condition, and it’s seven to 12 years for endometriosis. Women are coming in saying, ‘I’ve got extreme pain,’ and they’re being told, ‘You’re anxious. You’re just over-emotional.’ On top of that, women are experiencing a gender pay gap, so they need to see healthcare providers more often but [can’t] afford it.”

So, are we headed for an AI healthcare revolution, or falling down a dangerous rabbit hole? The experts are divided: it’s either a bit of both or too early to say.

The good news is that where Dr Google is hit and miss, ChatGPT is an excellent young medico, having passed the US medical licensing exam with flying colours, according to researchers at Harvard. Another study found it had a 92 per cent diagnostic accuracy rate. However, studies also show that when interpreting MRIs, ChatGPT at times performed worse than radiologists, and that one-third of the cancer treatment plans it devised contained errors.

There’s also the way ChatGPT presents the information it scrapes from the internet. On a first read, it sounds pretty good, but look closer and it’s often gibberish. Dr Piers Howe, an associate professor at the University of Melbourne, says AI “often produces nonsense, but it presents it so well it looks plausible, and that’s the real danger”.


That also concerns Dr Grant Blashki, a GP, associate professor at the University of Melbourne, and editor of Artificial Intelligence, For Better or Worse. He’s excited by the potential AI has to improve the patient experience, but stresses the need to always check with a doctor. “Patients can be overconfident [with AI advice] and then delay getting care,” he says.

Unfortunately, AI is also perpetuating the gender bias women face in traditional healthcare. “It’s assessing global data sets which typically contain gender bias, so it’s providing either gender or culturally biased responses,” says Heffernan-Marks.

But one place AI is having an undeniably positive impact is in the hands of physicians, where it can potentially streamline diagnosis, treatment, testing and admin. A UK study found one in five doctors are already using it for admin or help with diagnosis.

“It’s extraordinary, the pace of uptake,” says Blashki, who uses it to augment care, not replace it, “but doctors need to remember the buck stops with them.”


Heffernan-Marks is realistic about the limits. “AI is not diagnostic and … should be used to help you gather and understand information,” she says. But it does offer something valuable to women who’ve been brushed off: “a non-judgemental space” in which to prepare to advocate for themselves.

In our struggling healthcare system, perhaps AI can alleviate ills, rather than cure them. As the bot itself warns, it’s user beware. Like any technology, the risk is not in the machine, but in how we use it.
