
Patients Ditch Doctors for AI Chatbots Amid Healthcare Frustrations

Summary

  • Retired lawyer turns to ChatGPT for specific protein intake advice, frustrated by doctor's generic response
  • About 1 in 6 adults and 1 in 4 adults under 30 used chatbots for health information at least monthly last year, a growing trend
  • Chatbots provide empathy and personalized analysis, but can also give dangerous or inaccurate advice

As of November 2025, a growing number of Americans are turning to AI chatbots for health advice, driven by frustrations with the medical system. A 79-year-old retired lawyer in Los Angeles, Wendy Goldberg, recently sought guidance from ChatGPT on her protein intake after her doctor provided only generic recommendations. Goldberg felt the chatbot gave her a more specific and helpful response.

This trend is not isolated. Last year, about one in six adults and a quarter of adults under 30 used chatbots to find health information at least once a month, a number that has likely increased. Patients describe using the AI tools to compensate for long wait times, high costs, and a perceived lack of empathy from their providers.

While some are enthusiastic about the technology, others approach it warily, aware that chatbots can make mistakes or provide misleading advice. Doctors have also noticed patients arriving with chatbot-generated treatment plans, sometimes attempting to persuade physicians to go along with the AI's recommendations. This has led to concerning incidents, such as a man being hospitalized after a chatbot suggested he consume sodium bromide.

Experts warn that the current state of AI technology is not yet advanced enough to reliably replace human medical expertise. However, the appeal of chatbots' 24/7 availability, low cost, and empathetic tone has led many patients to take a chance on the AI, even if it means going against their doctor's advice.

