
Charity Warns of Risks as AI Chatbots Become Mental Health Lifeline

Summary

  • Over a third of adults have used AI chatbots for mental health support
  • 11% received harmful information on suicide, 9% experienced self-harm triggers
  • Charity calls for urgent safeguards to ensure AI uses only reputable sources

As of November 18th, 2025, a concerning trend has emerged: overstretched mental health services are driving people to seek support from AI chatbots. A recent survey conducted by Mental Health UK found that more than a third of adults (37%) have used this technology to manage their wellbeing.

While some users reported benefits, such as avoiding potential mental health crises or being directed to suicide prevention hotlines, the data also reveals significant risks. Alarmingly, 11% of respondents said they had received harmful information about suicide from the chatbots, and 9% experienced triggers for self-harm or suicidal thoughts.

Mental Health UK is now urgently calling for safeguards to be put in place. The charity's chief executive, Brian Dow, warns that unless AI tools are required to draw only on information from reputable sources such as the NHS, "we risk exposing vulnerable people to serious harm." The survey found that most people are turning to general-purpose platforms like ChatGPT rather than mental health-specific programs, which heightens these concerns.

Dow emphasizes the need to "move just as fast to put safeguards in place" as the pace of AI development, in order to harness the technology's potential while avoiding the mistakes of the past. He stresses the importance of maintaining the "human connection" at the heart of quality mental healthcare, as AI should complement rather than replace professional support.

The Department of Health and Social Care has acknowledged the transformative possibilities of AI in healthcare, but cautions that these tools are not designed or regulated to provide mental health advice or therapy. The government is urging anyone struggling with their wellbeing to seek help from qualified sources like GPs, NHS 111, or mental health charities.

