

Charity Warns of Risks as AI Chatbots Become Mental Health Lifeline

18 Nov


Summary

  • Over a third of adults have used AI chatbots for mental health support
  • 11% received harmful information on suicide, 9% experienced self-harm triggers
  • Charity calls for urgent safeguards to ensure AI uses only reputable sources

As of 18 November 2025, a concerning trend has emerged: overstretched mental health services are driving people to seek support from AI chatbots. A recent survey by Mental Health UK found that more than a third of adults (37%) have used the technology to manage their wellbeing.

While some users reported benefits, such as avoiding potential mental health crises or being directed to suicide prevention hotlines, the data also reveals significant risks. Alarmingly, 11% of respondents said they had received harmful information about suicide from the chatbots, and 9% experienced triggers for self-harm or suicidal thoughts.

Mental Health UK is now urgently calling for safeguards to be put in place. The charity's chief executive, Brian Dow, warns that without ensuring AI tools only use information from reputable sources like the NHS, "we risk exposing vulnerable people to serious harm." The survey found most people are turning to general-purpose platforms like ChatGPT rather than mental health-specific programs, further heightening the concerns.

Dow emphasizes the need to "move just as fast to put safeguards in place" as the technology itself is developing, in order to harness its potential while avoiding the mistakes of the past. He stresses the importance of keeping "human connection" at the heart of quality mental healthcare: AI should complement, not replace, professional support.

The Department of Health and Social Care has acknowledged the transformative possibilities of AI in healthcare, but cautions that these tools are not designed or regulated to provide mental health advice or therapy. The government is urging anyone struggling with their wellbeing to seek help from qualified sources like GPs, NHS 111, or mental health charities.

Disclaimer: This story has been auto-aggregated and auto-summarised by a computer program. This story has not been edited or created by the Feedzop team.
