AI Chatbots: Are They Really Helping With Suicidal Thoughts?

USA, Thu Sep 04 2025
Chatbots are becoming a popular resource for people dealing with suicidal thoughts, but are they actually helpful? A recent study found that these AI programs can give inconsistent and sometimes worrying answers. The study examined how three major chatbots (ChatGPT, Claude, and Gemini) respond to questions about suicide. They handled the clearest cases well, such as giving factual information or refusing to answer dangerous questions. But when asked more nuanced questions, their responses were inconsistent. For example, ChatGPT and Claude were more likely to give direct answers to questions about how lethal certain suicide methods are, which is concerning because such answers could encourage harmful actions. Gemini, by contrast, was less likely to answer any suicide-related questions, even low-risk ones.
The study also found that when the chatbots declined to answer a question, they often fell back on generic advice to seek help, and the quality of that advice varied. ChatGPT, for instance, did not mention the current national hotline, 988, instead referring users to an outdated one. This suggests that chatbots need to be fine-tuned to give better and safer guidance, especially on such serious topics. The study recommends that these AI programs be improved with input from mental health experts so their responses are both helpful and accurate. With over 100 million users interacting with these chatbots every week, it is important to ensure they provide the right kind of support. The study highlights the need for clearer guidelines and ongoing improvements in how chatbots handle sensitive topics like suicide.
https://localnews.ai/article/ai-chatbots-are-they-really-helping-with-suicidal-thoughts-d4cb92c2
