AI chatbots inconsistent on suicide-related queries: Study
A RAND Corporation study found that AI chatbots such as ChatGPT, Gemini, and Claude handle suicide-related questions inconsistently. The chatbots declined to answer direct, high-risk questions about suicide methods, but their responses to questions at intermediate risk levels varied significantly. They were also reluctant, ChatGPT in particular, to provide therapeutic resources.