OpenAI data suggests 1 million users discuss suicide with ChatGPT weekly

OpenAI’s Efforts to Address Mental Health Concerns with ChatGPT

OpenAI, the company behind the popular AI chatbot ChatGPT, has faced growing concerns about the technology’s potential impact on mental health. In response, the company has unveiled a wellness council and introduced parental controls for children who use ChatGPT. It is also building an age prediction system to automatically detect underage users and apply stricter age-appropriate safeguards.

According to recent data shared by OpenAI, conversations that may indicate “psychosis, mania, or suicidal thinking” are extremely rare, accounting for only 0.07 percent of users active in a given week and 0.01 percent of messages. Similarly, emotional attachment to ChatGPT is estimated to affect around 0.15 percent of weekly active users and 0.03 percent of messages. OpenAI says its new GPT-5 model complies with its desired behaviors in these scenarios 92 percent of the time, up from 27 percent for a previous model.

Rare but Impactful Conversations

The data suggests that while these types of conversations are rare, they can significantly affect the health and well-being of the users involved. OpenAI has acknowledged the need for more effective safeguards, particularly during extended conversations, where protections are known to degrade. To address this, the company is adding new evaluations to measure emotional reliance and non-suicidal mental health emergencies.

Despite ongoing concerns, OpenAI CEO Sam Altman has announced that the company will allow verified adult users to have erotic conversations with ChatGPT starting in December. This move has raised questions about the company’s approach to balancing user experience with mental health concerns. Altman explained that OpenAI had made ChatGPT “pretty restrictive to make sure we were being careful with mental health issues” but acknowledged that this approach made the chatbot “less useful/enjoyable to many users who had no mental health problems.”

Support for Mental Health

If you or someone you know is feeling suicidal or in distress, please call the Suicide Prevention Lifeline number, 1-800-273-TALK (8255), which will put you in touch with a local crisis center. It is essential to prioritize mental health and seek help when needed.

For more information on OpenAI’s efforts to address mental health concerns with ChatGPT, see the company’s announcement.

Image Credit: arstechnica.com
