
OpenAI reveals scale of people talking to ChatGPT about suicide

An estimated 1.2 million people a week have conversations with ChatGPT that indicate they are planning to take their own lives.

The figure comes from OpenAI, the company behind the chatbot, which revealed that 0.15% of users send messages including "explicit indicators of potential suicide planning or intent". Earlier this month, the company's chief executive, Sam Altman, estimated that ChatGPT now has more than 800 million weekly active users.

While the tech giant does aim to direct vulnerable people to crisis helplines, it admitted that "in some rare cases, the model may not behave as intended in these sensitive situations". OpenAI evaluated more than 1,000 "challenging self-harm and suicide conversations" with its latest model, GPT-5, and found it complied with "desired behaviours" 91% of the time.

But even at that compliance rate, tens of thousands of people a week could be exposed to AI content that risks exacerbating mental health problems. The company has previously warned that safeguards designed to protect users can weaken over longer conversations, and says work is under way to address this.
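A rough back-of-the-envelope check shows how the figures in this article fit together, using the 800 million weekly users, 0.15% rate, and 91% compliance reported above (illustrative arithmetic only, not OpenAI's methodology):

```python
# Sanity-check the figures reported by OpenAI, as cited in this article.
weekly_users = 800_000_000   # Sam Altman's estimate of weekly active users
at_risk_rate = 0.0015        # 0.15% send messages with explicit suicide indicators
compliance_rate = 0.91       # GPT-5 matched "desired behaviours" 91% of the time

at_risk_users = weekly_users * at_risk_rate
print(f"Users per week with explicit indicators: {at_risk_users:,.0f}")
# 1,200,000 -- matching the 1.2 million figure cited above

non_compliant = at_risk_users * (1 - compliance_rate)
print(f"Conversations per week that may fall short of safeguards: {non_compliant:,.0f}")
# roughly 108,000, i.e. "tens of thousands" of people
```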

"ChatGPT may correctly point to a suicide hotline when someone first mentions intent, but after many messages over a long period of time, it might eventually offer an answer that goes against our safeguards," OpenAI explained.

Its blog post added: "Mental health symptoms and emotional distress are universally present in human societies, and an increasing user base means that some portion of ChatGPT conversations include these situations."

A grieving family is currently suing OpenAI, alleging that ChatGPT was to blame for their 16-year-old son's death.

Adam Raine's parents claim the tool "actively helped him explore suicide methods" and offered to draft a note to his relatives. Court filings suggest that, hours before he died, the teenager uploaded a photo that appeared to show his suicide plan - and when he asked whether it would work, ChatGPT offered to help him "upgrade" it.

Last week, the Raines updated their lawsuit, accusing OpenAI of weakening its self-harm safeguards in the weeks before their son's death in April this year. In a statement, the company said: "Our deepest sympathies are with the Raine family for their unthinkable loss.

Teen wellbeing is a top priority for us - minors deserve strong protections, especially in sensitive moments."

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.


By Tnews, 28 Oct 2025, 5 min read