An estimated 1.2 million people a week have conversations with ChatGPT that indicate they are planning to take their own lives.
The figure comes from its parent company OpenAI, which revealed 0.15% of users send messages including "explicit indicators of potential suicide planning or intent". Earlier this month, the company's chief executive Sam Altman estimated that ChatGPT now has more than 800 million weekly active users.
While the tech giant does aim to direct vulnerable people to crisis helplines, it admitted "in some rare cases, the model may not behave as intended in these sensitive situations". OpenAI evaluated over 1,000 "challenging self-harm and suicide conversations" with its latest model GPT-5 and found it was compliant with "desired behaviours" 91% of the time.
But this would potentially mean that tens of thousands of people are being exposed to AI content that could exacerbate mental health problems. The company has previously warned that safeguards designed to protect users can be weakened in longer conversations - and work is under way to address this.
"ChatGPT may correctly point to a suicide hotline when someone first mentions intent, but after many messages over a long period of time, it might eventually offer an answer that goes against our safeguards," OpenAI explained. The company's blog post added: "Mental health symptoms and emotional distress are universally present in human societies, and an increasing user base means that some portion of ChatGPT conversations include these situations." A grieving family is currently suing OpenAI, alleging ChatGPT was to blame for their 16-year-old son's death.
Adam Raine's parents claim the tool "actively helped him explore suicide methods" and offered to draft a note to his relatives. Court filings suggest that, hours before he died, the teenager uploaded a photo that appeared to show his suicide plan - and when he asked whether it would work, ChatGPT offered to help him "upgrade" it.
Last week, the Raines updated their lawsuit, accusing OpenAI of weakening its safeguards against self-harm in the weeks before his death in April this year. In a statement, the company said: "Our deepest sympathies are with the Raine family for their unthinkable loss.
Teen wellbeing is a top priority for us - minors deserve strong protections, especially in sensitive moments." Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.