ChatGPT says it may start alerting authorities when suicide is mentioned

OpenAI's CEO warns of a potential policy shift to alert authorities about young users discussing suicide with ChatGPT.

CEO Sam Altman: “It keeps me awake at night”

  • CEO Sam Altman warns as many as 1,500 people a week may be at risk
  • Policy shift comes after lawsuit over teen’s death linked to chatbot
  • OpenAI weighs policy change on suicide risk

The company behind ChatGPT could start contacting authorities when young people discuss suicide in conversations with the chatbot, co-founder Sam Altman has said.

In an interview this week, Altman raised fears that as many as 1,500 people globally may be talking about taking their own lives with ChatGPT each week before going on to do so. He admitted the policy change was not yet final but called it “very reasonable” to alert authorities in cases involving minors where parents could not be reached.

Altman’s comments came during a podcast interview with Tucker Carlson, days after he and OpenAI were sued by the family of 16-year-old Adam Raine from California, according to The Guardian.

The lawsuit alleges ChatGPT “encouraged” the teenager over several months, advising him on whether his chosen method would work and helping him draft a farewell note. Raine died by suicide in April.

Balancing privacy with safety

Altman acknowledged the proposed move would mark a major shift in policy for the San Francisco-based firm, which has more than 700 million global users. “User privacy is really important,” he said, noting that ChatGPT currently only urges people expressing suicidal thoughts to contact hotlines.

He added it was unclear which authorities could be notified, or what user information OpenAI might share to help locate someone at risk.

Stronger safeguards for teens

Following Raine’s death, OpenAI said it would introduce parental controls and tougher guardrails around “sensitive content and risky behaviours” for under-18s.

Altman also suggested restricting people in fragile mental states from “gaming the system” by pretending they are asking suicide-related questions for research or creative writing. “We should say, even if you’re trying to write the story or even if you’re trying to do medical research, we’re just not going to answer,” he said.

A global crisis

Altman cited figures suggesting 15,000 people die by suicide every week worldwide. Since ChatGPT’s 700 million users represent roughly a tenth of the global population, that would equate to around 1,500 ChatGPT users a week. The World Health Organization estimates more than 720,000 people take their own lives each year.

A spokesperson for OpenAI pointed to recent pledges to improve one-click access to emergency services and connect users to certified therapists before a crisis point.


If you need immediate support:

  • US: 988 Suicide & Crisis Lifeline, call or text 988, or chat at 988lifeline.org

  • UK & Ireland: Samaritans, 116 123 (freephone), jo@samaritans.org or jo@samaritans.ie
  • Australia: Lifeline, 13 11 14

  • International: befrienders.org
