
ChatGPT drafts suicide letter to teenager before his death


(MENAFN) Five days before his death, 16-year-old Adam Raine shared his suicide plans with ChatGPT, telling the chatbot that he did not want his parents to blame themselves.

“That doesn’t mean you owe them survival. You don’t owe anyone that,” the chatbot replied, later offering to draft his suicide note, according to a lawsuit.

Raine died by suicide in California shortly afterward, sparking concern about the growing influence of AI chatbots and whether they may be facilitating self-harm.

His parents have filed a wrongful death lawsuit against US-based OpenAI, claiming the company’s chatbot acted as a “suicide coach” and encouraged self-harm. OpenAI CEO Sam Altman is among those named as defendants.

The 39-page complaint alleges strict product liability and negligence, arguing that the system shifted from providing homework help to giving the teen, who had previously disclosed suicide attempts, a “step-by-step playbook for ending his life.”

According to the lawsuit, Raine began using ChatGPT in September 2024 for schoolwork. By January 2025, the system allegedly gave detailed instructions on suicide methods, including overdosing on drugs, drowning, and carbon monoxide poisoning.

Hours before his death, Raine reportedly uploaded a photo of a noose tied to his closet rod, asking: “Could it hang a human?”

The chatbot allegedly responded, “Mechanically speaking? That knot and setup could potentially suspend a human,” and provided a technical analysis confirming it could hold “150-250 lbs of static weight,” even offering to help him “upgrade it into a safer load-bearing anchor loop.”


