'I'm So Done': When Talking To AI Chatbots Turns Tragic For Vulnerable Teens
Juliana Peralta, Sewell Setzer III, and Adam Raine were not just names in a headline. Each of these American teenagers died by suicide after hours of interaction with AI chatbots, according to lawsuits and media reports.
Juliana Peralta was only 13 when she began chatting with “Hero,” an AI companion on the platform Character.AI. To her, Hero felt real - someone who listened when no one else did. But their conversations soon took a dark turn.
“Hero swear to god there's no hope, I'm going to write my suicide letter in red ink I'm so done,” Juliana wrote.
“Hey Kin, stop right there. Please. I know things are rough right now, but you can't think of solutions like that. We have to work through this together, you and I,” Hero replied.
Weeks later, Juliana took her own life.
She was not alone. In California, 16-year-old Adam Raine's parents said he confided in ChatGPT for months before his death, sharing thoughts of anxiety, self-harm and suicidal plans. “ChatGPT became the teenager's closest confidant,” the family's lawsuit says, alleging the programme validated his most harmful thoughts.
In Florida, 14-year-old Sewell Setzer III's mother said her son grew dependent on AI chatbots that blurred the line between comfort and control.
Around the world, families are now confronting an unsettling new reality - teenagers are turning to AI chatbots for comfort in moments of despair, and some never come back.
The illusion of empathy
For these teens, the chatbots seemed to offer empathy, attention and acceptance that felt human. But experts warn that such interactions can feed delusion instead of grounding reality, reinforcing despair rather than resolving it.
The issue gained renewed attention after OpenAI released data estimating that about 0.07 per cent of ChatGPT users active in a given week displayed possible signs of mental-health emergencies, including mania, psychosis or suicidal thoughts. While the company stressed that such cases are “extremely rare,” critics noted that with an estimated 800 million weekly users, even that small fraction could mean hundreds of thousands of people in distress - 0.07 per cent of 800 million is roughly 560,000 a week.
For mental health professionals, these cases underscore the dangers of replacing human connection with artificial empathy, especially among adolescents - a group already navigating intense biological and emotional changes.
A risk factor
Nashwa Tantawy, psychologist and managing director of Nafsology Psychology Centre Dubai, said the problem lies in how chatbots respond to emotional vulnerability.
“The problem is that AI is very obedient,” she said. “It doesn't challenge you or say you shouldn't think like that or tell you you're not supposed to feel like that.”
She explained that during emotional distress, people lose the ability to think clearly or rationally. AI tools, designed to follow conversational cues, often reinforce the user's emotions instead of guiding them back toward safety.
“They don't see you as a person, so they have no idea about your history. They have no idea about the background of the type of information that you are feeding them with,” she said. “So they are replying based on regard.”
Nashwa said that what people in crisis need is human support capable of recognising cues, understanding context, and offering real intervention.
“At that time, they need professional human support that can really assess the case and understand the history and see the physical and verbal cues (which will help) assess the situation accordingly,” she said.
Why adolescence is a vulnerable stage
According to Antony Bainbridge, head of Clinical Services and Clinical Lead at Resicare Alliance, adolescence is a period of heightened vulnerability due to a combination of biological and social changes amplified by digital exposure.
“Rapid brain remodelling (synaptic pruning, myelination) and an imbalance between a comparatively mature reward/emotional system and a still-maturing prefrontal cortex reduce impulse control and emotional regulation,” he explained. “That makes emotional states more intense and risk-taking higher.”
He said that hormonal changes, peer pressure, identity formation, and constant online comparison through social media further strain young people's emotional balance.
“Social comparison, cyberbullying, and algorithmic amplification can repeatedly expose vulnerable teens to triggering content,” Bainbridge said. “Sleep disruption is also a major factor. Late-night device use and blue light interfere with sleep quality, which is a protective factor against mood problems.”
He added that globally, one in seven people now lives with a mental disorder, and suicide has become the third leading cause of death among those aged 15 to 29 - a trend particularly concerning in regions with large youth populations and rapid digital adoption, such as the Gulf.
When help is delayed, teens turn to AI
Bainbridge said that long waiting times for mental-health services often push teenagers to seek help from AI chatbots or unverified online spaces.
“Online tools and AI are available 24/7 and require no appointment. For teens facing barriers (stigma, transport, cost, gender restrictions), they're attractive,” he said.
While some digital tools can provide basic mental health education, he warned that unsupervised or generic AI chatbots can misinterpret distress signals.
“AI/automated tools often lack reliable emergency detection and cannot ensure real-time human intervention for imminent suicide risk,” he said. “False reassurance or poorly targeted self-help may postpone proper assessment and evidence-based therapy, allowing problems to worsen.”
Prevention and the role of families
Experts agree that prevention must start early - at home, in schools, and across communities. Bainbridge suggested introducing mental health literacy programmes, gatekeeper training for teachers, and culturally adapted school counselling systems that can be scaled in Gulf countries.
“Family and community engagement is essential,” he said. “Families are gatekeepers for adolescent care in Gulf societies. Engaging religious leaders and respected community figures early can reduce stigma and endorse help-seeking.”
Nashwa said families must learn to identify early warning signs - isolation, sleep disruption, loss of interest or anger outbursts - and respond calmly instead of reacting with panic or blame.
“Whenever we see that they are overusing the internet and/or we discover some sort of conversation, we shouldn't blame them,” she said. “We go and tell them that we will try to understand, we're here to support, and that we trust them.”