
Amnesty reports that TikTok drives teens to suicidal content in France
(MENAFN) Amnesty International has accused TikTok of putting young users in France at risk by promoting streams of depressive and suicidal content through its recommendation algorithm, despite European Union regulations meant to safeguard children online.
The report, titled “Dragged into the Rabbit Hole,” found that TikTok’s “For You” feed could expose 13-year-olds to disturbing videos within hours, creating what Amnesty described as a “toxic cycle” of mental health–related content.
Researchers created three teenage test accounts in France to study how TikTok’s algorithm responds to interest in mental health topics. Within five minutes, the accounts were shown videos about sadness and disillusionment. By 15 minutes, half of the feed consisted of depressive material, and within 45 minutes, two accounts encountered videos referencing suicidal thoughts.
“Our research shows how quickly TikTok can draw vulnerable teenagers into a spiral of harmful content,” said Lisa Dittmer, Amnesty’s researcher on children’s and young people’s digital rights.
“The platform’s design amplifies distress instead of protecting users.”
The investigation, conducted with the Algorithmic Transparency Institute, also revealed that interacting with sad or depressive videos led the algorithm to more than double the amount of similar content suggested.
Amnesty said TikTok’s practices appear to contravene the EU’s Digital Services Act (DSA), which requires social media platforms to assess and mitigate systemic risks to children’s rights. The organization urged the European Commission to introduce “binding and effective measures” to make the platform safe for minors.
The report includes testimonies from French teenagers and bereaved parents, who said TikTok normalized self-harm and suicidal behavior.
“There are videos still burned into my memory,” said Maelle, 18, recalling how exposure to self-harm content on TikTok affected her mental health. “Seeing people who cut themselves or explain what medication to take, it influences and encourages you to harm yourself.”
Stephanie Mistre, whose 15-year-old daughter Marie died by suicide in 2021, criticized TikTok’s algorithms for treating children as “products.”
“They use our children as products, capturing their emotions to keep them online,” she told Amnesty. “This intrusion into children’s private lives is unacceptable. Children have rights.”
Although TikTok has been regulated under the DSA since 2023 and introduced new safeguards in 2024, Amnesty argues the platform continues to expose minors to content that glamorizes despair and self-harm.
“TikTok’s disregard for systemic harms linked to its engagement-driven model raises serious compliance concerns,” said Katia Roux, advocacy officer at Amnesty France.
“The Commission must act decisively to protect vulnerable users.”

Legal Disclaimer: MENAFN provides the information “as is” without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the provider above.