 TikTok Floods French Teens with Harmful Content, Amnesty Reveals
(MENAFN) A recent investigation by Amnesty International reveals that TikTok’s algorithm in France is aggressively pushing young users into harmful content related to depression and suicide, contravening EU regulations aimed at safeguarding children online.
The report, titled “Dragged into the Rabbit Hole,” highlights how the platform’s “For You” feed rapidly inundates 13-year-olds with distressing videos, creating what Amnesty terms a “toxic cycle” of mental health-related content.
Researchers from Amnesty set up three teenage test profiles in France to observe how TikTok’s recommendation engine responds when a user shows interest in mental health topics. Within five minutes, the accounts encountered videos centered on sadness and hopelessness. By the 15-minute mark, depressive videos accounted for half the feed, and within 45 minutes two of the accounts were shown content referencing suicidal thoughts.
“Our research shows how quickly TikTok can draw vulnerable teenagers into a spiral of harmful content,” said Lisa Dittmer, Amnesty’s specialist in children’s and young people’s digital rights. “The platform’s design amplifies distress instead of protecting users.”
In collaboration with the Algorithmic Transparency Institute, the study also found that engaging with sad or depressive videos prompted TikTok’s algorithm to more than double its recommendations of similar content, intensifying exposure.
Amnesty asserts TikTok’s practices violate the EU’s Digital Services Act (DSA), which mandates that social media companies assess and mitigate systemic risks to children’s rights. The report calls on the European Commission to enforce “binding and effective measures” to safeguard minors on the platform.
The report incorporates powerful testimonials from French teenagers and grieving families who say TikTok normalizes self-harm and suicidal ideation.
“There are videos still burned into my memory,” said Maelle, 18, who recounted how exposure to self-harm content on TikTok worsened her mental health. “Seeing people who cut themselves or explain what medication to take, it influences and encourages you to harm yourself.”
Stephanie Mistre, whose 15-year-old daughter Marie died by suicide in 2021, condemned TikTok’s algorithms for turning children into “products.”
“They use our children as products, capturing their emotions to keep them online,” she told Amnesty. “This intrusion into children’s private lives is unacceptable. Children have rights.”
Since coming under DSA regulation in 2023, TikTok has faced growing pressure from European regulators over its handling of harmful content. Despite new safety measures introduced in 2024, Amnesty maintains that the platform continues to expose minors to content that glamorizes despair and self-harm.
“TikTok’s disregard for systemic harms linked to its engagement-driven model raises serious compliance concerns,” said Katia Roux, advocacy officer at Amnesty France. “The Commission must act decisively to protect vulnerable users.”