TikTok and teen safety: Deadly risks, protective policies

Business Tech 06-02-2026 | 16:12


Annahar questioned TikTok officials about their children’s privacy policy.
TikTok and Safety Standards. (Illustration – Freepik)

Between challenges described as “deadly” and content that pushes users toward depression on the one hand, and the assurances of major social media platforms about their commitment to child protection on the other, questions are mounting about the responsibility of tech companies and the need to safeguard younger generations.


Below, we present TikTok’s perspective and the context of the issue.

From play to tragedy
A study by Amnesty International, titled “Dragged into the Rabbit Hole,” found that children searching for mental health content on TikTok are quickly exposed to depressive material and, within hours, confronted with suicide-related content.

Lauren Armistead of Amnesty Tech says: “Within just three to four hours, trial accounts for teenagers were exposed to videos that glamorize suicide, including information on methods of suicide. The research, conducted on trial accounts for French teenagers, found that TikTok’s algorithms push children into a cycle of depression, self-harm, and suicidal content.”

In the UK, mothers have filed lawsuits seeking justice for children who died attempting the “choking challenge.”

TikTok’s protection policy
Annahar asked TikTok officials about the platform’s children’s privacy policy. The platform confirmed its commitment to protecting young users: “Protecting teenagers is a top priority, and safety is integrated into our policies and products from the start.”

In response to Annahar’s questions, TikTok explained that its community guidelines are specifically designed to protect those under 18: “TikTok does not allow content that could physically or emotionally harm young people.” The platform also works with local experts and organizations in the Middle East and North Africa, including Lebanon.

TikTok shared enforcement figures for Lebanon: “In Q3 2025, TikTok removed 1.26 million videos, achieving a proactive removal rate of 99.7%, with 95.3% removed within 24 hours.” The platform also banned 15,865 live streamers and stopped 77,820 live streams for violating community guidelines.

The platform emphasizes strict age-based protections: the minimum age is 13, direct messaging starts at 16, and live streaming at 18. Users under 18 have a daily screen time limit of 60 minutes, and accounts for those under 16 are private by default.

TikTok officials describe a “Family Pairing” feature that allows parents to manage their children’s accounts, as well as educational resources: a Teen Safety Center that helps children and teens understand the platform’s safety features and tools, and a Guardian’s Guide aimed specifically at parents and caregivers.

They also stress “building trust through action by creating a platform based on transparency and strict safety standards. Protecting teenagers is a top priority, and safety measures are integrated into policies, products, and enforcement mechanisms from the start.”

Despite these assurances, concerns remain. Independent research shows a gap between what platforms promise and what children actually experience: algorithms still seem to fall short of fully protecting children from harmful content, and deadly challenges continue to circulate despite bans.

In conclusion, responsibility appears to be shared among platforms, legislators, and parents. The real question remains: how do we protect our children while still allowing them to benefit from this technology?