TikTok recommends harmful content to vulnerable young users within minutes of them joining the video-sharing platform, a new study suggests.
The research, titled Deadly by Design, was released on Wednesday by the UK-based Center for Countering Digital Hate (CCDH). It tests TikTok’s recommendation system and aims to show how the social media giant’s algorithms promote potentially harmful content, pushing it into young users’ “For You” feeds.
The content categorised as harmful in the research was media promoting self-harm, eating disorders, and weight shaming. The results showed TikTok would recommend content depicting extreme weight-loss diets, self-harm, and material romanticising suicide to young users who indicated an interest in such content.
To test TikTok’s algorithms, researchers from the CCDH set up accounts on the short-video platform posing as users aged under 19 years and expressing interest in content related to mental health and body image. Each account was assigned a different location, including the UK, US, Australia, and Canada. Data was gathered from the accounts for the first 30 minutes of use.
The report says TikTok began recommending suicide-related content within three minutes of a new account signing up, and eating disorder material within eight minutes. Hashtags hosting eating disorder videos had attracted more than 13.2 billion views.
A new 13-year-old user who shows a preference for body-image content will be recommended related content every 39 seconds. Experts have warned that such content can damage teens’ mental health even when it does not explicitly promote eating disorders, the report says.
Imran Ahmed, CEO of the CCDH, called the findings “every parent’s nightmare”.
“TikTok is able to recognize user vulnerability and seeks to exploit it,” said Ahmed. “It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing the psychology of our children and adapting to keep them online.”
In response to the findings, a TikTok spokesperson challenged the methodology of the research, arguing it does not accurately capture the viewing experience on the video-sharing platform.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the spokesperson, adding that TikTok is “mindful that triggering content is unique to each individual”.
TikTok is one of the fastest-growing social media platforms, with over a billion active users globally. It has given countless individuals and small businesses new sources of income and broad, convenient exposure. At the same time, however, the platform has attracted widespread scrutiny over the safety of younger audiences, which has become the focus of intense debate in the US. Despite launching parental controls and raising the age requirement for livestreams, TikTok continues to face concerns over the emotional wellbeing of young users.