Social Media Platforms Accused of Failing to Protect Young Users
Social media platforms, including TikTok and Instagram, have been accused of failing to protect young users from harmful content, including material about suicide and self-harm. The Molly Rose Foundation, a charity, analyzed hundreds of posts on the platforms using accounts set up to pose as a 15-year-old girl in Great Britain. The results showed that the platforms' recommendation algorithms still served a "tsunami" of clips promoting "suicide, self-harm, and intense depression" to under-16s who had previously engaged with similar material.
Harmful Content on Instagram
The researchers found that 97% of the videos recommended on Instagram Reels to an account posing as a teenage girl who had previously viewed such content were judged to be harmful. About 44% actively referenced suicide and self-harm, and harmful content was also pushed to users through emails containing recommended posts. A spokesman for Meta, which owns Instagram, disputed the claims, saying the platform has built-in protections, including restrictions on who can contact teenagers and on the content they see.
Harmful Content on TikTok
TikTok was accused of "recommending an almost uninterrupted supply of harmful material", with 96% of the videos judged to be harmful. Over half (55%) of the "For You" posts were found to promote suicide and self-harm. The number of problematic hashtags had risen since 2023, with some accounts compiling "playlists" of harmful content. A TikTok spokesman said the platform has 50 features and settings to help teenagers safely express themselves, discover, and learn, and that parents can adjust more than 20 content and privacy settings through Family Pairing.
Government Response
Technology Secretary Peter Kyle said the figures show a "brutal reality" and that technology companies have allowed young users to access hideous content, devastating young people and their families. He emphasized that companies cannot pretend not to see the harm, and that the Online Safety Act requires platforms to protect all users from illegal content and children from the most harmful content. An Ofcom spokesman said that new online measures to protect children have come into force, which will make a meaningful difference for children and help prevent exposure to the most harmful content.
Children’s Commissioner Report
A separate report by the Children's Commissioner found that the proportion of children who have seen pornography online has risen in the past two years, a trend also driven by algorithms. The content young people are seeing is often "violent, extreme, and humiliating", and illegal. The survey of 1,020 people aged 16 to 21 showed that, on average, respondents were first exposed to pornography at age 13, with more than a quarter (27%) saying they were 11, and some saying they were six or younger.
Support for Those Affected
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, they can call the Samaritans branch in their region or 1 (800) 273-TALK.
