Introduction to the Issue
A new report has found that TikTok has been directing accounts registered as 13-year-olds to pornographic content. According to an investigation by the British non-profit Global Witness, researchers were steered toward sexually explicit content through suggested search terms, at a time when technology companies face mounting pressure to enforce age checks.
The Investigation by Global Witness
In its report, released on October 3, Global Witness said it set up seven new TikTok accounts in the United Kingdom posing as 13-year-olds, the minimum age required to create an account. The accounts were created on factory-reset phones with no search history. Global Witness said it also turned on the platform's "Restricted Mode," which, according to TikTok's support page, limits exposure to content "that may not be suitable for everyone," including "sexually suggestive content."
The Report’s Findings
The report comes amid a broader push in both the United Kingdom and the United States to better protect children online, and as TikTok faces allegations in lawsuits filed last year that the platform harms the mental health of young users. When CNN shared the report with TikTok, a company spokesperson said TikTok is committed to keeping its users' experience safe. The spokesperson said that as soon as the company was made aware of the claims, it took immediate action to investigate them, remove content that violated its policies, and launch improvements to its search suggestion feature.
TikTok’s Response
The statement also said TikTok is "fully committed to providing safe and age-appropriate experiences" and that it removes "9 in 10" harmful videos before they are ever viewed. Yet sexualized search terms were recommended "the very first time the user clicked into the search bar" for three of the Global Witness test accounts, according to the report. TikTok also appeared to surface pornographic content to all seven test accounts "only a small number of clicks after setting up the account."
Global Witness's Argument
"Our point is not just that TikTok shows pornographic content to minors," Global Witness said. "It is that TikTok's search algorithms actively push minors toward pornographic content." TikTok's community guidelines prohibit content containing nudity, sexual activity, and sexual services, as well as content featuring sexually suggestive acts or significant body exposure involving young people.
The Online Safety Law
The report also follows additional rules on children's safety under the UK's Online Safety Act, which came into force at the end of July. Media lawyer Mark Stephens said in the Global Witness report that the findings "represent a clear breach" of the law. The Online Safety Act 2023 is a set of laws intended to improve online safety by enforcing new rules that require technology companies to regulate certain types of content.
Criticisms and Concerns
Critics of the act, such as the Electronic Frontier Foundation, have said its age verification rules could endanger the privacy of users of all ages. Global Witness said the law fully applied to TikTok and other online platforms by the time it carried out additional tests after that date. The TikTok spokesperson said the company approaches Online Safety Act compliance "with a robust set of safeguards," having been regulated by Ofcom, the UK's communications regulator, since 2020.
TikTok's Safety Measures
TikTok has introduced safety measures for young people in recent years, and it is one of many tech giants facing added pressure to better protect children online. In August, for example, YouTube introduced a system that uses artificial intelligence to estimate a user's age and, where needed, apply age-appropriate protections. Last year, Instagram implemented teen account settings that automatically make teens' accounts private. TikTok says it removes around 6 million suspected underage accounts worldwide using various age-detection methods, including technology that identifies when an account may be used by a child under the age of 13.
