TikTok Allegedly Directs Children's Profiles to Pornographic Content In Just a Few Taps
According to a new study, the widely used social media app guides minors' accounts to explicit material within a small number of taps.
Research Methodology
Global Witness set up simulated profiles using a date of birth for a minor and activated the "restricted mode" setting, which is meant to reduce exposure to adult-oriented content.
Investigators observed that TikTok recommended sexualized and explicit search terms to seven test accounts that were established on unused smartphones with no previous activity.
Troubling Search Prompts
Phrases offered by the app's "suggested searches" feature included "provocative attire" and "inappropriate female imagery" – and then escalated to terms such as "graphic sexual content".
For three of the accounts, the inappropriate search terms were suggested immediately.
Fast Track to Adult Material
After minimal interaction, the study team encountered explicit material ranging from women flashing to penetrative sex.
Global Witness said the content appeared designed to evade moderation filters, often by embedding it within an otherwise benign image or video.
For one account, reaching the material took two taps after logging in: one on the search bar and another on the suggested query.
Legal Framework
Global Witness, whose mandate includes examining digital platforms' effect on public safety, said it conducted multiple testing phases.
The first phase took place before child protection rules under the British online safety legislation came into force on 25 July; further tests were carried out after the rules took effect.
Serious Findings
Researchers added that two of the videos appeared to show someone under 16 years old and had been reported to the online safety body that oversees harmful material involving minors.
Global Witness claimed that TikTok was in violation of the Online Safety Act, which obliges tech companies to prevent children from viewing harmful content such as pornography.
Regulatory Response
A communications officer for Britain's media watchdog, which is responsible for regulating the act, stated: "We acknowledge the work behind this investigation and will review its findings."
Official codes of practice for complying with the law specify that digital platforms posing a medium or high risk of showing harmful content must "modify their programming" to keep dangerous material out of children's feeds.
The app's policies prohibit adult videos.
Company Reaction
The video platform stated that upon receiving information from the research group, it had removed the offending videos and made changes to its recommendation system.
"Upon learning of these assertions, we acted promptly to look into the matter, remove content that breached our guidelines, and introduce upgrades to our search suggestion feature," said a spokesperson.