Tech giant faces scrutiny amid allegations that underage users can stay on the platform by claiming parental supervision.
TikTok is under scrutiny over its safeguards for child users after a Guardian investigation found that moderators were instructed to allow under-13s to remain on the platform if their profiles claimed parental oversight. In one instance, a 12-year-old user, below TikTok’s minimum age, was allowed to stay because the profile indicated it was managed by a parent.
In an internal message sent during the autumn, a quality analyst, whose role includes answering moderation-related queries in video queues, was asked by a moderator whether the user’s account should be banned.
The TikTok quality analyst advised moderators to allow such accounts to remain if the bio indicated parental management. The guidance was shared in a group chat with more than 70 moderators who handle content primarily from Europe, the Middle East, and Africa.
There are also claims that moderators were told in meetings that accounts showing a parent in the background of videos, or with a bio stating parental management, could stay on the platform. Suspected underage accounts are sent to an “underage” queue for further review, where moderators can either ban the account, removing it from the platform, or approve it, allowing it to remain.
The UK’s age-appropriate design code, known as the children’s code, requires services within its scope to take a risk-based approach to establishing the age of users and to apply its standards effectively to child users.
Beeban Kidron, the crossbench peer who devised the code, said she was dismayed that underage children could continue to use a service after the platform had been made aware of their age. She raised concerns that profit-driven design and moderation choices could put children at risk.
In the UK, TikTok is regulated by Ofcom under video-sharing platform rules that are now being incorporated into the Online Safety Act. The new act requires tech platforms to set out in their terms of service, which all users must agree to, the measures they take to prevent underage access, and to apply those terms consistently.
In the EU, the Digital Services Act protects children by requiring major platforms such as TikTok to put safeguards in place, including parental controls and age verification, to shield them from harmful content. The act also requires platforms to establish a user’s age with a high degree of certainty, and prohibits tech firms from using under-18s’ data for targeted advertising.