The Data Protection Commission (DPC) of Ireland has imposed a fine of approximately $367 million (EUR 345 million) on TikTok for mishandling children’s data. In 2021, the DPC opened an investigation into TikTok’s compliance with the EU’s General Data Protection Regulation (GDPR). The Irish regulator, which oversees the app across the European Union, found multiple GDPR violations, including setting children’s accounts to public by default, allowing adults to enable direct messaging for child users over the age of 16, and failing to adequately consider the risks to users under the age of 13 on the platform.
According to the investigation’s findings, users aged 13 to 17 were guided through the registration process in a way that set their accounts to public by default, meaning anyone could view and comment on their content.
Furthermore, the “Family Pairing” feature, which links a child’s account with an adult’s so the adult can manage the child’s app settings, did not include any verification that the adult was actually the child’s parent or guardian. The DPC’s investigation found that unverified adult profiles could be paired with children’s accounts, potentially enabling the exchange of direct messages.
Doubts have also arisen over how effectively TikTok prevents children under the age of 13 from accessing the platform. While its age verification methods were found to comply with the GDPR, there were deficiencies in safeguarding the privacy of underage users who did gain access.
The DPC criticized TikTok’s previous practice of setting underage users’ accounts to public by default, allowing anyone to view their content. Features such as Duet and Stitch were also enabled by default for users under the age of 17. TikTok has been given three months to bring its data processing into compliance. Notably, no GDPR violations were identified in the methods used to verify users’ ages.
Previous fines on TikTok
In April 2023, the UK data regulator, the Information Commissioner’s Office (ICO), imposed a fine of GBP 12.7 million on TikTok for unlawfully processing the data of 1.4 million children under the age of 13 who had used its platform without parental consent.
TikTok responded, stating, “We respectfully disagree with the decision, especially with regards to the magnitude of the fine levied. The criticisms from the Data Protection Commission (DPC) are primarily directed at features and settings that were in place three years ago, and which we had already modified well before the commencement of the investigation. For instance, we had already implemented the practice of setting all accounts for users under the age of 16 to private by default.”
Since 2021, TikTok has set both new and existing accounts for users aged 13 to 15 to private by default, meaning only people approved by the user can view their content. These changes were introduced to address the concerns raised during the investigation.