European regulators have fined TikTok 345 million euros (approximately 367 million dollars) for failing to protect children’s privacy, marking the first time the short-video sharing app has been penalized for violating Europe’s strict data privacy rules.
The Irish Data Protection Commission, the lead privacy regulator for many major tech companies with European headquarters in Ireland, said it fined and reprimanded TikTok over violations dating back to the second half of 2020.
The fine follows an investigation the Irish Data Protection Commission opened in 2021 into TikTok’s compliance with the European Union’s General Data Protection Regulation (GDPR).
In August, Politico reported that the Irish Data Protection Commission was preparing to issue its penalty, with the investigation focusing on certain aspects of TikTok, including default account settings, family pairing settings, and age verification.
After consulting with the European Data Protection Board, the Irish Data Protection Commission found that TikTok set children’s accounts to public by default upon registration. As a result, children’s videos were publicly viewable by default, and features such as comments, Duet, and Stitch were enabled by default.
Family pairing, a feature introduced by the platform in 2020, allowed children’s accounts to be linked to a separate adult account to manage app settings, such as screen time limits and restrictions on direct messaging and inappropriate content.
However, the Irish Data Protection Commission determined that the family pairing feature was not stringent enough: a child’s account could be linked to an adult account that the company had not verified as belonging to a parent or guardian. Once linked, the adult user could loosen the child’s account settings, including enabling direct messaging.
The decision found that TikTok’s age verification methods did not violate the GDPR, but the Irish Data Protection Commission concluded that the company failed to adequately protect the privacy of children under 13 who managed to register for an account.
In 2021, the platform tightened its default privacy settings for users aged 13 to 15, making their accounts private by default.
TikTok has expressed its disagreement with the decision, particularly the size of the fine, noting that the regulator’s criticisms focused on features and settings dating back three years.
TikTok stated, “We made changes long ago to make all accounts for users under 16 private by default and disable direct messaging for those aged 13 to 15.”