Britain’s data protection authority on Tuesday fined TikTok, the popular video-sharing app, £12.7 million, or about $15.9 million, saying the platform had failed to abide by data protection rules intended to safeguard children online.
The Information Commissioner’s Office said TikTok had inappropriately allowed up to 1.4 million children under the age of 13 to use the service in 2020, violating British data protection rules that require parental consent for organizations to use children’s personal information. TikTok failed to obtain that consent, regulators said, even though it should have been aware that younger children were using the service.
The British investigation found that TikTok did not do enough to identify underage users or remove them from the platform, even though the app had rules barring children under 13 from creating accounts. The company failed to take adequate action, regulators said, even after some of its senior employees raised concerns internally about underage children using the app.
TikTok, which is owned by the Chinese internet giant ByteDance, has also faced scrutiny in the United States. Last month, members of Congress questioned its chief executive, Shou Chew, about possible national security risks posed by the platform.
The TikTok privacy fine underscores mounting public concerns about the mental health and safety risks that popular social networks may pose for some children and adolescents. Last year, researchers reported that TikTok began recommending content tied to eating disorders and self-harm to 13-year-old users within 30 minutes of their joining the platform.
In a statement, John Edwards, Britain’s information commissioner, said TikTok’s practices could have put children at risk.
“An estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data,” Mr. Edwards said in the statement. “That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.”
In a statement, TikTok said it disagreed with the regulator’s findings and was reviewing the decision while considering its next steps.
“TikTok is a platform for users aged 13 and over,” the company said in the statement. “We invest heavily to help keep under 13s off the platform, and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.”
This is not the first time that regulators have cited the popular video-sharing app over children’s privacy concerns. In 2019, Musical.ly, the operator of the platform now known as TikTok, agreed to pay $5.7 million to settle charges by the Federal Trade Commission that it had violated children’s online privacy protection rules in the United States.
Since then, legislators in the United States and Europe have put in place new rules to try to bolster protections for children online.
In March, Utah passed a sweeping law that would prohibit social media platforms like TikTok and Instagram from allowing minors in the state to have accounts without parental consent. Last fall, California passed a law that would require many social media, video game and other apps to turn on the highest privacy settings — and turn off potentially risky features like friend-finders that allow adult strangers to contact children — by default for minors.