The US Department of Justice (DoJ), in collaboration with the Federal Trade Commission (FTC), has filed a lawsuit against the popular video-sharing platform TikTok, alleging that it has “egregiously violated” the country’s children’s privacy laws.
Authorities alleged that the company knowingly allowed children to create TikTok accounts and watch and share short videos and messages with adults and other users on the service.
The lawsuit also accused the company of violating the Children’s Online Privacy Protection Act (COPPA) by illegally collecting and storing a variety of personal information from children without parental notice or consent.
It added that TikTok’s actions also violated a 2019 consent order between the company and the government, in which the company promised to notify parents before collecting children’s data and to remove videos of users under the age of 13.
COPPA prohibits online platforms from collecting, using, or disclosing personal information from children under the age of 13 unless they obtain parental consent. It also requires companies to delete all collected information upon a parent's request.
“Defendants also unlawfully collected and retained email addresses and other personal information of children through accounts created in ‘Kids Mode’ – a simplified version of TikTok aimed at children under the age of 13,” the Justice Department said.
“Furthermore, when parents discovered their children’s accounts and requested that Defendants delete the accounts and the information contained therein, Defendants often did not honor those requests.”
The complaint further alleges that the ByteDance-owned company subjected millions of children under the age of 13 to pervasive data collection, enabling targeted advertising and allowing children to interact with adults and access adult content.
The lawsuit also accused TikTok of lax oversight during the account creation process and of building backdoors that let children bypass its age gate, which is meant to determine whether a user is under 13, by signing in with third-party services like Google or Instagram; TikTok classified such accounts as "age unknown" accounts.
“TikTok’s human reviewers allegedly spend an average of just five to seven seconds reviewing each account to determine whether it belongs to a child,” the FTC said, adding that it would take steps to protect children’s privacy from companies deploying “sophisticated digital tools to monitor children and profit from their data.”
TikTok has more than 170 million active users in the United States. The company denies the allegations, but the lawsuit is the latest setback for the video-sharing platform. The app is subject to a law that would force it to be divested or banned in the U.S. by early 2025 over national security concerns. The company has filed a petition in federal court asking for the law to be overturned.
“We disagree with these allegations, many of which are untrue or relate to past events or practices that we have already addressed,” TikTok said. “We have rigorous safeguards in place to provide an age-appropriate experience, proactively remove users we suspect are underage, and voluntarily implemented features like default screen time limits, family pairing and additional privacy protections for minors.”
The social media platform has also come under increased scrutiny globally over child protection: European Union regulators fined TikTok €345 million in September 2023 for breaching data protection law in its handling of children's data, and in April 2023 the U.K. Information Commissioner's Office (ICO) fined it £12.7 million for unlawfully processing the data of 1.4 million children under the age of 13 who used the platform without parental consent.
The lawsuit comes after the ICO revealed it had called on 11 media and video-sharing platforms to improve their children's privacy practices or risk enforcement action. The services in question were not named.
“We have asked 11 of the 34 platforms about issues around default privacy settings, location and age assurance, and have asked them to explain how their practices comply with (child protection laws),” the regulator said. “We have also engaged some of the platforms in discussions about targeted advertising and made clear our expectations for changes to ensure their practices are in line with both the law and the Children’s code.”