TikTok removed nearly 7.3 million accounts suspected to belong to under-age children in the first quarter of this year.
The video-sharing platform said the profiles accounted for fewer than 1% of global users. Children aged 13 and over are allowed to use the platform, which is highly popular with teenagers.
This is the first time TikTok has published such figures in a Community Guidelines Enforcement Report. It said it hoped the detail about under-age users would “help the industry push forward when it comes to transparency and accountability around user safety”.
TikTok emphasised that it has introduced several measures to protect teenagers on the platform, including limiting features like private messaging and live-streaming to users aged 16 and over.
Accounts belonging to users under 16 are also automatically set to private – a feature introduced in January this year.
“To bring more visibility to the actions we take to protect minors, in this report we added the number of accounts removed for potentially belonging to an under-age person,” Cormac Keenan, head of trust and safety at TikTok, said.
Concerns about under-age users of the Chinese-owned app are on the rise. Data leaked to the New York Times last year suggested about a third of US users were aged 14 and under.
“TikTok has some of the tech world’s most sophisticated computer vision technology, and with it, probably has the ability to spot with decent accuracy under-age users. But using such technology would require a lot of permissions that people may feel queasy about.”
Like all social networks, the company is forever walking a fine line between attracting teenagers – the future of all platforms – and making sure they’re not too young.
The firm wouldn’t say exactly how it “knows” when a user is under 13, but the takedown number is impressive, and I understand it’s a mixture of automated processes and – interestingly – time-consuming and costly human moderation too.