How TikTok’s new measures will protect underage users

TikTok is one of the fastest-growing apps on social media, with an estimated 100 million monthly users in the US alone. However, despite being hugely popular among young people, the platform has recently faced criticism for failing to protect its underage users.

According to a recent Community Guidelines Enforcement Report, TikTok removed almost 7.3 million accounts suspected of belonging to underage children in the first three months of 2021. A further 62 million videos were taken down for violating community standards, including for hateful content and harassment of underage users.

The new figures come after a recent BBC Panorama investigation revealed TikTok's failure to ban a flagged child predator from the site after a 14-year-old user reported an adult man for sending sexual messages. The app was also recently under investigation by the Information Commissioner's Office in the UK over how it collects and uses children's personal information.

TikTok has already implemented several measures designed to keep younger users safe, including restricting direct messaging and live streams to over-16s. However, growing fears of online grooming have prompted the platform to strengthen its safety policies even further.

What are the new measures?

Firstly, users aged under 16 will have their accounts automatically set to private. This means that those aged between 13 and 15 will have to approve friends’ comments, and strangers will no longer be able to comment on their videos. This setting means users choose whether to make their videos public, and their accounts won’t be suggested to other users on the app.

Secondly, the platform will stop users from downloading content created by under-16s; however, users will have the option to turn off this restriction.

Finally, TikTok is also changing the default settings for under-16s, so that only their chosen friends can duet alongside them. This means that users will not be able to duet with clips made by under-16s.

Whilst full age verification is also being discussed as a potential solution, it could pose a challenge for tech firms and users alike. For example, it is unlikely that users would be willing to hand over passport details to tech giants if this measure were implemented.

TikTok hopes that the changes will encourage users to take a more active role in deciding who sees their videos and will help them make more informed decisions about what they post online.

Interested in finding out more about how to protect children online? Read some of our other articles on the topic: