LONDON — TikTok has pushed a new safety update to its app that allows parents to take more control of their teenagers’ accounts, as social media companies come under increasing pressure to make their platforms safer for children.
The company said Tuesday that its “Family Pairing” feature now enables parents to turn off comments on their children’s videos entirely or permit them for friends only. Parents whose accounts are linked to their children’s will also be able to set those accounts to private, turn off the search function for content, users, hashtags or sounds, and limit who can see which videos their children have liked.
Launched in March, the Family Pairing tool lets parents see how long their children are spending on TikTok each day and restrict the content they see on their feed.
TikTok allows children to sign up and create an account if they’re aged 13 or over. All they need to do is provide their date of birth. However, because TikTok doesn’t verify this information, some children under 13 have signed up by lying about their age, according to U.K. regulator Ofcom.
Alexandra Evans, head of child safety public policy for TikTok in Europe, told CNBC that Family Pairing has “struck a chord with parents” since it launched.
“If we’re thinking about it as a toolbox, we wanted to offer more tools,” she said on a video call ahead of the announcement.
The new Family Pairing features, which go live worldwide on Tuesday, provide youngsters with a “guardrail” as they discover TikTok, Evans said.
“The updates we are making today are the latest in a series of steps we have taken to give families the tools they need to create the TikTok experience that’s right for them,” said Evans. “We know that when people feel safe, they feel free to express their creativity — that’s why safety is at the heart of everything we do.”
In April, TikTok banned under-16s from sending direct messages on its platform, becoming the first social media firm to block private messaging by teens on a worldwide scale.
Social media consultant Matt Navarra told CNBC that some of TikTok’s younger users will “hate the power this gives their parents over their TikTok lives,” but might not be wholly surprised.
“The risks to their mental health and general online safety are frequently in the news, and other platforms have slowly started to add more features to give parents more control and awareness of time spent on devices by their children,” said Navarra.
Navarra, who uses TikTok’s Family Pairing feature with his own daughter, said he believes it will be challenging for TikTok to address the needs of parents without damaging the TikTok experience for its core users: young people. He suspects many youngsters already circumvent the parental controls by creating separate accounts their parents don’t know about.
Social media platforms like TikTok, Facebook and Twitter have gained a reputation for being potentially dangerous places for children to spend time.
Carolyn Bunting, chief executive of nonprofit Internet Matters, said it was clear that social media companies needed to do more to ensure their platforms were safe spaces for young people.
“Ultimately, it’s often parents who have to balance their children’s safety with their enjoyment, and we know … that children with parents who are engaged with what they’re doing online, are safer online,” she said in a statement.
While parental tools on social media can be useful, Bunting said it was important that parents find time to have regular conversations with their children about the platforms.