Popular video app TikTok launches ability for parents to control what their children see on the platform
- The video app has added new protections to shield young users from abuse
- New mid-feed prompts also advise users to spend time away from the screen
- The government gave extra power to Ofcom last week to regulate social media
Video sharing app TikTok has introduced new features that give parents more control over the content that appears in their child’s feed.
As of today, Family Safety Mode helps parents and guardians keep their teens safe on the platform by linking a parent’s account to their child’s account.
Parents can use the system to control how long children spend on TikTok each day, limit or turn off direct messaging and restrict potentially inappropriate content.
TikTok has already come under fire from a children’s charity for its live-streaming capability, which could potentially allow anyone to view videos of children (file photo)
Meanwhile, Screentime Management in Feed, another protective measure, leaves prompts in users’ feeds that advise them to spend time away from the screen.
The Chinese-owned video sharing app, which allows users to make short lip-syncing clips to share with their followers, is particularly popular with children.
The company said the changes are aimed at helping its users ‘have a healthy relationship with online apps and services’.
‘When people use TikTok, we know they expect an experience that is fun, authentic, and safe,’ said Cormac Keenan, TikTok’s head of trust and safety in Europe.
‘As part of our commitment to safety, the wellbeing of our users is incredibly important to us.
‘We want people to have fun on TikTok, but it’s also important for our community to look after their wellbeing, which means having a healthy relationship with online apps and services.’
Family Safety Mode and Screentime Management in Feed are available to TikTok users in the UK from Wednesday and will roll out to additional markets in the coming weeks.
The company said it has also partnered with content creators to develop new screen time management videos.
TikTok lets users live-stream or create music videos and GIFs to share with their followers
The changes come a week after the UK government announced that it will put communications watchdog Ofcom in charge of regulating social media sites such as TikTok, Twitter and Snapchat.
The government said Ofcom will get new powers to regulate social media firms in the UK, including making sure they have systems in place to ‘fulfil a duty of care’ to keep their users safe.
Ofcom will hold companies to account if they do not tackle internet harms such as child sexual exploitation and abuse, as well as terrorism.
TikTok, which is intended for users aged 13 and over, has previously been blamed for an increase in the sexual exploitation of youngsters online.
Disconcertingly, anyone can follow another TikTok account unless it has been set to private.
Last year, children’s charity Barnardo’s warned of ‘sophisticated groomers’ who contact children using TikTok’s live comment functions and engage them in sexual behaviour.
A YouGov poll conducted on behalf of the charity in 2018 also showed that many underage children are live streaming to their followers.
TikTok, which has gained 500 million active users since launching in 2016, has since set an age limit of 18 and over for users who can purchase, send or receive virtual gifts, to protect the feature from misuse.
It has also published its first transparency report and established a European headquarters dedicated to safety.
The company follows the lead of Facebook, which added new parental controls to Facebook Messenger earlier this month.
As part of the changes, parents on Facebook Messenger can now see who their child is chatting with on the app and view their recent photos.