New rules will make under-16s’ TikToks private
TIKTOK, the social media platform best known for short, funny user-made videos, is strengthening its anti-grooming rules by making accounts held by children under 16 ‘private’ by default.
It will also disallow public comments on TikTok videos made by those under 16, restricting them to ‘friends’ or ‘no-one’.
And downloads of videos made by those under 16 won’t be allowed unless the account-holding child manually switches permission on.
The changes will apply to existing accounts as well as newly-created ones.
Other rule changes include setting the ‘suggest your account to others’ feature to ‘off’ by default for accounts of those under 16 and removing the ability for others to ‘Duet’ and ‘Stitch’ with content created by under-16s. For users aged 16 to 17, the default settings of Duet and Stitch will now be set to ‘friends’.
Tens of thousands of Irish children and younger teens use TikTok, which is now one of the world’s most popular social media platforms for children. Because of this, there are heightened sensitivities around interaction between adults and minors on the platform.
“We know there is no finish line when it comes to minor safety, and that is why we are continuously evolving our policies and investing in our technology and human-moderation teams so that TikTok remains a safe place for all our users to express their creativity,” said Alexandra Evans, head of child safety for TikTok in Europe.
“The changes announced today build on previous changes we’ve made to promote minor safety, including restricting direct messaging and hosting live streams to accounts 16 and over and enabling parents and caregivers to set guardrails for their teen’s TikTok account through our Family Pairing feature.”
Last year, TikTok moved to restrict its direct messaging features to those over 16 years of age.
The Chinese-owned firm recently opened a ‘trust and safety hub’ office in Dublin, to serve the European market.
TikTok’s action comes as the Government prepares to introduce a new Online Safety Commissioner that will seek to interact with social media companies on issues including abuse.
New fines of “up to” 10pc of annual turnover will be introduced for failures to comply with requests from the Online Safety Commissioner, on issues such as cyber-bullying, self-harm, eating disorders and suicide.
Criminal liability for directors or executives of big tech firms who fail to comply with a warning from the Online Safety Commissioner will also be introduced.
This means that if Facebook, Google or other large online platforms based here are found to have breached the new regulator’s code or have ignored its directions, they could be looking at billions of euro in fines. Facebook Ireland’s turnover last year was €34bn.
However, the new Commissioner will not directly represent individual citizens to tech giants, instead liaising with the companies on general matters.
Nor will it tackle child abuse imagery in messaging services such as WhatsApp or on popular online storage services such as Dropbox.
Instead, policing child abuse imagery and non-consensual intimate photographs will remain an activity for the tech companies themselves, together with conventional support from An Garda Síochána.