TikTok is making a trio of changes to its mobile app that will make the video-watching experience a tad safer for its young audience. Last year, the U.K.'s broadcasting and telecommunications regulator, the Office of Communications (Ofcom), released new rules requiring sites like TikTok to protect under-18s on their platforms. While there is public demand for similar protections in the U.S., no official ruling has been made. Some of the changes were announced by TikTok a while ago, but they are not exactly unique in their approach.
For example, in July last year, Instagram introduced a Sensitive Content Control system that allows users to specify the level of sensitive content they want to appear in the Explore feed. Divided across two tiers – Limit and Limit Even More – the idea is to let users choose the level of content sensitivity they are comfortable with. Facing heated criticism from lawmakers, parents and child safety organizations, TikTok promised concrete changes to how it handles and pushes content to its young audience in February this year, and it has finally revealed the work it has done so far.
Related: What Is TikTok's Mental Age Test & How Do You Take It?
The first in line is a new behind-the-scenes categorization system that will sort videos based on what TikTok calls "Thematic Maturity." The ByteDance-owned social media platform hasn't shared any granular details on the metrics dictating the categorization, saying only that it is similar to the maturity ratings audiences are used to seeing for movies, TV shows and games. In the coming weeks, the company will further bolster it with an automated system that will block "content with overtly mature themes" from appearing for users aged between 13 and 17.
Read more on screenrant.com