According to Google’s internal data, almost 40% of Gen Z users prefer searching on TikTok and Instagram over Google Search and Maps. TikTok, inarguably the fastest-growing social media app, has seen its popularity explode in the last few years. Its influence can be gauged from competitors like Snapchat and Instagram rolling out copycat short-video features, Spotlight and Reels.
With almost 63% of Americans aged 12-17 using TikTok, the platform has repeatedly come under fire for failing to regulate the content it presents to its audience. It has a long history of dishing out mature content, from trends involving lip-syncing to sexual lyrics to Christian fascist meme propaganda. In the last year, multiple harmful trends have found their way onto the app, including the milk crate challenge, in which users filmed themselves trying to climb pyramids of stacked crates.
All this has prompted worried parents and activists to demand that TikTok regulate the content it shows to users, especially Gen Z TikTokers, who make up a large share of its user base.
Well, it looks like TikTok has finally paid heed to these calls: it has announced that it will roll out an early version of a major update that organizes content by maturity level to limit underage users’ access to mature content.
How Will the Update Work?
The planned update will allow users to filter out specific words and hashtags that they don’t want to see on their homepage. Cormac Keenan, TikTok’s Head of Trust and Safety, said in a blog post that when TikTok identifies a video with mature content, including scenes that may be too frightening or intense for younger audiences, the platform will assign the video a maturity score. This score will prevent under-18 users from viewing the video.
According to TikTok, its age-guideline concept is similar to the maturity ratings used for movies and video games, which restrict certain audiences’ access to specific kinds of content.
But Why Introduce the Update Now?
As noted above, TikTok has repeatedly come under fire for turning a blind eye to the content served on its platform. The U.S. government is currently investigating how TikTok handles child sexual abuse material and predators misusing the privacy features on its platform. This scrutiny has pushed multiple social media platforms to limit or monitor the content they show to their younger audiences.
Will the Update Bring About Any Change in Content Consumption?
It’s highly unlikely that the latest child safety update will change content consumption patterns or meaningfully limit underage users’ access to uncensored content. This is because TikTok doesn’t have a robust age verification system in place, so users can circumvent the update simply by claiming to be older than they actually are when they create an account.
This means the new features would depend largely on user judgment to distinguish between benign and harmful content, putting users’ safety in their own hands.
Moreover, although the new maturity ratings and expanded content filtering could serve as infrastructure for making the app safer for kids, the announced features are unlikely to end the scrutiny of TikTok’s impact on young users.
If you want to read more about the latest developments in the social media space, take a look at ClickInsights’ Social Media Buzz, where we bring you monthly reports on everything going on in social media, from platform updates to policy changes that influence the way we market.