TikTok has launched a new tool that lets creators limit their content to adult users. The adults-only audience controls tool is meant to help protect teen users.
“Our goal is to foster an appropriate experience for our community, and minors in particular,” TikTok wrote in a blog post. “We’ve started to bring our audience controls feature to creators of short-form video and will expand the feature globally over the coming weeks.”
The adults-only function was previously exclusive to TikTok Live, but it now extends to other videos as well. Creators can exclude anyone under the age of 18 from watching their short videos if they believe the content is only fit for adult audiences.
This applies to mature material that stays within TikTok’s policy lines; sexually explicit content remains prohibited outright.
The setting is meant to help creators keep minors (ages 13 to 17) away from videos intended for adults, particularly adult-oriented content that TikTok’s systems find harder to detect. This includes sensual content, nudity, topics inappropriate for teenagers, or simply topics teens might find uninteresting.
Creators can still share these videos, but TikTok prevents them from surfacing in the “For You” feed.
Over a million sexually explicit TikToks were blocked from reaching teen users in the last 30 days alone.
These measures are part of TikTok’s larger push to add more teen safety tools to the platform and to give users a secure, satisfying, and enjoyable experience.
Additionally, TikTok joined Bumble and Meta in supporting StopNCII.org, an effort to prevent the spread of non-consensual intimate images.
TikTok also announced the next generation of its borderline-suggestive model, which detects suggestive, borderline, or sexually explicit content and is expected to be more effective at spotting it.
TikTok Introduced Content Levels to Protect Teen Users
The company also plans to introduce a content filtering tool that will let users manually remove videos containing specific hashtags or words from their feeds.
TikTok has faced substantial scrutiny over teen and child safety from parents, lawmakers, and regulators. For instance, several parents sued TikTok last year after their children died allegedly attempting dangerous challenges they saw on the app.
To prevent certain sorts of mature content from being watched by teen users, TikTok introduced Content Levels last year. The function was created to give users greater control over their TikTok experience and make the app safer overall for users under 18.
Other Apps Making Similar Changes
Other platforms, such as Instagram, have recently moved to restrict access to sensitive content. In June, Instagram updated its settings to let users adjust how much sensitive content appears in their recommendations, with options ranging from strict to more permissive. Professional and creator accounts can also set a minimum age for viewing a profile. As with TikTok, the setting doesn’t apply to content that’s already explicitly restricted.
Still, it’s another instance of how platforms have attempted to address content that violates community norms.