TikTok is working to make its rules easier to follow and to give users a clearer picture of what is happening with their accounts. The company is rolling out an updated “account enforcement system,” a set of changes that includes a new strike system and tools that let users check whether their content has been excluded from the app’s recommendations.
TikTok’s global head of policy, Julie de Bailliencourt, said in a blog post that the platform is updating its content moderation policy so users and creators know which kinds of violations are on their record. Under the new approach, penalties scale with a user’s behavior, which TikTok says strengthens enforcement and improves user safety.
TikTok’s New Enforcement System
Users will receive strikes if they comment, post, or otherwise behave on the platform in a way that violates TikTok’s rules. A strike means the offending content is removed, but it does not always lead to a temporary or permanent suspension.
According to TikTok’s analysis, almost 90% of violators repeatedly misuse the same feature, and over 75% repeatedly violate the same policy category.
There are different strike thresholds for posts, comments, and live streams. TikTok will still impose permanent bans for severe violations, such as clips that threaten or promote violence, show or promote child sexual abuse material (CSAM), or depict real-world violence or torture. Users who accumulate enough strikes “will be permanently banned,” according to de Bailliencourt.
TikTok has been striving to reassure users of the platform’s security while addressing calls for a nationwide ban in the U.S.
Accumulated strikes will expire from an account’s record after 90 days. Still, accounts that “accrue a high number of cumulative strikes” across policies and features will be permanently banned. TikTok did not specify what counts as a “high number” or give further detail on the various thresholds.
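As an illustration only, a rolling 90-day strike window like the one described above could be modeled as follows. TikTok has not published its thresholds or internals, so the class name, the ban threshold, and the expiry constant here are all hypothetical:

```python
from datetime import datetime, timedelta

STRIKE_TTL = timedelta(days=90)  # strikes expire after 90 days, per TikTok's post
BAN_THRESHOLD = 5                # hypothetical: the real "high number" is undisclosed


class StrikeTracker:
    """Toy model of a rolling strike window; not TikTok's actual system."""

    def __init__(self):
        # Each strike records when it was issued, which feature it came from
        # (post, comment, live stream), and which policy it violated.
        self.strikes = []

    def add_strike(self, now, feature, policy):
        self.strikes.append((now, feature, policy))

    def active_strikes(self, now):
        # Only strikes issued within the last 90 days still count.
        return [s for s in self.strikes if now - s[0] <= STRIKE_TTL]

    def is_banned(self, now):
        return len(self.active_strikes(now)) >= BAN_THRESHOLD


# Usage: four active strikes are below the hypothetical threshold,
# a fifth crosses it, and all of them expire 90 days later.
tracker = StrikeTracker()
now = datetime(2023, 3, 1)
for _ in range(4):
    tracker.add_strike(now, "comment", "spam")
print(tracker.is_banned(now))                        # below threshold
tracker.add_strike(now, "post", "harassment")
print(tracker.is_banned(now))                        # at threshold
print(tracker.is_banned(now + timedelta(days=91)))   # strikes expired
```

A real system would also need the per-feature and per-policy thresholds the article mentions; this sketch collapses them into a single cumulative count for brevity.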
The changes bring TikTok’s policy closer to that of its competitors. Both Meta and YouTube use strike systems to penalize accounts that violate their rules, but each platform has its own standards for assessing strikes and for the consequences that follow.
Making Moderation More Transparent and Consistent
The updates are part of TikTok’s broader effort to foster transparency around how it handles content moderation and algorithmic recommendations, both of which have come under heavy scrutiny from regulators, legislators, and other critics. The redesigned account enforcement system is currently rolling out internationally, and TikTok says it will notify all community members once the new system is available to them.
TikTok will also start informing creators when their accounts are about to be terminated permanently.
In the coming weeks, TikTok will roll out additional in-app tools in the Safety Center to further assist creators: a “Report records” tab, where creators can review the status of reports they’ve filed about other content or accounts, and an “Account status” tab, where creators can quickly view the standing of their account.
These new tools complement the notifications creators already receive when they breach policies, and they make it easier for creators to appeal enforcement actions and have strikes removed when an appeal is valid.
TikTok says it will continue to refine the techniques it uses to assess accounts, and to keep users informed, in order to ensure accurate, nuanced enforcement decisions for all accounts.
It has also introduced new transparency features that tell users why particular videos were recommended or removed. However, negative news has trailed each new announcement. Last month, for instance, the company’s use of a secret “heating” feature to make selected videos go viral came to light. Shortly before that, Forbes reported that TikTok had spied on its journalists. These revelations have badly damaged the company’s reputation at a time when it is trying to build trust.