TikTok has introduced a new notification system that tells creators when their content violates its guidelines.
The video creation platform has been experimenting with the system over the past few months to give creators more clarity on why their content has been removed.
TikTok said in a blog post that explaining its enforcement actions and reminding users of its policies reduced the incidence of repeat offenses and nearly tripled visits to the community guidelines page.
The platform also saw a 14% decrease in appeals from users contesting video removals.
Under the new system, rolled out worldwide this week, when a video is removed the creator can see exactly which policy was violated and appeal the decision.
In cases where content is flagged for self-harm or suicide, a second notification directs the creator to expert resources.
TikTok wrote, “Transparency with our community is key to continuing to gain and maintain trust. We are excited to make this new notification system available to all of our users, and we will continue to work to improve the way we help our community understand our policies while we continue to build a safe and supportive platform.”