Social Media

TikTok to inform users why it removed their video


TikTok is stepping up its content moderation with a new feature. The short-video app announced that it will now tell users exactly why their video has been removed, by stating the specific policy it violated. This brings TikTok in line with what other social media platforms already do.

Previously, users would get only a vague message saying their post had violated the app’s community guidelines, with no precise explanation of what was wrong with the content.

Until now, TikTok would simply inform users that their video had violated the platform’s community guidelines, leaving them to work out which rule was at issue. Users did, however, have the option to appeal.

The only real change in TikTok’s new content violation notifications is that they now identify the specific policy.

TikTok said it has been experimenting with these notifications for several months, and that appeals have dropped by 14 percent as a result.

The change was foreshadowed in July this year, when the company released its second transparency report and mentioned that the platform had started tracking the specific reasons for which it removes each video.

As before, users will still be able to appeal a deletion, opening up the possibility that their video could be reinstated. The difference is that they will now have a clearer idea of what they are appealing.

TikTok said, “Our goals are to enhance the transparency and education around our Community Guidelines to reduce misunderstandings about content on our platform, and the results have been promising.”

TikTok says it is also improving how reporting is handled. When a user’s video is removed, they will be given not only the explanation that the community guidelines were violated but also a pointer to the particular guideline at issue. This will help people find information about the platform’s rules when they need it, and also seek outside help if content they have seen has affected them.

TikTok added, “We are also working to support our community via these notifications. For instance, when content is flagged as self-harm or suicide-related, we will provide access to expert resources via a second notification.”

Let’s see how TikTok users react to the new content moderation feature.