Exclusive: TikTok introducing more automation to video removals – Axios

TikTok is rolling out a new system that will allow the company to block videos that violate its policies automatically when they're uploaded. The social network is also changing the way it will notify users when their content is removed.

Why it matters: TikTok says the new system will not only improve the user experience but also help reduce the number of distressing videos (such as those with violent content) that its safety team must review, freeing staff to focus on more nuanced content areas, like hate speech, bullying and harassment.

Details: Beginning this week, TikTok will test the automatic deletion of several categories of content that violate its policies, including minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities and regulated goods.

Be smart: TikTok's safety team has always removed content its technology screened as a violation of its rules, but these changes will bring more automation to the process, making its moderation efforts more efficient. Its safety team will continue to review reports and content removal appeals from users.

The big picture: The changes are part of a wider company effort to be more transparent about the way TikTok moderates content.

What's next: TikTok said that as part of Friday's update, it will also change the way it notifies users when they violate the Community Guidelines.

Yes, but: TikTok acknowledges that its technology isn't perfect and may inadvertently remove videos that don't violate its terms. In that scenario, TikTok says the content will be reinstated and the penalty will be erased from the user's record.

