TikTok to penalize creators who repeatedly share 'problematic' content
In a move to create a safer environment for its users, especially teenagers, TikTok is increasing penalties for creators who repeatedly share "problematic" content. The social media giant is also tightening its rules on what can be recommended within the app. The changes are part of TikTok's effort to address concerns from US lawmakers and regulators about the platform's safety. The updated community guidelines, due in May, will include an expanded list of content that is ineligible for recommendation in the app's "For You" feed.
New rules to exclude certain content types
The revised guidelines will exclude a variety of content from the "For You" feed, including sexually suggestive or violent material, videos featuring "dangerous activity and challenges," and a wide range of weight loss and dieting content. Clips from users under 16 years old will also be barred from appearing in the feed. The new rules further target misinformation and conspiracy theories, specifically unverified claims about emergencies or unfolding events, as well as potentially high-risk misinformation that is still awaiting fact-check review.
Penalties for guideline violations
TikTok has warned that creators who repeatedly violate these guidelines will face penalties, such as having their entire account made ineligible for recommendation or seeing their account's visibility reduced in search results. A new feature called "account status" will alert users to strikes on their account and to posts that violate the app's rules. Another feature, "account check," will let users see whether they are currently blocked from recommendations or barred from features like messaging or commenting because of rule violations.