TikTok is making a bold move to put more responsibility on creators when it comes to following its community guidelines. In a set of updates rolling out on May 17th, the short-form video platform is introducing new tools that will allow creators to audit their own content and face consequences if they break too many rules.
The biggest addition is the “Account Check” feature that will let creators review their last 30 uploaded videos to see if any of them violate TikTok’s policies. It’s essentially user-initiated content moderation, giving creators a way to get ahead of any strikes or bans for breaching guidelines.
But creators who keep uploading rule-breaking videos will be punished. Accounts that rack up too many violations will temporarily lose the ability to have their videos recommended on the For You feed, TikTok’s endless stream of content served up by its powerful algorithm.
TikTok claims this new system of putting creators in the moderator seat will “foster transparency and encourage compliance.” It’s also implementing a new strike policy for first-time offenders, who will get a warning instead of an immediate ban or other penalty. That warning will include details on what content was problematic and why — and give creators a chance to appeal if they feel TikTok’s moderators got it wrong.
The company is leaving no room for leniency when it comes to “any incitement to violence,” however. In those cases, TikTok says it will maintain a “zero-tolerance policy” regardless of circumstances.
Along with the new guidelines, TikTok is also introducing a “Creator Code of Conduct” that will apply to creators when they participate in TikTok programs, partner campaigns, events, and the like. It’s not entirely clear what the code contains beyond TikTok saying it will “establish clear expectations” for creator behavior.
The updates show TikTok is getting more serious about content moderation at a time when lawmakers and regulators have put massive pressure on social media platforms to better police what’s posted. But shifting some of that enforcement responsibility onto creators themselves is a risky move that could lead to confusion, mistakes, or general chaos if not implemented carefully.
Featured image by Eyestetix Studio on Unsplash