TikTok updates Community Guidelines to promote safety, security, & well-being on the platform

TikTok has updated its Community Guidelines to further support the well-being of its community and the integrity of the platform. The updates clarify or expand on the types of behaviour and content that will be removed from the platform or made ineligible for recommendation in the For You feed.

By Cormac Keenan, Head of Trust and Safety:

The main updates being announced and implemented include:

  • Strengthening our dangerous acts and challenges policy. We continue to enact the stricter approach we previously announced to help prevent such content - including suicide hoaxes - from spreading on our platform. This previously sat within our suicide and self-harm policies, but will now be highlighted in a separate policy category with more detail so it's even easier for our community to familiarise themselves with these guidelines. As part of our ongoing work to help our community understand online challenges and stay safe while having fun, we've worked with experts to launch new videos from creators like @caitlinandleahh and @maddylucydann that call on our community to follow four helpful steps when assessing content online - stop, think, decide and act. Community members can also view these videos at our #SaferTogether hub on the Discover page over the next week.
  • Broadening our approach to eating disorders. While we already remove content that promotes eating disorders, we'll start to also remove the promotion of disordered eating. We're making this change, in consultation with eating disorders experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behaviour without having an eating disorder diagnosis. Our aim is to acknowledge more symptoms, such as overexercise or short-term fasting, that are frequently under-recognised signs of a potential problem. This is an incredibly nuanced area that's difficult to consistently get right, and we're working to train our teams to remain alert to a broader scope of content.
  • Adding clarity on the types of hateful ideologies prohibited on our platform. This includes deadnaming, misgendering, or misogyny as well as content that supports or promotes conversion therapy programmes. Though these ideologies have long been prohibited on TikTok, we've heard from creators and civil society organisations that it's important to be explicit in our Community Guidelines. On top of this, we hope our recent feature enabling people to add their pronouns will encourage respectful and inclusive dialogue on our platform.
  • Expanding our policy to protect the security, integrity, availability, and reliability of our platform. This includes prohibiting unauthorised access to TikTok, as well as TikTok content, accounts, systems, or data, and prohibiting the use of TikTok to perpetrate criminal activity. In addition to educating our community on ways to spot, avoid, and report suspicious activity, we're opening state-of-the-art cyber incident monitoring and investigative response centres in Washington DC, Dublin, and Singapore this year. TikTok's Fusion Centre operations enable follow-the-sun threat monitoring and intelligence gathering, as we continue working with industry-leading experts to test and enhance our defences.

Additionally, our community can find more information about the content categories ineligible for recommendation into For You feeds. While the ability to discover new ideas, creators, and interests is part of what makes our platform unique, content in someone’s For You feed may come from a creator they haven’t chosen to follow or relate to an interest they haven’t previously engaged with. That’s why when we come across content that may not be appropriate for a general audience, which includes everyone from teens to great-great-grandparents, we do our best to remove it from our recommendation system.

Staying accountable to our community

The strength of a policy lies in its enforceability. Our Community Guidelines apply to everyone and all content on TikTok, and we strive to be consistent and equitable in our enforcement. We use a combination of technology and people to identify and remove violations of our Community Guidelines, and we will continue training our automated systems and safety teams to uphold our policies.

To hold ourselves accountable to our community, NGOs, and others, we release Community Guidelines Enforcement Reports quarterly. Our most recent report, published today, shows that over 91 million violative videos were removed during Q3 2021, which is around 1% of all videos uploaded. Of those videos, 95% were removed before a user reported them, 88% before the video received any views, and 93% within 24 hours of being posted. We continue to expand our system that detects and removes certain categories of violations at upload, including adult nudity and sexual activities, minor safety, and illegal activities and regulated goods. As a result, the volume of automated removals has increased, which improves the overall safety of our platform and enables our team to focus more time on reviewing contextual or nuanced content, such as hate speech, bullying and harassment, and misinformation.
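For a sense of scale, a rough back-of-the-envelope calculation using only the round figures quoted above (91 million removals, ~1% of uploads, and the three quoted percentages) gives the implied absolute numbers; these are approximations derived from the report's rounded figures, not official counts:

```python
# Back-of-envelope estimates from the Q3 2021 figures quoted above.
# Inputs are the round numbers in the report; outputs are approximations.
removed = 91_000_000          # "over 91 million violative videos"
share_of_uploads = 0.01       # "around 1% of all videos uploaded"

total_uploads = removed / share_of_uploads   # implied total uploads in the quarter
proactive = removed * 0.95    # removed before any user report
zero_views = removed * 0.88   # removed before receiving any views
within_24h = removed * 0.93   # removed within 24 hours of posting

print(f"Implied uploads in Q3 2021: ~{total_uploads / 1e9:.1f} billion")
print(f"Removed before any report:  ~{proactive / 1e6:.1f} million")
print(f"Removed with zero views:    ~{zero_views / 1e6:.1f} million")
print(f"Removed within 24 hours:    ~{within_24h / 1e6:.1f} million")
```

So the stated 1% share implies roughly 9 billion videos uploaded to the platform in the quarter, with the large majority of violative videos taken down before anyone reported or even viewed them.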

News Source: TikTok Newsroom
