TikTok has announced its refreshed Community Guidelines with additional details about what’s allowed on the platform.
The Community Guidelines support the authentic and entertaining TikTok experience that people know and enjoy. They define a common code of conduct and encourage a welcoming community environment. TikTok continually reviews and strengthens its policies to help users feel comfortable and safe to create and share.
At TikTok, safety isn’t a nice-to-have or an afterthought; it’s central to all the app’s work, and the team strives to be inclusive and thoughtful when developing policies. TikTok’s guidelines apply to everyone and to all content on the platform.
These guidelines broadly cover 10 categories of content, and the latest update adds more specifics to each area based on behaviour seen on the platform, feedback heard from members of the TikTok community, and input from academics and civil society organisations.
While much of this content was covered by TikTok’s previous guidelines, today TikTok is highlighting some of the key areas that it has strengthened to better support the well-being of its community.
TikTok wants its community to feel comfortable and confident when expressing themselves exactly as they are. The updated guidelines incorporate feedback and language used by mental health experts to improve the policies on self-harm and suicide and avoid normalising self-injury behaviours. TikTok’s policy on eating disorder content has additional considerations to prohibit normalising or glorifying dangerous weight-loss behaviours.
TikTok recognises the burden victims of abuse often face in managing their online presence. TikTok has bolstered its policies on bullying and harassment, and its guidelines are now more explicit about the types of content and behaviours that aren’t welcome on the platform, including doxxing, cyberstalking, and a more extensive policy against sexual harassment.
The safety of everyone in TikTok’s community is of utmost importance, especially the well-being of youth. In line with its dangerous acts policy, TikTok has always taken steps to limit, label, or remove content that depicts dangerous acts or challenges. Now, TikTok has added a harmful activities section to its minor safety policy to reiterate that content promoting dangerous dares, games, and other acts that may jeopardise the safety of youth is not allowed on the platform. TikTok encourages people to be creative and have fun, but not at the expense of an individual’s safety, or the safety of others.
TikTok stands firmly against violence, both online and off. TikTok has updated its previous dangerous individuals and organisations policy to focus more holistically on the issue of violent extremism. TikTok’s guidelines now describe in greater detail what’s considered a threat or incitement to violence and the content and behaviour it prohibits.
As TikTok develops inclusive policies, it continually works to make the platform more accessible for everyone. TikTok recently announced new tools to support people with photosensitive epilepsy, and it is starting to roll out a text-to-speech feature that allows people to convert typed text to voice that plays over text as it appears in a video.
As TikTok navigates challenging subjects like self-harm, compassion for survivors is front of mind. Over the coming weeks, TikTok will roll out updated resources to support people who may be struggling. These resources were created with guidance from leading behavioural psychologists and suicide prevention experts, including Providence, Samaritans of Singapore and members of the US Content Advisory Council. Now, if someone searches for terms like “selfharm” or “hatemyself”, they’ll see evidence-based actions they can take.
TikTok is also introducing opt-in viewing screens on top of videos that some may find graphic or distressing. These types of videos are already ineligible for recommendation into anyone’s For You feed, and this feature aims to further reduce unexpected viewing of such content by offering viewers the choice to skip the video or watch it.
TikTok continues to develop tools to help people manage their app experience, from automatically filtering unwanted comments to the ability to say “not interested” on videos in their For You feed. This is especially important in TikTok’s efforts to support people who want to share their stories and use their voices to raise awareness on topics others may find triggering.
Since the start of the pandemic, TikTok has provided access to public health information from experts in-app and relief for frontline workers and families. TikTok is proud when its community comes together through memorable challenges – like #Bopdaddychallenge and #DontRushChallenge – that connect everyone and bring joy during difficult times. As COVID-19 vaccines are developed and approved, TikTok is furthering its efforts to support the well-being of its community by making authoritative information about vaccines readily available. At the same time, TikTok continues to remove misinformation about the coronavirus and vaccinations.
Over the coming week, the TikTok in-app coronavirus resource hub will be updated with commonly asked questions and answers about COVID-19 vaccines from the World Health Organization and the Centers for Disease Control and Prevention. TikTok’s coronavirus resource hub is accessible from the Discover page, search results, and banners on COVID-19 and vaccine-related videos, which have been viewed over 2 billion times globally over the last six months. TikTok is also partnering with Team Halo so that scientists all over the world can share the progress being made on the vaccine through video updates.
TikTok works to educate and empower its community on the policies through in-app videos, notifications, and safety tools. This month, when users open TikTok, they’ll be prompted to review these refreshed guidelines.
Keeping the community safe is a commitment with no finish line. TikTok recognises the responsibility it has to its community to be nimble in its detection and response when new kinds of content and behaviours emerge. To that end, TikTok keeps advancing its policies, developing technology to automatically detect violative content, building features that help people manage their TikTok presence and content choices, and empowering its community to help foster a trustworthy environment. Ultimately, TikTok hopes these updates enable people to have a meaningful and positive TikTok experience.