TikTok has updated its community guidelines to promote the safety, security, and well-being of users of the popular Chinese-owned video-sharing app. The changes address concerns raised by U.S. senators during their inquiries into TikTok’s business practices, including the prevalence of eating disorder content and dangerous hoaxes on the app.
TikTok’s updated community guidelines
TikTok is taking a stricter approach to dangerous acts and challenges, including suicide hoaxes. A dedicated section of the community guidelines will highlight the issue so it is easier for users to find. TikTok is also launching videos that encourage people to stop, think, decide, and act whenever they encounter risky online challenges.
TikTok says it is developing a system to identify and restrict certain types of content from being accessed by teens. The company has also laid out a roadmap to address serious issues such as minor safety, misogyny, and harassment of LGBTQ individuals. It is explicitly banning misgendering and deadnaming (referring to a transgender person by their former name without that individual’s consent).
TikTok removed 91 million videos
In September, TikTok reported that around one billion people were using the app each month. It said that 91 million videos that violated the community guidelines were removed during the third quarter of 2021, and that 88% of them were caught before being seen by viewers. TikTok and other social media platforms have faced criticism for years for exposing teens and younger audiences to harmful content.
TikTok said it is expanding the system that detects and removes videos containing adult nudity, illegal activities, or anything that endangers the safety of minors. WakeMed psychiatrist Dr. Nerissa Price said, “Social-media companies have a responsibility to try to ensure the safety of their users, especially our young people that are much more susceptible. But, my caution would be that parents still need to be vigilant.”
Source: TechCrunch