TikTok, the short-form video platform with a mission to inspire creativity and bring joy, is reaffirming its commitment to keeping Cambodian TikTok users safe online. Measures range from constant updates and analysis to safety procedures and guidelines.
An essential role of the TikTok team is to identify and analyse content to continuously improve the accuracy of the content moderation processes.
This applies to Cambodian content just as it does to content from around the world.
For example, if a video is becoming popular, it may be reviewed again by TikTok’s systems to reduce the potential that violating content remains on TikTok.
TikTok also analyses content moderation decisions to understand why violating content may not have been caught at an earlier stage, and to identify trends of violating content on the platform.
For example, TikTok might learn that more work is needed to develop technology to automatically detect certain types of potential violations.
On other occasions, TikTok might arrange additional targeted training sessions for its moderation teams to build a better understanding of certain policies and their nuances, with the aim of improving decision-making during the review process.
“At TikTok, thousands of people are focused on helping to make our platform safe for our community to explore entertaining content and share their creativity. The Trust and Safety team at TikTok is focused on carrying out a variety of tasks to protect our community.
“As we continue on our safety journey, we want to be open and transparent along the way, and we’ll talk through some of this work here,” said Kyu Kyu Thein, public policy manager, Cambodia at TikTok.
As TikTok explains in its latest Transparency Report, 87.5 per cent of violating videos were removed before they received a single view, and TikTok is committed to further improving its effectiveness in this area.
While technology can help to remove clear-cut violations, an important part of content moderation involves human review.
No matter the time of day, if content is reported, TikTok’s teams are on standby to take action.
Through this additional layer of human review, TikTok can also improve the platform’s machine learning systems as moderators provide feedback to the technology, helping to capture emerging content trends and improve future detection capabilities.
Product and Process teams are focused on designing strategies and techniques to more efficiently detect potential harms and enforce TikTok Community Guidelines at scale.
TikTok uses ‘hashing’ technology to create a unique digital identifier of an image or video. In line with industry standards, this enables TikTok to mark a known harmful piece of content as violating and more easily remove it at scale.
For example, if content is removed for breaking TikTok’s policies that protect against the sharing of child sexual exploitation images, the unique identifier would help to find and remove matching content, and prevent repeated uploads.
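The hash-matching approach described above can be illustrated with a simplified sketch. TikTok's actual systems are not public, and industry tools typically use perceptual hashes that survive re-encoding; this hypothetical example uses an exact cryptographic hash (SHA-256) purely to show the principle of fingerprinting known violating content and blocking repeat uploads.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Create a unique digital identifier (fingerprint) for a piece of content."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical store of fingerprints of content already removed as violating.
known_violating_hashes: set[str] = set()

def mark_as_violating(data: bytes) -> None:
    """Record the fingerprint of removed content so matches can be caught later."""
    known_violating_hashes.add(content_hash(data))

def is_known_violating(data: bytes) -> bool:
    """Check an upload against the store to prevent repeated uploads."""
    return content_hash(data) in known_violating_hashes

# Once a removed video's bytes are fingerprinted, identical re-uploads match.
mark_as_violating(b"example-violating-video-bytes")
print(is_known_violating(b"example-violating-video-bytes"))  # True
print(is_known_violating(b"some-other-video-bytes"))         # False
```

Note that a cryptographic hash only matches byte-identical files; production hash-matching systems instead use perceptual hashing so that resized or re-compressed copies of the same image or video still produce a match.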
TikTok Community Guidelines
TikTok Community Guidelines define a set of norms and common code of conduct for TikTok – they provide guidance on what is and is not allowed to help maintain a welcoming space.
TikTok’s policy experts are responsible for constantly assessing these guidelines to consider how TikTok can enable creative expression while protecting against potential harms.
These experts are often subject matter specialists – some have expertise working in the technology industry, while others may join the team from civil society or government.
Their roles involve grappling with complex and challenging areas – for example, where should the line be drawn on content related to eating disorders?
“From our work with outside experts, we know that eating disorder-related content can be damaging, but crucially, content that focuses on recovery can have a positive impact, and as we refine our policies it’s important that we continue to reflect nuances such as this,” said Kyu Kyu.
Focus on employee wellbeing
Building and maintaining a safe experience for the TikTok Community is the TikTok team’s most important role.
At times, this means moderators may be required to review potentially harmful content, which makes providing the right support essential.
TikTok recognises this and is focused on prioritising the health, safety, and wellbeing of the TikTok team too.
TikTok provides the teams with access to wellbeing programmes, including training, evidence-based resources and professional counselling.
In addition, TikTok conducts regular analysis to understand how the platform can continue to improve, and TikTok hopes to lead the industry by providing the most ambitious and effective support structures for TikTok’s team members.
“As we continue on our journey to help make TikTok a safe place where joy and creativity can thrive, we’re looking forward to sharing more about the work of our Trust and Safety teams,” said Kyu Kyu.