Content moderation in challenging times

Presented by Two Hat Security


Player behavior can make or break your game, and fostering positive behaviors within a community of players is an essential part of growing your brand and retaining online gamers. To do this, game developers need to step back and critically examine the root causes of negative behavior.

This means developers have to make intentional design decisions: providing a safe and welcoming environment, reminding players of community standards, using personalization, and amplifying the social bonding that in-game interactions create.

And with millions of Americans in the middle of the COVID-19 pandemic, the goal of encouraging these positive behaviors in online gamers is more important than ever. School and workplace closures, as well as strict containment measures, mean more and more people are relying on digital technology for entertainment, information, and connection, and video games have become the perfect outlet for many.

Social games and virtual worlds bridge the gap by providing the experiences and interactivity that people around the world are currently craving. So how can game developers continue to encourage these positive behaviors as traffic continues to increase? This influx of activity should not be allowed to threaten positive gaming experiences for players, and it is up to game developers to keep sharpening their moderation practices.

Chat volumes are up significantly

Between January 3 and April 7, 2020, chat across cross-platform games, mobile games, children’s platforms, teen social networks, and virtual worlds increased dramatically week over week. In fact, some Two Hat customers recorded 40%, 100%, and even 3,000% increases in chat when comparing March to February.

During these times, there may also be an increase in negativity, bullying, and even child sexual abuse material (CSAM) or grooming. You can imagine the friction and strain this puts on a moderation team. If a small team is responsible for moderation, its workload doubles or triples almost overnight.

Moderation techniques are necessary to manage the increase in volumes

To manage these increased volumes of content, game developers face a number of challenges. Human moderation teams can only handle so much and can easily miss negative content on their gaming sites or platforms. The following techniques will help your team better manage workloads, reduce the amount of manual work required, and prioritize negative content.

Reduce manual moderation

First and foremost, it is important to reduce your reliance on manual moderation. Developers can do this by surfacing community guidelines as part of the experience each time a user logs in: provide a single mandatory button the user must click to accept the guidelines before chatting in the community. You can also implement warning messages whenever the system detects that a user is trying to post content that violates your community guidelines (such as harassment or hate speech). Finally, reminding users that submitting false reports can itself lead to sanctions will reduce the number of false claims your team must investigate.
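As a rough sketch of the warning-message idea (the classifier, categories, and message text below are illustrative assumptions, not Two Hat’s API), a pre-send check might look like this:

```python
# Hypothetical pre-send warning flow. The classifier, categories, and
# word list are placeholders for illustration only.

BLOCKED_CATEGORIES = {"harassment", "hate_speech"}

def classify(message: str) -> set:
    """Stand-in for a real chat classifier; returns the categories a message violates."""
    categories = set()
    lowered = message.lower()
    if any(term in lowered for term in ("idiot", "loser")):  # toy word list
        categories.add("harassment")
    return categories

def handle_outgoing_chat(user_id: str, message: str) -> str:
    """Warn the user before the message posts, instead of sending it to a moderator."""
    violations = classify(message) & BLOCKED_CATEGORIES
    if violations:
        return ("Warning: this message appears to violate our community "
                f"guidelines ({', '.join(sorted(violations))}). It was not sent.")
    return "sent"
```

Because the warning fires before the message is posted, fewer violations ever reach the manual moderation queue.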

Sensitive content must be escalated

During this crisis, users are going through a wide range of difficult life experiences. In many cases, users may feel the need to express themselves and their feelings on your platform, and it is essential to strike a balance between safety and expression. Watch for threats of self-harm or other harms surfacing in the community, and route them to an escalation queue for timely review.
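As an illustration (the category names and priority ordering below are assumptions, not a prescribed policy), escalation can be modeled as a priority queue so the most sensitive content is reviewed first:

```python
# Hypothetical escalation queue: lower numbers are reviewed first.
# Categories and priorities are illustrative assumptions.
from queue import PriorityQueue

escalations = PriorityQueue()

PRIORITY = {
    "self_harm": 0,       # reviewed immediately
    "csam_grooming": 0,   # reviewed immediately
    "harassment": 1,
    "profanity": 2,
}

def escalate(message_id: str, category: str) -> None:
    """Push a flagged message onto the review queue, most urgent first."""
    escalations.put((PRIORITY.get(category, 3), message_id, category))

escalate("msg-42", "harassment")
escalate("msg-43", "self_harm")
print(escalations.get())  # (0, 'msg-43', 'self_harm') comes out before msg-42
```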

Filter settings

Filter settings should also be reviewed, particularly where pre-moderation is used. Some game companies review a great deal of user-generated content before it goes live. However, in difficult times like these, your team may not have the capacity to review that much content manually, so be sure to prioritize these filters and ask whether some content can instead be reviewed after it is published (post-moderation) to distribute the workload more evenly.
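As a simple sketch of that split (the content types and assignments below are illustrative assumptions, not a recommended policy), the decision can be captured in a small configuration:

```python
# Hypothetical split between pre-moderation and post-moderation.
# Content types and their assignments are illustrative assumptions.
MODERATION_POLICY = {
    "usernames": "pre",            # highest-risk, low-volume: review before it goes live
    "reported_chat": "pre",        # user reports get a human look before action
    "guild_descriptions": "post",  # lower-risk: publish first, review afterward
    "in_game_chat": "post",        # high-volume: rely on filters, then spot-check
}

def requires_pre_moderation(content_type: str) -> bool:
    """Unknown content types default to pre-moderation, the safer option."""
    return MODERATION_POLICY.get(content_type, "pre") == "pre"
```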

Implement effective sanctions

Finally, once you have reduced manual moderation through proactive filters and created escalation queues for content that requires timely review, you can apply effective sanctions to establish clear consequences for repeated negative behaviors. Make sure sanctions are applied quickly and follow a clear progression of escalating consequences.
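As a rough illustration (the specific steps and durations below are assumptions, not a prescribed ladder), a progressive sanctions flow might look like this:

```python
# Hypothetical progressive sanctions ladder. Steps and durations are
# illustrative assumptions only.
SANCTION_LADDER = [
    "warning",            # first offense: restate the community guidelines
    "24_hour_chat_mute",  # repeated offense: temporary loss of chat
    "7_day_suspension",   # continued abuse: temporary loss of the account
    "permanent_ban",      # final step: removal from the community
]

def next_sanction(prior_offenses: int) -> str:
    """Return the sanction for a user's next confirmed violation."""
    index = min(prior_offenses, len(SANCTION_LADDER) - 1)
    return SANCTION_LADDER[index]

# A player with two prior confirmed violations gets a 7-day suspension.
assert next_sanction(2) == "7_day_suspension"
```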

Without consequences, users can continue to abuse both the system and other players. Don’t give users unlimited opportunities to break your community rules.

While it is important to stay connected in these uncertain times, it is essential that moderation standards are in place to ensure positive gaming experiences for your users. As the world begins to recover from this pandemic and the gaming industry continues to grow, people will continue to seek new ways to interact with the changing world around them. Someday we will settle into a new normal, and this pandemic will shape the standards for years to come. But in the meantime, it is our responsibility to protect our online gaming communities.

For more information, download the full Two Hat eBook, Content Moderation in Challenging Times.

Carlos Figueiredo is Director, Community Trust & Safety at Two Hat Security.


Sponsored articles are content produced by a company that pays for publication or has a business relationship with VentureBeat, and they are always clearly marked. The content produced by our editorial team is in no way influenced by advertisers or sponsors. For more information, contact sales@venturebeat.com.
