Our own Nee Nguyen, Global Head of Community Management, recently joined a panel called “Dealing with Toxicity: Creating the Right Environment for Players” at the United By Games event in London on October 5. While there is no one simple solution, Nee and her panel colleagues outlined their methods for combatting toxicity and their ongoing efforts to reward positivity.
The first step toward defeating toxicity is to head it off before it becomes a problem. Preparation before launch is everything in ensuring players are protected and supported. It’s vital to delineate what two-way discourse looks like between the developers and the community team: How will the community team report feedback? How does that feed into the dev team’s work? Decide early on what toxicity means for the client, and manage expectations with both the community and dev teams. Lock in the times when activity is expected, and be transparent about processes, externally as well as internally, where possible.
When it comes to fighting active toxicity, it’s important to know which tools are available and how to use them: tools like anti-cheat software and, perhaps even more importantly, identity verification. A lot of toxicity stems from anonymity and the sense of security it can give. When people opt in and verify, that anonymity is removed and toxicity can be reduced. And remember, there’s an opportunity here to apply machine learning to the identity attacks that do occur.
In the reporting tools, set triggers based on a baseline number of issues and bugs to assess. Be clear with players about what’s being fixed and what isn’t, to help manage expectations and ease frustration. Always keep in mind that while the general goal is to protect the game-playing community, it’s the moderation teams on the front lines who absorb the toxicity even as they fight to fix it. Support these teams the same way the larger community is supported: with care and listening.
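As a rough illustration of the trigger idea, a report queue might escalate an issue once its open reports cross a baseline count. This is a minimal sketch; the function names, tags, and threshold here are illustrative assumptions, not any specific tool’s API:

```python
from collections import Counter

# Hypothetical baseline: how many open reports an issue needs before
# it is escalated to the team for assessment.
BASELINE = 25

def issues_to_escalate(open_reports: list[str], baseline: int = BASELINE) -> list[str]:
    """Return the issue tags whose open-report count meets the baseline."""
    counts = Counter(open_reports)
    return [tag for tag, n in counts.items() if n >= baseline]
```

In practice a team would tune the baseline per issue type (a crash bug and a chat-abuse report likely warrant different thresholds), but the principle is the same: decide the trigger levels up front rather than reacting ad hoc.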
An interesting takeaway from the panel is the relationship between toxicity and game sales. If a game attracts toxicity and the support team is seen as unable to counteract it, this can hurt future games by the same team. Because negative sentiment can hurt a publisher’s sales across the board, it’s important for studios to take toxicity seriously; it directly affects the player base. Even if a future game is in a different genre, the company’s reputation can carry over between projects.
As for which kinds of games produce toxicity, the panel noted that PvP environments tend to get more toxic. Online head-to-head play can breed a level of competitiveness and aggression that frequently becomes personal. MMOs also seem to attract bad behavior, though some, like EVE Online or World of Warcraft, do try to foster collaboration. Even some single-player games have histories that induce an element of mistrust that can sour into outright toxicity. But it must be acknowledged that any game with an element of communication and direct player-to-player engagement has the potential for both good and bad behavior.
So, the question arises: if it isn’t enough to punish bad behavior, how do we encourage good behavior?
Promoting positive play can be difficult, since it often depends on player reporting; it’s much easier to track down and identify poor behavior. Who are the good actors, and how do we promote them as role models?
The key is to devise systems and initiatives that not only reward positive play but also hold up praiseworthy players as examples to the community. This can be a subtle way of reforming toxic players: simply showing them the benefits of positivity and good behavior. If the in-game mechanics don’t allow for this kind of promotion, then it has to happen outside of the game, in forums, streams, community sites, events, and so on.
Developing community evangelist roles can also be a way to champion players who deserve notice. Granting them moderation powers in forums is another method that is highly visible to the player base while taking some of the pressure off the official moderation team. Transparency is the key to modeling the power of positive play.
Championing content from the community is another great way to support positive players. Talk to them directly over official channels for visibility. This also puts a face behind the brand, which humanizes the company and makes players feel included. If possible, facilitate face-to-face or conversation time between lead devs and some of the MVPs or VIPs in your community. A little personal attention can really energize a community and make players feel part of the game, which in turn creates positive sentiment and reduces negativity. For a more practical approach, develop a trust score and attach rewards to good behavior, like free premium memberships.
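The trust-score idea above can be sketched very simply. Everything here is an assumption for illustration: the field names, the weights, and the reward tiers are hypothetical, not a description of any real studio’s system:

```python
from dataclasses import dataclass

@dataclass
class PlayerRecord:
    commendations: int      # positive reports from other players (assumed signal)
    upheld_reports: int     # toxicity reports confirmed by moderators
    matches_played: int
    identity_verified: bool # opted in to identity verification

def trust_score(p: PlayerRecord) -> float:
    """Score in [0, 100]; higher means more trusted. Weights are illustrative."""
    if p.matches_played == 0:
        return 50.0  # neutral default for brand-new players
    positive = p.commendations / p.matches_played
    negative = p.upheld_reports / p.matches_played
    score = 50.0 + 40.0 * positive - 60.0 * negative
    if p.identity_verified:
        score += 10.0  # verification removes anonymity, so weight it positively
    return max(0.0, min(100.0, score))

def reward_tier(score: float) -> str:
    """Attach rewards to good behavior, e.g. a free premium membership."""
    if score >= 80.0:
        return "premium"
    if score >= 50.0:
        return "standard"
    return "restricted"
```

The design choice worth noting is that confirmed bad behavior is weighted more heavily than commendations, so a toxic player cannot simply farm positive reports to offset upheld ones.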
It’s not enough to simply ban players or take things away to punish bad behavior. To truly combat toxicity, good behavior must also be rewarded and communication needs to be clear and consistent. These are necessary to develop the kind of inclusive and welcoming community that persists over the course of a game’s life, and even across multiple titles by the same studio or publisher. It begins with the proper planning to develop systems and tools of moderation, and continues with an alert, aware and compassionate community team providing the highest value player support.
Photos by Aaron Lee