Experienced game developers have a wealth of knowledge when it comes to crafting engaging gameplay, immersive worlds, and captivating narratives. As online communities become an integral part of the gaming experience, the management and protection of those communities must be handled with the same level of sophistication.
- Investing in trust and safety (T&S) measures leads to long-term increases in player and community wellness, engagement, retention, and even monetization. Integrating T&S principles into the game’s design from the outset is key.
- Enacting T&S means communicating, collaborating, and educating across teams, including product, engineering, UX, marketing, and the C-suite. This alignment is crucial to the success of safety across the platform.
- Effective T&S involves a multi-faceted approach, including risk assessment, detection and enforcement, prevention, clear policies, and player education.
- The gaming world faces a wide array of risks, from player harassment to game integrity issues, each requiring tailored strategies for mitigation.
- T&S is complex, but challenges like scale, evolving behaviors, and resource limitations can be overcome with a combination of technology, human moderation, and a proactive approach.
- Partnering with experts, educating players about online safety, and fostering a positive community culture are all vital components of a successful T&S strategy.
Understanding the fundamentals of trust and safety
In gaming, trust and safety (T&S) refers to the proactive measures taken to protect and empower players in order to ensure a secure environment where everyone can enjoy the game, its community, and related services. This includes minimizing risks, such as harassment, fraud, and other harmful behaviors, and ultimately promoting a healthy and enjoyable user experience.
Thriving online communities are built on a foundation of trust, where shared values and expectations foster a sense of belonging and open engagement. Safety measures, crucial for protecting against harmful behaviors and content, reinforce this trust. Ultimately, this creates an environment where players feel valued, respected, and empowered. In short, investing in T&S initiatives is not merely about risk mitigation — it’s an investment in the community itself.
A robust T&S framework and strategy can:
- Reduce player and community harm — Proactive T&S measures can help prevent and address harmful behaviors, leading to a safer and more resilient community.
- Increase engagement and retention — Studies across genres consistently find that players engage more when they feel safe and supported.
- Support a healthy brand — An explicit commitment to player well-being can enhance both a game’s and a studio’s reputation, which in turn attracts new players and increases the diversity of the playerbase.
- Improve monetization — Multiple companies have found through internal research that fostering a sense of community and belonging among players directly correlates with increased player investment in their games.
Typically, there are five core components to T&S work:
- Policy and regulation — Establishing clear guidelines around what sort of behaviors you want within your community, and then communicating those principles internally and externally.
- Risk assessment — Identifying and understanding the specific risks your game and community face.
- Prevention and shielding — Taking proactive measures to prevent harm and build community resilience.
- Detection and enforcement — Implementing systems and processes to identify and address harmful behavior.
- Education and advocacy — Actively raising awareness and fostering understanding of T&S issues across all stakeholders, including players, staff, and regulators, while championing the importance of this work.
An overview of risks in gaming
In gaming, T&S teams and specialists focus on mitigating a wide range of risks, aiming to protect both the players and the game itself. These risks can be categorized into several key areas:
- Player safety and wellness, including:
- Disruptive behavior
- Harassment and bullying
- Harmful content
- Account security
- Cheating and exploits
- Community health, including:
- Misinformation and disinformation
- Radicalization, extremism, and dangerous speech
- Child exploitation and harm
- Legal and regulatory compliance, including:
- Data privacy
- Consumer protection
- Age-appropriate content
- Age verification
- Game integrity, including:
- Fraud and scams
- Game economy disruption
- Intellectual property infringement
- Operational risks, including:
- DDoS (distributed denial of service) attacks
- Technical vulnerabilities
- Content moderation backlogs

Elements of a healthy community
T&S teams and specialists also focus on creating, enhancing, and preserving a number of community elements in order to give players the best possible experience:
- Opportunities for player connection, including:
- Communications options that encourage supportive and friendly behavior.
- Opportunities for creative expression (e.g., user-generated content) and ways for like-minded players to appreciate each other’s creativity.
- Community structure, including:
- Clear guidance around what norms and behaviors can be expected.
- If necessary, account management to ensure only appropriate individuals (e.g., age gating) are able to participate in higher-risk activities.
- Responsive reporting options for concerned players to raise issues and know they are being heard.
- Respect for player autonomy and rights, including:
- Data privacy
- Parental controls
- Fair and balanced competitive experience, including:
- Gameplay balance
- Detection and prevention of cheating or disruptive play
- Sensibly managed game economy
- Skill-weighted matchmaking and team-building
- Reliability, including:
- Robust and secure architecture to ensure consistent uptime
- Crossplay and feature consistency
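The reporting element above can be sketched as a minimal submission-and-acknowledgment flow. This is an illustrative sketch only: the `Report` structure, the category names, and the acknowledgment message are assumptions, not a prescribed API, and a real taxonomy should mirror your own policies.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum, auto

class ReportCategory(Enum):
    # Hypothetical categories; align these with your community guidelines.
    HARASSMENT = auto()
    CHEATING = auto()
    HARMFUL_CONTENT = auto()
    OTHER = auto()

@dataclass
class Report:
    reporter_id: str
    target_id: str
    category: ReportCategory
    details: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ReportQueue:
    """Collects player reports and immediately acknowledges receipt,
    so concerned players know they are being heard."""

    def __init__(self):
        self.pending: list[Report] = []

    def submit(self, report: Report) -> str:
        self.pending.append(report)
        # Closing the loop with the reporter is part of a responsive system.
        return (f"Thanks - your {report.category.name.lower()} report "
                f"was received and will be reviewed.")

queue = ReportQueue()
msg = queue.submit(Report("player_1", "player_2",
                          ReportCategory.HARASSMENT, "slurs in voice chat"))
```

The key design point is the acknowledgment: even before any moderation decision, the reporter receives confirmation that their concern entered the queue.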
What makes T&S hard?
Creating a safe and trustworthy online environment is a complex undertaking. Developers face a multitude of challenges that can make it difficult to determine the most effective strategies, and concerns about the return on investment (ROI) of trust and safety initiatives can often deter necessary spending. In particular, developers face:
Scale and complexity challenges
- Large player bases — The volume of interactions and the diversity of player wants and needs makes it difficult to monitor everything and respond to harmful behavior effectively.
- Lack of social consequences — The perceived lack of social consequences in online environments can embolden individuals to engage in harmful behavior they might avoid in face-to-face interactions.
- Anonymity — While not typically a root cause, anonymity can amplify negative behavior due to the decreased fear of repercussions or identification.
- Evolving behavior — The constant evolution of online trends, exploits, and forms of harassment demands ongoing vigilance and adaptability in trust and safety measures.
Knowledge and understanding gaps
- Lack of understanding of root causes — The underlying reasons for harmful behavior are complex and not always clear, making it difficult to develop effective solutions.
- Balancing freedom and safety — Balancing the freedom of self-expression with the need to protect players from harm is a nuanced challenge. For example, a term used to reclaim personal identity can be difficult to distinguish from the same term used as a harmful slur.
- Bias to wait-and-see — It’s a common mistake to delay investing in T&S until a community is established, profitable, or facing visible problems. A proactive approach throughout development, however, creates a scalable foundation for resilient communities that are better equipped to withstand the inevitable challenges that arise in any online space.
Resource and implementation challenges
- Limited resources — Smaller studios, often constrained by limited budgets and staff, may mistakenly believe that comprehensive T&S initiatives are only feasible for larger companies, as tailored resources are scarce. This can result in a lack of prioritization and implementation of these critical measures.
- Low buy-in — Without a comprehensive T&S strategy integrated early in development, teams may struggle to balance competing priorities pre-launch. Delaying or underfunding T&S investments can lead to superficial solutions or entrenched harmful norms in live communities, whereas establishing standards from the outset can mitigate these issues.
- Technical limitations — Game codebases may not always have the necessary tools or features to effectively address T&S concerns, especially if not considered as part of early planning.
Business and community challenges
- Fear of backlash — Concerns about negative community backlash and potential revenue loss can deter companies from implementing stronger T&S measures. This hesitancy is further compounded by a reluctance to openly share their experiences and lessons learned with other studios.
- Entrenched norms — Transforming the culture of established communities where harmful behaviors have taken root requires a multi-faceted approach, including persistent education to reshape norms (including developer norms) and consistent enforcement to deter negative actions.
- T&S as cost center — While seemingly practical, structuring T&S as a cost center can isolate the team, hindering collaboration and input on critical decisions. This siloed approach risks increased costs and missed opportunities to proactively address issues, ultimately impacting your bottom line.
Effective trust and safety in gaming
Effective T&S in gaming requires a multi-faceted strategy. Accurate detection of harmful behavior, achieved through a combination of advanced technology like AI and human moderation, is paramount. Enforcement should go beyond punishment, however, and focus on fostering positive change, such as implementing restorative justice practices to reduce repeat offenses and cultivate a healthier community.
Prevention is equally vital. Proactive measures, like comprehensive community guidelines, accessible reporting tools, proactive moderation, and nurturing a healthy and resilient community by design, create an environment less conducive to harmful behavior.
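The combination of automated detection and human moderation described above can be sketched as a simple routing rule: high-confidence classifier scores trigger automatic action, ambiguous scores escalate to human review, and low scores pass through. The thresholds and the `score` input here are illustrative assumptions; in practice a production classifier and your enforcement policy would set these values deliberately.

```python
def route_moderation(score: float,
                     auto_threshold: float = 0.95,
                     review_threshold: float = 0.6) -> str:
    """Route a message based on a classifier's harm score in [0, 1].

    Automated action is reserved for high-confidence cases; ambiguous
    cases go to human moderators rather than being auto-punished.
    """
    if score >= auto_threshold:
        return "auto_action"    # e.g., remove content, notify the player
    if score >= review_threshold:
        return "human_review"   # queue for a moderator's judgment
    return "allow"

# Routing examples:
assert route_moderation(0.98) == "auto_action"
assert route_moderation(0.70) == "human_review"
assert route_moderation(0.10) == "allow"
```

Keeping the ambiguous middle band for human review reflects the point above: technology handles volume, while people handle nuance.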
Underpinning these efforts are clear, adaptable policies that strike a balance between player autonomy and safety while adhering to internal company values and external regulations like data protection and age restrictions.
Understanding the root causes of harmful behavior is key to developing effective interventions and preventative measures. This involves recognizing potential biases in player feedback and moving beyond simplistic solutions to address underlying issues. A holistic approach, such as the ECE (Environment, Consequences, and Encouragement) Framework, offers a comprehensive strategy that addresses the key aspects of creating a positive and safe gaming environment.
Ultimately, each T&S decision, including content moderation, reflects a company’s values and shapes the platform’s atmosphere and the types of users it attracts. By studying the underlying issues, tailoring our efforts, and focusing on long-term, sustainable change, we can foster thriving online communities where every player feels safe, respected, and empowered.
What does good look like?
Assessing your T&S efforts is a multi-faceted endeavor that necessitates a comprehensive strategy and assessment framework. Many of the success and warning signs mirror those for trust. Beyond those, here are further examples of what you might see:
| | Success signs | Warning signs |
| --- | --- | --- |
| Individual | Awareness of impact on others; greater resilience; high satisfaction; use of T&S tools like block or mute | Exploiting loopholes and vulnerabilities; low resilience; voicing concerns about safety and security |
| Group | Upholding a healthy social contract; self-moderation; supporting victims of harassment; showing initiative in promoting and protecting openness | Easily disrupted by problematic players; low group self-management; high individuality among groups; insular groups (welcoming only certain identities or ideologies) |
| Community | Low rates of harassment and disruption; active engagement; positive community reputation; positive studio reputation; positive general sentiment; low recidivism; high creative output; high representation from diverse identity groups* | High proportion of vulnerable players; high incidence of exploitation; low community self-management; high presence of insular groups; signs that malicious actors favor your game for their activities (e.g., extremist groups, child groomers); low diversity; reports of an unsafe environment for certain identity groups |
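Some community-level signs in the table, such as recidivism, can be tracked as simple rates over moderation logs. The log format below is a hypothetical sketch for illustration, not a standard schema; real measurement should follow your own data model and the measurement guidance linked below.

```python
from collections import Counter

def recidivism_rate(actions: list[dict]) -> float:
    """Fraction of sanctioned players who were sanctioned more than once.

    `actions` is a hypothetical moderation log: one dict per enforcement
    action, each carrying a 'player_id' key.
    """
    counts = Counter(a["player_id"] for a in actions)
    if not counts:
        return 0.0
    repeat_offenders = sum(1 for c in counts.values() if c > 1)
    return repeat_offenders / len(counts)

log = [
    {"player_id": "a"}, {"player_id": "a"},  # repeat offender
    {"player_id": "b"},                      # one-time offender
]
rate = recidivism_rate(log)  # 1 of 2 sanctioned players reoffended -> 0.5
```

A falling rate over time suggests enforcement is producing lasting behavior change rather than just punishing the same players repeatedly.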
For a more in-depth look at measurement, see Introduction to Measuring Community Health.
A rising tide
Greater industry transparency, both publicly and internally, is crucial for addressing trust-and-safety challenges in gaming. The goal is not to single out specific developers or games, but to encourage collective accountability within the industry and society at large. By openly sharing our progress and lessons learned, we empower all developers to tackle these issues and normalize the expectation of healthy online communities.
Increased awareness can also help dispel misconceptions about gaming itself. These problems are not inherent to gaming, but rather they reflect broader societal issues that manifest in online spaces where people gather. While bad actors can exploit the strong social connections fostered in gaming, these bonds themselves are not inherently negative. Therefore, we need to reimagine gaming as a form of social infrastructure, working collaboratively to develop better practices and digital tools that support thriving online communities and protect the vulnerable.
Gaming companies, with their understanding of play and the power to design intentional spaces, are uniquely positioned to lead this charge and drive meaningful change. By prioritizing trust and safety, we not only create better experiences for players and society, but also ensure the long-term viability and success of our businesses.
Acknowledgments
With special thanks to Mike Pappas (CEO / Co-Founder, Modulate), Matt Soeth (Executive Director, TIGG), and Laura Higgins (Senior Director of Community Safety & Civility, Roblox) for their invaluable contributions to this article.
Now what?
For a list of key organizations supporting T&S in gaming see Organizations Supporting Trust and Safety in Gaming.
Also check out Ben Whitelaw’s Everything in Moderation.
See all articles on Trust and Safety.
References
- Cushing, D. F. & Miller, E. (2020). Creating Great Places: Evidence-based Urban Design for Health and Wellbeing.
- Fair Play Alliance / Thriving in Games Group and Anti-Defamation League. (2020). Disruption and Harms in Online Gaming Framework.
- Greitens, E. (2015). Resilience: Hard-Won Wisdom for Living a Better Life.
- Hunsberger, A. (2024). Trust & Safety is how platforms put values into action. Everything in Moderation.
- Kilmer, E. & Kowert, R. (2023). Radicalization in games: A primer. Take This.
- Suler, J. (2004). The online disinhibition effect.
- Trust and Safety Professionals Association. (Accessed June 2024). Introduction to Trust & Safety.
Further reading
- ADL Center for Technology and Society. (2024). Addressing extremism in online games through platform policies.
- Ariely, D. (2008). Predictably Irrational.
- Bilewicz, M. et al. (2020). Artificial intelligence against hate: Intervention reducing verbal aggression in the social network environment.
- Borgatti, S., Everett, M., & Johnson, J. (2013). Analyzing Social Networks.
- Bregman, R. (2019). Humankind: A Hopeful History.
- Brown, D. J. et al. (2019). Human thriving: A conceptual debate and literature review.
- Coyle, D. (2018). The Culture Code: The Secrets of Highly Successful Groups.
- Giuliani, M. V. (2003). Theory of attachment and place attachment.
- Kilmer, E. (2023). Combating extremism in video games. Take This.
- Phillips, W. (2016). This Is Why We Can’t Have Nice Things: Mapping the Relationship Between Online Trolling and Mainstream Culture.