
Combating Online Child Sexual Exploitation and Abuse in Gaming

For children to truly thrive in online spaces, they need to feel safe and be safe. Unfortunately, while online child exploitation and abuse is illegal, it remains a pervasive threat. The Tech Coalition is an alliance of global tech companies working together to combat child sexual exploitation and abuse online.

This resource draws on the experience of Tech Coalition members. In joining the Tech Coalition, member companies have demonstrated their strong commitment to combating OCSEA and to being accountable for those efforts.

Disclaimer: We understand all companies are different and may take different approaches. These recommendations are intended to help facilitate internal, cross-functional conversations and are not intended to define a standard of care; you may wish to add to or subtract from them as needed. This resource was not developed to provide legal guidance or regulatory compliance: we did not overlay or compare this information with current or draft legislation. For the purpose of this content, child or minor refers to someone who is under 18 years of age.

What Is Online Child Sexual Exploitation and Abuse (OCSEA)? 

OCSEA refers to the use of the internet or communication technologies to facilitate the sexual abuse of children and adolescents.

While there is no universally agreed-upon legal definition for child sexual abuse material (CSAM), the term generally refers to sexually explicit imagery involving children (commonly known as child pornography). It is important to note that laws regarding CSAM can vary across jurisdictions, so be sure to consult with your legal teams for guidance on regulation and compliance. 

It is a misconception that children must be users of a website or app for an OCSEA incident to occur there; many OCSEA harms, such as the exchange of CSAM, occur between adults. In addition, children can commonly evade age assurance tools and tactics, obtaining access to online spaces meant only for adults.

Examples of OCSEA harm types that occur in gaming 

Posting, exchanging, and storing Child Sexual Abuse Material (CSAM)

  • Perpetrators might post CSAM in-game (e.g., in a chat or as a profile photo) or share a URL where CSAM is hosted off-platform. 
  • Perpetrators might exchange CSAM directly with other perpetrators in messages and chat rooms. 
  • Perpetrators might use a game’s cloud or file hosting solution to store CSAM. 
  • Streaming platforms might also see content that violated another gaming platform’s policies uploaded to the streaming platform as a video.

Online grooming and online enticement

Online grooming can take many forms — below are four examples of grooming scenarios that happen on gaming platforms:

  • Traditional grooming — Can be hard to detect because it may initially appear to be a benign friendship. These relationships will change, however, and some identifiers include: the perpetrator asking the potential victim to keep a secret, educating the victim about sex acts, and / or asking the victim to send images of themselves or other minors. Other potential identifiers include:
    • Players lying about their ages (e.g., a player might say they are 35 to one player, and then say they are 16 to another player). 
    • Flattering comments to a child with a sexual undertone, such as “You’re so mature for your age.” 
  • Fishermen tactics — Perpetrators may start by “casting a wide net to see who bites.” For example, a player stating, “Hey I’m 15 looking for a 15-year-old girlfriend,” could be legitimate, but it is also a red flag for an adult pretending to be a teen.
  • Quid pro quo and gift giving — Can take the form of, “I’ll give you something if you give me something,” or “I’ll give you a gift” (e.g., in-game currency, a console, gift cards). This can be a strong indicator of grooming or OCSEA and can also lead to sextortion.
  • Moving off-platform — Asking the child to move off-platform, in particular to a peer-to-peer messaging platform, is another red flag for grooming.

For more information on grooming, please see the Tech Coalition’s Considerations for Detection, Response, and Prevention of Online Grooming.

Sextortion 

Sextortion is a form of child sexual exploitation in which a child is threatened or blackmailed, most often with the threat of publicly sharing nude or sexual images of them, by a person who demands additional sexual content, sexual activity, or money from the child.

  • Financial sextortion — Perpetrators threaten to share sexually explicit photos of the child unless the child pays the perpetrator. Examples of payment methods include: in-game currency, peer-to-peer payment methods, gift cards, crypto, etc.
  • Content sextortion — Perpetrators threaten to share sexually explicit photos of the child unless the child provides more sexually explicit photos. 
  • Extortion from violent extremist groups — Violent online groups groom, then extort, minors into self-harming and producing CSAM. As part of the extortion, they might also require the victim to victimize others. See the FBI’s September 12, 2023 PSA for more information.

Live-streamed child abuse

Live-streamed child abuse allows abusers to create child sexual abuse content in real time. It may involve adult offenders who direct the abuse while the acts are streamed live to an audience, or who coerce children into using livestreaming platforms to produce child sexual abuse material.

Child sex trafficking

Child sex trafficking is a form of child abuse that occurs when a child under 18 is advertised, solicited, or exploited through a commercial sex act. A commercial sex act is any sex act where something of value — such as money, food, drugs, or a place to stay — is given to or received by any person for sexual activity.  

Minor sexualization 

Minor sexualization is the creation or sharing of content (including photos, videos, real-world art, digital content, and verbal depictions) that sexualizes real or non-real children. In gaming, this typically occurs in groups, in profile photos (including avatars), and in shared user-generated content.

Sexual exploitation and assault in virtual reality

Players can interact with each other’s avatars in a way that invades someone else’s virtual space; this can lead to avatar-to-avatar violence, including sexual assault.

Offender community coordination 

Offenders could use features within games, or create servers or entire games, that seek to normalize OCSEA or provide tips on how to groom and exploit a child. This can also take the form of videos of games that promote CSAM and other OCSEA violations.

Establishing operations to identify OCSEA

Create external standards that prohibit OCSEA 

An important practice for fighting OCSEA is to incorporate public-facing language that prohibits this behavior into a company’s external standards (e.g., their Terms of Service, Acceptable Use Policies and / or Community Guidelines). 

For more information, companies can sign up for the Tech Coalition’s Pathways program to check out the External Standards that Prohibit OCSEA resource. 

Create internal child safety guidelines that outline how to identify and take action on OCSEA 

To facilitate identifying and responding to OCSEA, companies can: 

  • Create internal principles to gather definitions and document enforcement guidelines. 
  • Work with legal and compliance teams to ensure policies align with newer regulations, such as the E.U.’s Digital Services Act, the U.K.’s Online Safety Act, and the U.S.’s REPORT Act.
  • If a gaming company has multiple studios or works with third-party game developers, adopting a centralized policy structure, where policies are extended across all studios and developers, can help increase efficiency and support regulatory compliance.

For more information, companies can sign up for the Tech Coalition’s Pathways program to check out the Writing a Child Safety Content Policy resource. 

Design processes and build detection to surface cases 

Establish user reporting specific to OCSEA

  • Reporting mechanisms must comply with local legal requirements and be made available to users of a company’s product(s) to enable reporting or flagging of illegal or harmful content and / or behavior.
  • A common guiding principle is to make user reporting as easily accessible as possible. Ensure user reporting is always available, across all games and during all gameplay.
  • Understand the types of OCSEA harms that will be reported and include specific child safety reporting options (see the sketch after this list).
  • It’s especially important to ensure user reporting is in place when new features or products are launched, as they might be used for abuse.
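
To make the reporting options above concrete, here is a minimal sketch of child-safety-specific report categories and routing, assuming a simple in-house moderation queue; all names are hypothetical rather than drawn from any particular platform:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum


class ReportCategory(Enum):
    """Hypothetical report categories. Child safety options are listed
    explicitly so users never have to file OCSEA reports under a
    generic label such as "other"."""
    HARASSMENT = "harassment"
    CHEATING = "cheating"
    CSAM = "csam"                # illegal imagery
    GROOMING = "grooming"        # predatory behavior toward a minor
    SEXTORTION = "sextortion"
    OTHER = "other"


@dataclass
class UserReport:
    reporter_id: str
    reported_user_id: str
    category: ReportCategory
    game_id: str                 # reporting is available in every game
    context: str                 # e.g., a chat excerpt or content ID
    created_at: datetime


def route_report(report: UserReport) -> str:
    """Send child safety reports to a specialized queue so reviewers
    trained on OCSEA see them first, not the general moderation queue."""
    child_safety = {ReportCategory.CSAM, ReportCategory.GROOMING,
                    ReportCategory.SEXTORTION}
    return "child_safety_queue" if report.category in child_safety else "general_queue"
```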

Note: If a gaming company allows users to report content directly to a channel / server or game moderator, there is a risk that OCSEA harms will never reach the company itself. Ensure these moderators are trained on how to identify and respond to OCSEA content; also ensure that proactive detection is in place to surface cases (see next section).

Establish proactive detection

  • Current technology solutions enable companies to detect OCSEA on their platform. The most common industry solutions in gaming include: cryptographic hash-based (i.e., exact match) detection, perceptual hash-based (i.e., near match) detection, image / video classifiers, text classifiers, keyword-based detection for high-risk grooming terms, and URL blocking (a minimal exact-match sketch follows this list).
  • Implementing detection capabilities may also create additional needs for human review, so plan for moderation team capacity accordingly.
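
As a concrete illustration of the first bullet, here is a minimal sketch of cryptographic (exact-match) hash detection: hash an uploaded file and check it against a list of hashes of known CSAM, such as the hash lists vetted companies can obtain from NCMEC. This is illustrative only; because any single-bit change defeats a cryptographic hash, production systems pair this with perceptual (near-match) hashing, which is not shown here.

```python
import hashlib

# Hashes of known CSAM, obtained from a vetted source such as a
# hash-sharing program. The entry below is a placeholder, not a real hash.
KNOWN_CSAM_SHA256: set[str] = {"0" * 64}


def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming in 1 MiB chunks
    so large uploads do not have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_csam(path: str) -> bool:
    """Exact-match detection: flags only byte-identical copies of
    previously hashed material."""
    return sha256_of_file(path) in KNOWN_CSAM_SHA256
```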

For more information, companies can sign up for the Tech Coalition’s Pathways program to check out the Introductory Overview of Hashing and Matching for CSAM Detection resource.

Join signal sharing initiatives

Gaming companies frequently see bad actors using multiple platforms to conduct abuse. For example, a bad actor might use a gaming platform to find and groom victims and then use another platform to conduct exploitation. 

Taking action on a bad actor on one platform, but not on another, means that the abuse will most likely continue. To combat this, companies can participate in signal sharing initiatives, including the Tech Coalition’s Lantern program:

  • Lantern is a voluntary program and only available to industry.  
  • Signals submitted and ingested must align with the program’s taxonomy. 
  • Participating companies must independently review signals against their own policies and terms of service. 
  • Lantern contains two primary categories of signals: content-based signals and incident-based signals (sketched below).
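
As a rough illustration of those two categories, a shared signal might be shaped like the record below. This is an assumption for illustration only; the actual Lantern taxonomy and schema are provided to participating companies by the Tech Coalition.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Literal


@dataclass
class Signal:
    """Hypothetical shape of a shared signal, not Lantern's real schema."""
    kind: Literal["content", "incident"]
    value: str          # content-based: e.g., a content hash or URL;
                        # incident-based: e.g., an account identifier
    harm_type: str      # must map onto the program's taxonomy
    submitted_at: datetime


REVIEW_QUEUE: list[Signal] = []


def ingest_signal(signal: Signal) -> None:
    """Queue an ingested signal for independent internal review. Signals
    are leads, not verdicts: each participant must review them against
    its own policies and terms of service before taking action."""
    REVIEW_QUEUE.append(signal)
```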

Taking action when an OCSEA incident occurs

Report the case to the appropriate hotline based on your company’s location

  • For U.S.-based companies — The National Center for Missing & Exploited Children (NCMEC) operates the CyberTipline, a national clearinghouse for leads and tips regarding child sexual exploitation. Once a company is registered with the CyberTipline, it can report OCSEA incidents to NCMEC via the CyberTipline Reporting API or the CyberTipline Manual Reporting Form (a sketch of assembling report data follows this list). In addition to mandatory reporting obligations, companies must also comply with data preservation requirements.
    • To create an account with the CyberTipline and register as an ESP (Electronic Service Provider), a company should contact NCMEC at [email protected] or visit https://esp.ncmec.org/registration
    • The REPORT Act was enacted in May 2024 and requires companies to report child sex trafficking and online enticement to NCMEC’s CyberTipline. In October 2024, NCMEC released guidelines to help online platforms comply with the new law.
    • For more information, companies can sign up for the Tech Coalition’s Pathways program to check out the CyberTip Reporting Considerations resource.
  • For companies based outside the U.S. — INHOPE serves as the umbrella organization for a global network of hotlines working to combat OCSEA. To locate the appropriate hotline, select your country and INHOPE will redirect you to the relevant national hotline.
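
To illustrate the API-based reporting path above, here is a hedged sketch of gathering the information a report typically needs before submission. The field names are illustrative assumptions, not the CyberTipline API’s actual schema; registered ESPs should follow the specification NCMEC provides.

```python
from datetime import datetime, timezone


def build_cybertip_draft(incident: dict) -> dict:
    """Collect the facts a CyberTipline report typically needs. Field
    names here are illustrative, NOT NCMEC's actual API schema; use the
    documentation NCMEC provides after ESP registration."""
    return {
        "incident_type": incident["harm_type"],         # e.g., CSAM, enticement
        "incident_time_utc": incident["occurred_at"].isoformat(),
        "reported_user": incident["user_identifiers"],  # account ID, IP, email
        "content_hashes": incident["hashes"],           # for any matched files
        "esp_contact": "[email protected]",    # hypothetical contact
    }


# Hypothetical usage; remember the separate obligation to preserve the
# underlying report data in line with applicable legal requirements.
draft = build_cybertip_draft({
    "harm_type": "csam",
    "occurred_at": datetime.now(timezone.utc),
    "user_identifiers": {"account_id": "acct-123"},
    "hashes": ["0" * 64],
})
```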

Enforce on your platform 

Enforcement is the action a company takes when an account or content violates a company’s policies. There are various options for enforcement, depending on a company’s business and capabilities. 

  • Some examples of enforcement include: content removal, full termination or permanent suspension of an account, banning a player’s device and / or console, etc.
  • Note that many gaming companies have cross-game or cross-service “time-outs”: if a player gets a time-out in one game, they are also blocked from using the company’s other games (see the sketch below).
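
A minimal sketch of the cross-game time-out idea, assuming enforcement is keyed to the account at the platform level rather than per title (all names are hypothetical):

```python
from datetime import datetime, timedelta, timezone

# Platform-level suspension registry, keyed by account ID.
SUSPENDED_UNTIL: dict[str, datetime] = {}


def suspend_account(account_id: str, days: int) -> None:
    """Record a suspension once, at the platform level, so it applies
    across every game and service the company operates."""
    SUSPENDED_UNTIL[account_id] = datetime.now(timezone.utc) + timedelta(days=days)


def can_play(account_id: str, game_id: str) -> bool:
    """Every game checks the same registry; game_id is deliberately
    ignored so a time-out in one game blocks them all."""
    until = SUSPENDED_UNTIL.get(account_id)
    return until is None or datetime.now(timezone.utc) >= until


suspend_account("acct-123", days=30)
assert not can_play("acct-123", "racing_game")
assert not can_play("acct-123", "shooter_game")  # blocked everywhere
```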

For more information, companies can sign up for the Tech Coalition’s Pathways program to check out the Writing a Child Safety Content Policy resource.

Prepare for law enforcement to respond with legal process

After NCMEC prioritizes and reviews an OCSEA report, it may be investigated by law enforcement. The law enforcement agency may follow up with the company through legal process to request additional data. The additional information provided may enable them to locate and safeguard a minor victim, collect the necessary evidence against the bad actor, etc.

Prevention techniques and raising awareness

Identifying and taking action on OCSEA is important and necessary, and can often lead to real-world action, whether through arrests or victim safeguarding. However, it is also imperative that gaming companies develop a strong prevention program.

Prevention programs can have a real-world impact by stopping OCSEA before it occurs and by providing resources and support to those in need.

Safety by Design 

Safety by Design is a proactive approach that focuses on preventing harm by incorporating safety into the design and development of products, services, and infrastructure. By implementing Safety by Design techniques and processes, a company can prevent harm before it occurs.

Companies will typically start by identifying features that may increase the risk of OCSEA harms (a checklist sketch follows this list). For example, does the feature:

  • Have multiplayer functions or social interactions? Are these between friends or open to anyone?  
  • Increase the discoverability of the player?
  • Enable real-time video or livestreaming?
  • Offer the ability to upload and share user-generated content?
  • Include voice chat or video chat?
  • Target specific audiences that are at higher risk of online abuse?
  • Cover elements of an image so that illegal content might evade detection?
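
One lightweight way to operationalize these questions is a structured risk checklist that product teams complete at design time. The sketch below encodes the questions above; the scoring threshold is an assumption, not an industry standard:

```python
from dataclasses import dataclass


@dataclass
class FeatureRiskAssessment:
    """Design-time answers to the risk questions above for one feature."""
    open_multiplayer: bool          # social interactions open to anyone?
    increases_discoverability: bool
    live_video_or_streaming: bool
    user_generated_content: bool
    voice_or_video_chat: bool
    higher_risk_audience: bool
    can_obscure_images: bool        # could illegal content evade detection?

    def risk_score(self) -> int:
        """Naive scoring: count the risk factors that apply."""
        return sum(vars(self).values())


assessment = FeatureRiskAssessment(
    open_multiplayer=True, increases_discoverability=False,
    live_video_or_streaming=True, user_generated_content=True,
    voice_or_video_chat=True, higher_risk_audience=False,
    can_obscure_images=False,
)
if assessment.risk_score() >= 3:  # threshold is an assumption
    print("Escalate to child safety review before launch")
```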

Please refer to the Tech Coalition’s Assessing OCSEA Harms in Product Development resource for more details.

Transparency reports 

Voluntary transparency reporting refers to reports that explain a company’s approach to addressing OCSEA; these reports should highlight the company’s policies, explain its processes, and document the outcomes of its efforts.

  • Transparency reporting is an essential component of industry efforts to combat OCSEA, and can also be a legal requirement. It drives accountability and plays a critical role in building trust with users, regulators, and the general public. 
  • Please refer to the Tech Coalition’s TRUST Framework for more details on how to develop a Transparency Report. Please also consult with your legal teams for additional insight on regulation and compliance in this space. 

Safety education

Safety education is a broad term covering any education delivered to users within your website or app. It can include help center articles about how to spot signs of OCSEA, in-product tips shown to children when they are in a high-risk situation (e.g., someone has asked them to send a photo), or deterrence messaging aimed at potential offenders (e.g., reinforcing that CSAM is illegal and promoting helplines to users who search for it).

Acknowledgments

Special thanks to Tech Coalition Members and Partners for providing insights and feedback that contributed to the development of this guide.

Now what?

To learn more about these topics and others, sign up for the Tech Coalition’s Pathways program, which provides expert advice, resources, and opportunities for the tech industry to further build capacity to combat online child sexual exploitation and abuse.
