When extremism goes mainstream
From gaming to music streaming, extremist content is spreading. Fresh policy and practitioner guidance, including on design, offers solutions
Even though some social media platforms have dialed back investments in fact-checking, none have officially rolled back policies on terrorist and violent extremist content (TVEC). But while moderation efforts have somewhat contained extremism on social media, extremist groups are adapting. They are spreading their content—and profiting from it—through less scrutinized platforms such as music streaming services, gaming marketplaces, and even IMDb.
This shift is not accidental; it is by design. As recent reports by the Global Network on Extremism and Technology (GNET) and the Extremism and Gaming Research Network (EGRN) reveal, extremist actors are leveraging these platforms to circumvent content moderation, spread their ideology, and monetize their influence. This raises an urgent question: how do we regulate an extremist ecosystem that thrives across multiple platforms?
40 years of gaming and extremist content
The Extremist and Terrorist Games Database (ETGD) is the most comprehensive effort to document how extremists exploit gaming. Covering more than 40 years of extremist content, it tracks 155 extremist-affiliated games, from standalone titles to modifications (mods) and browser-based games.
This dataset reveals not only the historical evolution of extremist gaming but also the sophisticated monetization strategies used by extremists. It highlights the shift from underground extremist networks to mainstream digital platforms like Steam, YouTube, and IMDb, where content can reach wider audiences with little oversight. The ETGD is a crucial resource for policymakers and tech platforms aiming to disrupt these networks before they scale.
Gaming the system
1. YouTube: playthroughs, hidden links, and algorithmic amplification
YouTube remains a key platform for extremist gaming content. Many games catalogued in the ETGD have corresponding playthroughs and walkthroughs on YouTube, often with embedded download links in the description or comments. Some creators use coded instructions instead of direct links to evade detection. The result? YouTube’s algorithm helps funnel users toward these spaces, even when the content appears benign.
2. Monetizing extremist content through music streaming
Music streaming platforms like Spotify, Apple Music, and Bandcamp have become unexpected revenue streams for extremist game developers, according to EGRN. The white supremacist games Angry Goy and The Great Rebellion have official soundtracks on these platforms, allowing extremists to earn money while normalizing their ideology. As an EGRN researcher notes:
“These gaming soundtracks aren’t just about music; they are Trojan horses for radicalization, designed to bring extremist ideas into new spaces where they may not be immediately flagged as dangerous.”
3. Gaining legitimacy with IMDb
Extremist game developers are also using IMDb (Internet Movie Database) to make their content appear more credible. IMDb, primarily known for movies and TV shows, also lists video games—providing details on developers, voice actors, and publishers. Some extremist games, like Tyrone vs. Cops and Alex Jones: NWO Wars, have been added to IMDb, complete with ratings, descriptions, and links to download locations.
By appearing on IMDb, these games gain:
Legitimization – Being listed on a mainstream entertainment database makes them appear commercially viable.
Discoverability – Users looking for information on the game may stumble upon extremist communities discussing how to access it.
Cynthia Miller-Idriss’s 2018 book The Extreme Gone Mainstream captured how gaming, as well as fashion and music, was used to ‘mainstream’ extremist ideologies in Germany. A similar phenomenon is now happening globally, facilitated by digital platforms that lack robust moderation mechanisms.
4. Steam: downloading hate
Steam, the world’s largest PC gaming platform, has long struggled with extremist content. A 2024 study by the Anti-Defamation League found over 1.8 million pieces of extremist and hateful content on Steam. The EGRN/GNET report highlights an additional layer: game soundtracks sold as standalone products to appeal to broader audiences.
For example, Feminazi: The Triggering and The Great Rebellion both have soundtracks for sale on Steam. The popularization of these ideologies is increasingly linked with online misogyny, which, as research from the Institute for Strategic Dialogue highlights, can serve as a gateway to broader radicalization. The GNET study notes that some of the most violent titles, including Westmen (2023), actively promote anti-feminist, white supremacist, and violently misogynistic narratives.
Why this is a design problem
The spread of extremist content across multiple platforms is not just a policy failure—it is a design issue. Many platforms have been built to maximize engagement and monetization, often at the expense of safety.
Unlike social media, where moderation policies have matured over the years, gaming marketplaces, music streaming services, and entertainment databases are lagging. Their design allows extremist actors to:
Evade Moderation: By using platform-hopping techniques, extremists ensure that even if one platform removes their content, others will host it.
Generate Revenue: Monetization models allow extremists to profit from game sales, ad revenue, and streaming royalties.
Expand Their Audience: Without adequate oversight, extremist content is increasingly reaching younger audiences who may not actively seek it out but encounter it through gaming, music, or entertainment platforms.
Wanted: Cross-platform action
Addressing extremist content requires a cross-platform approach. In February this year, the Royal United Services Institute (RUSI) published two guides with clear and actionable guidance for policymakers and practitioners: the RUSI Gaming Policy Recommendations and the Implementing Positive Gaming Interventions Toolkit. Among the policy recommendations are:
✅ Prohibiting and Preventing Extremist Exploits: Platforms should implement robust moderation strategies to detect and remove extremist content proactively. This includes better AI tools and increased human expertise in trust and safety teams.
✅ Flagging and Removing Extremist Titles: Platforms should use hashing technology to detect and block extremist game downloads and music uploads across platforms (a minimal sketch of hash-matching follows this list).
✅ Improving Reporting Mechanisms: Platforms like Steam should introduce dedicated reporting categories for violent extremist and terrorist games, allowing users to flag content more effectively.
✅ Enhancing Cross-Platform Intelligence Sharing: Collaboration between gaming companies, music streaming services, and tech companies can help close loopholes that extremists exploit.
✅ Regulating Monetization Channels: Platforms must strengthen content monetization policies to prevent extremists from profiting off hate.
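To make the hash-matching recommendation concrete, here is a minimal Python sketch of exact-match file hashing against a shared blocklist. The KNOWN_EXTREMIST_HASHES set and should_block_upload function are hypothetical illustrations, not any platform’s real API, and the digest shown is a placeholder; in practice, such entries would come from a cross-platform hash-sharing database of the kind the recommendations envisage.

```python
import hashlib
from pathlib import Path

# Hypothetical shared blocklist of SHA-256 digests for known extremist game
# builds and soundtrack files. In practice these entries would come from a
# cross-platform hash-sharing database; the value below is a placeholder.
KNOWN_EXTREMIST_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def sha256_of_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large game builds never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_block_upload(path: Path) -> bool:
    """Return True when an uploaded file matches a hash on the shared blocklist."""
    return sha256_of_file(path) in KNOWN_EXTREMIST_HASHES
```

One limitation worth noting: exact digests fail on trivially re-encoded files, which is why perceptual hashing (fuzzy matching on content rather than bytes) is commonly used alongside this approach, and why the cross-platform intelligence sharing recommended above matters.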
What gaming companies are doing right
Roblox has partnered with Tech Against Terrorism to remove extremist content from user-generated games.
Twitch has implemented AI moderation to combat white supremacist activity.
Ubisoft, in collaboration with Moonshot, has launched a program targeting radicalized players, offering counter-narratives and off-ramps.
A GNET researcher highlights:
“There is a clear path forward: gaming companies need to see themselves as stakeholders in counter-extremism, just as social media platforms have started to do. Otherwise, extremists will continue exploiting these spaces unchecked.”
The challenge is not just about content removal—it’s about designing platforms that are resilient to exploitation in the first place. To truly tackle the issue, tech companies, regulators, and civil society must push for proactive design changes that disrupt extremist networks before they scale.
Lena Slachmuijlder Co-Chairs the Council on Tech and Social Cohesion.