Blog
23.03.2023

Are gaming platforms the perfect environment for extremist recruitment?

Lax moderation standards on video game platforms have often failed to quickly remove groups with extremist beliefs. What are the consequences?

This article is part 2 in a series of blog posts about the video game industry. Check out the first post for more information on video game platform moderation and governance.  

The human drive to play has coupled the development of video games to that of digital computers and interconnectivity. For good reason: connecting with others to play games online has many positive effects, from improving one's mood and discovering new interests to strengthening cognitive skills like memory and problem solving. During the pandemic, people reported making friends through online games and feeling a sense of community and belonging, findings that counter the long-held notion of the socially isolated gamer. In fact, 70% of gamers play with a friend, and millions of people play together in massively multiplayer online (MMO) games. Roblox and Fortnite, two MMOs popular with children and teenagers, have over 220 million active users a month; just two examples from a pool of wildly popular games.

Positive notions of gaming have long been overshadowed by debates about violent games, which motivated many researchers to investigate the impact of these games on behaviour. To date, research has not conclusively established a link between playing violent video games and violent acts or increased aggression. The public debate nevertheless had a valuable impact in directing regulators' attention towards the growing industry. Focused primarily on the wellbeing of children, governing bodies around the world began to implement measures. Today, 35 countries across Europe require age classifications for all games, and in the US the self-regulatory body Entertainment Software Rating Board (ESRB) provides the same information. Protection for children still falls well below the mark, however, because the online features of games operate in a wild-west-like environment. A recent report from the Bracket Foundation highlights the astonishing risks children are exposed to online, with significant increases in reports of grooming, cyberbullying and sexual exploitation. The report cites the lack of liability companies face for dangerous behaviour on their platforms, and the lax regulatory infrastructure for video game companies, as reasons for the increase in exploitative incidents.

Violent extremists play video games too 

There is no clear explanation for the lack of attention regulators have paid to online gaming platforms in comparison to social media platforms. There has been little pressure on gaming platforms to shore up content moderation standards, and they are years behind social media platforms in developing company policies and expert safety teams. Extensive amounts of harmful content persist and appear to be on the rise, causing concern over the presence of violent extremist and terrorist groups on these same platforms. The question is glaringly large: how well are these groups able to exploit lax oversight for the purposes of organisation and recruitment?

The United Nations Office of Counter-Terrorism (UNOCT) conducted exploratory research into the connection between gaming and extremism and found that lax moderation allows violent extremist content to circulate, but cited a lack of research into the topic as a barrier to understanding the full extent of the problem. Interpol has identified a growing trend in the use of video games, gaming platforms and channels to spread extremist propaganda, particularly among young people. Platforms like Discord, Steam and Twitch, which do not host games directly but are used extensively by the gaming community, have been found to serve as safe spaces where young people curious about these ideologies can network. Additionally, these online spaces serve extremist groups by hosting communities dedicated to trolling minorities, facilitating the planning of offline events, and livestreaming extremist individuals. For example, organisers of the far-right Unite the Right demonstrations that turned violent in Charlottesville, Virginia, used Discord to organise and promote the events. Violent extremist groups reportedly exploit the anonymity and surveillance gaps of gaming environments to garner attention and direct people to connect further on other, more secure platforms.

Trends suggest that extremist groups do not use gaming platforms exclusively to proliferate and recruit, but first and foremost to connect with one another. Many online communities form around like-minded individuals, and those with extreme beliefs also come together through a shared interest in gaming. Recruitment into violent extremist beliefs comes partly as a by-product of their presence in gaming spaces, and it is not known how often this occurs. What is clear, however, is that video game platforms have unintentionally created an ecosystem in which violent extremist groups can survive and connect, and which is suitable for the diffusion of propaganda.

The presence of extremist content in gaming is worrying on two fronts. Firstly, many children are playing games online in environments not designed with their safety in mind; secondly, the rise of the metaverse means that more socialising will happen online, creating more incentive to exploit these spaces for recruitment. The metaverse is today an amorphous concept, but it will likely develop into social media with a 3D interface, representing a greater convergence of gaming and social media.

Regulators need to hold gaming platforms accountable  

Video games have seemingly fallen outside the scope regulators see as their purview, even though gaming and social media platforms face very similar challenges under the law. The regulatory landscape for what constitutes legal speech is varied and complex, and violent extremism and hateful content extend across gaming and social media platforms alike. The challenges are similar: vague borders between free speech and incitement, the tricky deployment of algorithmic content moderation, the cultural and linguistic nuances of the user base, and the sheer volume of content. Novel forms of digital content make moderation in games particularly difficult, when, for example, harmful content or propaganda may be displayed in game environments or through nonverbal communication by avatars.

New content moderation technologies are being developed and tested, and the European Union's Digital Services Act (DSA) outlines more clearly than ever before the responsibilities online platforms have to their users. Video gaming companies are investing more heavily in policy teams and enforcing community guidelines, an appropriate response to the trends in the sector and to possible increasing liability. The DSA encompasses video game platforms and is expected to increase protection for children online through better age-assurance solutions and a requirement that very large platforms analyse systemic risks to children. Additional positive measures are possible under the DSA, but enforcing it requires expertise and capacity, as well as the motivation to target technology not spelled out within the act. Regulators also need to be prepared for the changing nature of games, which will entail closer cooperation with the industry.

The video game industry has truly skated under the radar of regulators across the world, and the DSA covers only the European market. The causes for this are unclear, but the result is an environment with no incentive to protect children or remove violent extremists except through reactive measures. The intersection of gaming and extremist recruitment is under-researched, leaving many gaps in our understanding of the problem and how it should be tackled. Violent attacks by extremist groups have increased in recent years, and foot-dragging from gaming companies combined with regulators' reluctance to legislate will allow these dangerous ideologies to continue to find a platform.

 

Further Reading: 

Lakhani, S. (2021). Video gaming and violent extremism: An exploration of the current landscape, trends, and threats. Luxembourg: Publications Office of the European Union.  

RAN (2020). ‘Extremists’ Use of Video Gaming – Strategies and Narratives’. Conclusions Paper. Radicalisation Awareness Network, European Commission.  

 

Teaser photo by Emily Wade on Unsplash