The new report found that three out of four young people have been exposed to hate-fueled harassment.
Executive Summary
Hate and harassment in gaming is now so pervasive that it has become the norm for many players. An estimated 83 million of the 110 million online multiplayer gamers in the U.S. were exposed to hate and harassment over the last six months. Three out of four young people (ages 10-17) experience harassment when playing video games. And it continues to grow and spread across generations of gamers, despite some movement toward countering extremism, more aggressive voice chat moderation, and the beginnings of transparency efforts by major companies. When players pick up their phones or sit down at their PC or console, they now expect to encounter hate on a regular basis: antisemitism, racism, misogyny, homophobia, and more.
In an interview with a researcher from the ADL Center for Technology and Society, an adult gamer described how he responded to his son experiencing a painful incident of hate in an online multiplayer game:
“‘That's how it is, you know?’ I try to teach them young, ‘cause … Now he wants to be a gamer, I have to teach him how it goes.”
Normalized harassment and desensitization to hate frame the reality we find in ADL’s fifth annual survey of hate and harassment in online multiplayer games. This—coupled with new trends in exposure to white supremacist extremism in online multiplayer games, including potential evidence of white supremacist recruitment—creates an urgent challenge for the future of online multiplayer games, one that must be addressed by all stakeholders, from industry and government to civil society and the broader public.
Key findings:
- Young people experienced more harassment this year. Three-quarters of teens and pre-teens experienced harassment in online multiplayer games in 2023, up from 67 percent in the previous year. Harassment of 10–17-year-olds based on identity increased to 37 percent from 29 percent in 2022.
- Women and Black or African American gamers were the most harassed because of their identity in online multiplayer games. In every previous survey, women have been the group that far and away experienced the most identity-based harassment in these spaces. This year, women saw a one-point increase, to 48 percent of gamers harassed because of their gender.
- Half of Black adult gamers (50 percent) reported experiencing harassment in online games due to their race, an increase from 44 percent in 2022. This represents a 19-point increase in hate targeting Black adult gamers since 2020, when many game companies spoke out in support of Black Lives Matter following the murder of George Floyd at the hands of law enforcement.
- Though still far too high, the overall rate of harassment of adults in online multiplayer games decreased for the first time in the five years this survey has been conducted: 76 percent of adults experienced harassment in games, down from 86 percent in 2022.
- 70 percent of Jewish adults experienced some form of harassment in online multiplayer games, though this data was collected before the 10/7 attack. This is six points lower than the rate for adult gamers overall (76 percent).
- Exposure to white supremacist ideologies in online games decreased: Nine percent of young people were exposed to white supremacy in 2023, compared to 15 percent in 2022. And 15 percent of adults were exposed to white supremacy in 2023, compared to 20 percent in 2022.
- Despite a decrease in overall exposure to white supremacy, players exposed to this content see it regularly. 30% of adults and teens exposed to hateful and dangerous ideologies had these experiences on a weekly or more frequent basis.
- This year, for the first time, CTS evaluated the influence of hate and harassment on how players spend their money in online multiplayer games: 20 percent of players (adults and teens) are spending less money in online game spaces because of the hate and harassment they encounter.
What’s in this Survey
Since 2019, the ADL Center for Tech and Society, in collaboration with the gaming analytics firm Newzoo, has conducted an annual, nationally representative survey of hate and harassment in online multiplayer games in the U.S. For this year’s report, Newzoo recruited 1,971 respondents representative of the U.S. gamer population aged 10-45. This group took the survey between August 4 and August 17, 2023.
Survey topics include the overall harassment experienced by gamers (both young people and adults), gamers’ sense of safety across a variety of online multiplayer games, and the identities most targeted by harassment in online games. The specific games in which players experienced harassment were also examined, along with player exposure to certain types of extremist rhetoric and misinformation. Finally, the survey covered the impact that harassment had on gamers both in-game and offline, as well as players' experiences of reporting hate and harassment to game companies.
As part of this research, CTS conducted 10 qualitative interviews with respondents to gain a better understanding of the experiences of targets of hate, harassment, and extremism in online games. These qualitative insights are included alongside the quantitative survey data to give a greater depth of understanding to the numbers. In one of these interviews, for example, a respondent shared details of what they believe to be extremist recruitment in an online game, similar to other accounts of extremist recruitment in online game spaces.
In focusing on online multiplayer games, this report offers concrete guidance for the government, civil society, and industry to take meaningful steps in making those games safer for all users, regardless of their identity.
Overview
The Gaming Industry in 2023
The number of video game players worldwide was expected to reach 3.38 billion in 2023, a six percent increase year on year, according to Newzoo’s annual game industry market report. Newzoo also estimated the video game industry’s revenue to be $184 billion in 2023, almost unchanged from 2022. This figure approximately rivals the gross domestic product of Kuwait. However, despite this growth in players globally, the game industry has seen massive layoffs in 2023 with nearly 7,000 workers losing their jobs. In January 2024, more layoffs resulted in nearly 6,000 additional game workers losing their jobs. One reason for this development is a perceived overinvestment by industry leaders following the massive growth during COVID lockdown, and a subsequent return to more subdued growth levels.
These layoffs were reflected in the broader technology space, as social media companies also axed staff—not least on ethics and safety teams. The impact of these layoffs on teams addressing hate and harassment in online multiplayer games is hard to gauge. But previous ADL research, based on interviews with leaders in safety work in the video game industry, noted the lack of resourcing these teams faced broadly even before these latest layoffs. CTS is deeply concerned about the potential impact staff reductions may have on ethics and safety teams.
Congress Shows Concern
Pressure for the industry to act on issues related to hate, harassment and extremism has continued to mount, however. Following the release of ADL’s online game survey in December 2022, we saw growing concern in Congress about game company accountability.
Representative Lori Trahan of Massachusetts—alongside a coalition of congressional legislators—wrote to fourteen major game companies about their efforts to address hate and extremism in online games. Rep. Trahan’s oversight efforts occurred alongside efforts by Senator Maggie Hassan of New Hampshire to hold the game company Valve accountable for its role in allowing extremism to spread on its Steam online game store service.
In March 2023, Representative Trahan published a summary of the responses her office received from these game companies regarding their efforts to address hate and extremism on their platforms. This summary represents the first cross-industry public record of how game companies are addressing hate and extremism in their online games. Notably, nine of the 14 companies failed to mention any policies or actions they deploy to specifically assess and mitigate extremist content. The summary also included the first public mention of the dedicated counter-extremism team on staff at the game company Roblox, a significant investment which should be emulated by other companies.
Later that month, Senator Dick Durbin of Illinois further pressed seven video game companies on their efforts to address extremism, as well as urging Attorney General Garland and U.S. Department of Homeland Security Secretary Mayorkas to pay attention to threats in these spaces. Finally in June, Rep. Trahan offered an amendment to the National Defense Authorization Act (NDAA), which called for a commission to study connections between online multiplayer games and extremism. The step was noteworthy even though it was not adopted.
Pressure created by these oversight actions, combined with looming regulatory initiatives from the EU and UK, may have induced several game companies to make significant changes in subsequent months to address hate in their online multiplayer games. In May, Valve updated its policies for Steam, including an explicit prohibition against “encouraging or facilitating real world violence.” While this may seem like something that obviously should be forbidden on the largest online game store in the world, the evidence of white supremacists on this platform, as reported by ADL and others in the past, makes this a notable change. It will also be important to understand whether and how Valve enforces it.
New Policies and More Transparency
ADL’s research shows that voice chat is still the main place where hate and harassment occur in online games. Hence we welcome the long-overdue announcement in August that major games publisher Activision Blizzard would be introducing new technology that allows for proactive moderation of hateful and harassing conduct in voice chat in Call of Duty, one of their online multiplayer games. Likewise, Xbox introduced reactive voice chat reporting across their consoles in July, and in November Epic Games introduced a similar feature in their game Fortnite.
Since our first survey of hate, harassment, and extremism in online games in 2019, ADL has urged the game industry to provide the public more visibility into how it is addressing these issues. Xbox’s decision to produce two transparency reports, in May and November of this year, deserves to be emulated by the industry at large.[1]
Notably only one company made any publicly visible change to their approach to curbing the spread of extremism in these spaces: In early 2024, Activision Blizzard updated the code of conduct for Call of Duty to include prohibitions against “the amplification of any person, agenda, or movement that promotes discrimination or violence based on the above.” This addition should be replicated across the industry.
Industry and Player Responses to 10/7
In previous years, ADL has tracked how the video game industry used its voice to speak out against the murder of George Floyd by law enforcement in 2020, the anti-Asian attacks in Atlanta in 2021, and the curbing of women’s reproductive rights decided by the U.S. Supreme Court in 2022. In 2023, however, the video game industry declined to use its voice and influence to engage with the October 7 attacks in Israel and the subsequent war between Israel and Gaza. Not one of the companies we have tracked over the last three years made a public statement decrying the brutal attack by Hamas on October 7, the subsequent humanitarian crisis in Gaza, or the unprecedented rise in antisemitism and anti-Muslim hate in the U.S.
While video game companies have not spoken out about the Israel-Hamas war in Gaza, game developers and players themselves were far from silent. Pro-Palestine rallies were organized in the online game Roblox, while parallel pro-Israel rallies were organized in Minecraft. A letter written by members of the “Future Class” initiative of the Game Awards, an industry marketing showcase, called for the event to express support for a ceasefire and Palestinian human rights. However, their letter made no mention of the terrorist attack by Hamas on Israel. In the end the Game Awards remained silent, like all other institutions in the video game industry.
Data released by ADL in November 2023 found that over one in ten adult gamers (14%) were exposed to misinformation about the Israel/Hamas war in online multiplayer games. While the data in this present survey was collected in August 2023 before the attacks on 10/7, ADL is also conducting additional research on the spread of antisemitism in online games that will be released soon.
The results of this survey and the accompanying qualitative interviews show a slight decrease in harassment in some aspects of online multiplayer games, and some modest changes by companies to address harms. The video game industry, alongside stakeholders in government, academia, media and civil society, must use every tool at its disposal to push back against hate and extremism on these platforms and in their communities, and not remain silent in the face of hate.
[1] Xbox’s transparency report does not provide data on how they are handling extremist or terrorist content. That data does exist but is inexplicably aggregated with other Microsoft consumer services, such as Onedrive, Bing and Skype in Microsoft’s Digital Safety Content Report. This makes understanding the scale and scope of extremist and terrorist content on Xbox impossible.
Results
Harassment
Harassment remains at exorbitantly high levels for all game players, but in 2023 harassment against adults declined overall and across every category; concerningly, harassment of teens and pre-teens increased across nearly every category. The survey questions to adults covered both the most severe forms of harassment (such as swatting and doxing) and less severe forms, while teens and pre-teens were generally asked about less severe (though still harmful) forms of harassment. Jewish adults experienced less harassment overall than the average adult gamer (70 percent of Jews vs. 76 percent of adult gamers), though this data was collected before the 10/7 attack.
Qualitative Insights
For the first time, our research included qualitative interviews. Though we did not intentionally interview participants in a parental capacity, many adult gamers who were interviewed as part of the qualitative follow-up happened to be parents or had young people in their lives in a significant capacity. These participants expressed concern about the kind of content their children or nieces/nephews encountered in online games. Many of them assumed that games marketed to children would be moderated spaces and, therefore, safer spaces. Unfortunately, that was found not to be the case. Respondents monitored their children out of concern for the harassment they could be subjected to by other players online:
One father explained that when his son first started playing Fortnite he was quite new to gaming. The family had just moved, and their son did not yet have real-life friends:
“I just remember him trying so hard to make friends, but people were absolutely nasty to him. He wanted to continue playing, but my wife and I needed to have a little more control because it wasn’t just one incident … it was constant.”
Another parent also noted her concerns about Fortnite, which her son had been playing alone and unmonitored:
“When I went back on my son's profile to play for him, he had a provocative message from an older woman. I guess you could say soliciting herself. And I was just like, ‘Oh, my God.’ And my husband saw (this), and he was like, ‘See, that's why they can't play games like this.’”
In response to harassment experienced by their children, some parents took time to explain to them how they could protect themselves online and how to deal with insults from other players.
“I try to teach them young, ‘cause … Now he wants to be a gamer, so I'm like, I have to teach him how it goes.”
Safety (by game)
In previous years this survey has asked gamers about their positive social experiences in online multiplayer games, such as making friends, finding community, and learning about themselves and others. Over the years, the answer to these questions has been that nearly every player had some positive social experiences in online multiplayer games.
This year we considered how players express themselves in games. Only seven out of ten gamers replied affirmatively when asked whether they feel safe enough to express themselves fully in specific online games without fear of negative consequences. At the same time, about one in ten adult gamers (across all games) responded that they could never express themselves fully within those games without fearing negative consequences.
Less competitive games such as Minecraft are, as expected, among the top games where players feel safe expressing themselves without negative consequences. Meanwhile, the games where players most often experienced harassment in our previous research, such as Dota 2 and Valorant, were also among those where players felt safe expressing themselves. This may show that in the competitive atmosphere of some online multiplayer games, players feel safe to fully express themselves in ways that might not be appropriate in other spaces. This is similar to how people in an offline context may feel safer behaving a certain way while playing a sport or attending a concert than in their home life or work context.
The data here does not, however, establish whether people of certain identities feel more or less safe expressing themselves without negative consequences. It may be that those with historically marginalized identities, for instance, feel less safe to express themselves than others in certain game spaces.
Qualitative Insights
One online gamer shared how he takes steps to shield himself by curating his own experience. For example, he may avoid some games he believes have toxic environments. As he explained:
“… you got the CounterStrike/Call of Duty type games where you know trash talk is like baseline … or more than I'm comfortable with elsewhere.”
This respondent also explained how he still prefers to stay in his comfort zone of massively multiplayer online role-playing games (MMORPGs) such as World of Warcraft and Final Fantasy XIV, despite acknowledging that he may be missing out on new experiences.
“I've been lucky in that my niche and the areas that I play in, it could be a comfort zone and a good crowd that doesn't go too extreme. I might be missing out but … I don't really feel like I'm missing out. So, I luck out that way.”
Identity-Based Harassment
In every previous survey, women have been the group that far and away experienced the most harassment in these spaces. But for the first time since ADL began this survey in 2019, Black or African American gamers were harassed at a similar rate to women. Respondents with these identities were the most harassed in online multiplayer games. Black adults saw an increase to 50 percent of gamers harassed, from 44 percent in 2022. This also represents a 19-point increase in hate targeting Black adult gamers since 2020, when many game companies spoke out in support of Black Lives Matter following the murder of George Floyd by law enforcement. Women experienced an increase to 48 percent of gamers harassed, from 47 percent in 2022, within the margin of error.
For young people, the sample sizes of individual identity groups were not sufficient to make claims about the targeting of any individual identity group though we are able to reliably note an overall increase across all identity-based harassment.
Qualitative Insights
The qualitative interviews conducted as part of this year’s research illustrate in more detail how identity-based harassment manifests in online multiplayer game spaces. Respondents generally expressed seeing harassing behavior as something to be expected in online multiplayer games. Some players described being targeted by anti-Black racism because of their identity; others reported having anti-Black slurs directed at them in a derogatory manner despite not being Black themselves.
“All of a sudden, you know, they're writing me cause I turn my mic off, but they'll be, they'll write me and they just they'll say a whole lot of horrific things in the cloud is like, call you a monkey, black monkey or you might be like a … or you must be a 'fucking' you know they there was a, say, a whole bunch of like racist things.”
“I'm a white male, so they're calling me racial slurs and stuff. It doesn't really bother me because it you know, you know if they're calling me like the N-word or something like that. I don't take offense to that because I'm not Black.”
Respondents also expected some level of gender-based harassment as the norm in the games they played; they viewed online games as places where misogyny was simply part of the experience. One interviewee, when asked how much sexual harassment she experienced, answered, “Not too much.”
“[O]f course this is they would hear that I'm a female. You know all the slurs, all of the lines that they go back to the kitchen . . . you know, all of that happens. Umm, I guess you could just say a lot of the negative experiences cause I'm a girl, you know, and I guess a lot of guy players don't like the [sic] girls are good at games, so they just get real hostile.”
Silencing themselves on voice chat is one of the ways respondents shield themselves from abuse. A few female respondents relied on their partners to help defend them against hostile behavior when gaming. Some of the defensive measures respondents are forced to take to protect themselves curb their gaming experience. Avoiding voice chat, for example, means that a player cannot communicate with their teammates quickly.
“I usually just mute the TV so I don't hear them. And that's kind of a disadvantage because then you can't hear when they're shooting at you...”
Others responded to identity-based hate by leaving games altogether on the grounds that remaining in these spaces required a kind of fortitude they simply did not possess:
“I got to a point where I just stopped playing those games online anymore because, you know, it doesn't matter how strong of a person you are. Doesn't matter how confident you are. Doesn't matter how brave… You carry them with you and what do you think about it or not, whether you are OK with it or not in the moment and can just roll your eyes and go along with it. You get to a point where you just stop enjoying things that you used to enjoy .. just to try to protect yourself...”
Harassment by Game
Harassment is part of the environment in every type of online multiplayer game regardless of genre, our data shows. In no game played by adults or young people did fewer than 40 percent of players experience harassment—and these games span the full spectrum of play experiences, from first-person shooters and strategy games to sports games and sandbox games. The games where adults were most harassed in 2023 were Dota 2, Call of Duty and Valorant. The games where young people were most harassed were League of Legends, PUBG and Call of Duty.
This is also the first year that we examined how different types of harassment appeared in certain game titles. By asking about some of the most popular games in which players experienced harms, we were able to collect data on the degree to which severe forms of harassment appear in specific games. Of the seven popular games we collected data on, at least half of the players reported severe harassment in each.
Extremism and Misinformation
Overall, the exposure of adults, teens and pre-teens to white supremacy has decreased since last year, but the frequency with which players who are exposed are having these experiences is alarming. Building on last year’s troubling results about exposure to white supremacy in online multiplayer games (20% of adults and 15% of young people in 2022 vs. 15% of adults and 9% of young people in 2023) our survey this year asked several follow-up questions about the degree to which players were exposed to these toxic beliefs.
Within the context of this survey, ADL uses the term “white supremacy” to refer specifically to a collection of beliefs: a conviction that white people are genetically superior to people of other backgrounds, that white people should dominate in all ways and exercise power over other identities, that whites should live separate from people of other races, and/or that “white culture” is expressly superior to other cultures and must be supported at the expense of other cultures. We are not referring to “white supremacy” as the historically based, institutionally perpetuated system of white dominance and privilege in the U.S., which enables and maintains systemic racism throughout all segments of society. Here, we treat white supremacist ideology as a symptom and an outgrowth of that systemic white supremacy.
Further research is required to understand the degree to which exposure to these hateful ideologies normalizes them for adults and young people. But the fact that a third of the adults, teens and pre-teens exposed to white supremacy have this experience on a weekly or more frequent basis should push the industry to take this threat more seriously.
Holocaust denial in 2023 occurred at levels similar to previous years, with around one in ten gamers being exposed to this form of antisemitism in online multiplayer games. Finally, this year we also asked about rhetoric used in extreme anti-LGBTQ+ hate, with around one in ten adult gamers being exposed to these harmful and dangerous ideologies.
Qualitative Insights
Respondents who encountered extremist rhetoric, such as white supremacist ideas being discussed in games, said that the topics would often come up spontaneously during gameplay. Other respondents said that these topics would come up due to current events, such as presidential elections in the US. A few respondents said that they encountered white supremacist rhetoric and ideas most often in shooter genre games such as Call of Duty.
One respondent did experience what he described as attempts to recruit him in an online multiplayer game:
“They just do it like ‘H’ and eight like 8 and 80.[1] And they did, usually saying KKK's. You know, ‘we need to kill black people’? Certain like minority groups. ‘They, invading our territory,’ ‘They don't belong here,' you know? And just like they think their ideas are superior alright and it [sic] just like some racist stuff. Sometimes they want you to aim and join us. This is our website. This is our email, something like that. Their club name and stuff like that. They want to recruit. They said we were doing recruit, like boy, I don't wanna get in trouble.”
The interviewee went on to say that these encounters occurred in public chat in the online multiplayer games he plays. This is the first time in the five years ADL has conducted this research that anyone has said that they believe they have experienced extremist recruitment in an online multiplayer game. Previous instances in this research have shown individuals having experiences related to the expression of white supremacist rhetoric, but not recruitment specifically.
More research is required to better understand the nature of this activity, though the behaviors described by this respondent bear some similarities to the recruitment tactics in online games described in scholar Daniel Koehler’s From Gaming to Hating: Extreme-Right Ideological Indoctrination and Mobilization for Violence of Children on Online Gaming Platforms. In this work, based on German police records, Koehler describes how a group used the game platform Roblox to express their hateful ideologies and, over time, invite those who responded positively to engage with them more deeply in private Discord servers off the gaming platform. Given that this is only one interview, we are cautious not to generalize the experience as a trend without additional research.
Impact
Data for 2023 showed a decrease across all age categories in the number of players who quit games as a result of hate and harassment. One reason may be that players are having better in-game experiences. Another may be that players’ tolerance for what constitutes harassment or hate is increasing. This is another area that requires more research.
We also asked players about the impact of hate and harassment on their in-game spending. In its 2023 global game market report, Newzoo noted that in-game spending will account for 97 percent of mobile games’ monetization revenue ($89.7 billion) and 27 percent of console games’ monetization revenue ($15.3 billion). The fact that one fifth of adult and teen players are spending less because of the harassment they face should be of concern to the industry, especially given the importance of in-game spending to the overall revenue of online games.
Qualitative Insights
The overwhelming majority of respondents said that they played games to have fun and to relax. Experiencing harassment and other negative behaviors from other players took away from that experience. Some respondents said that negative experiences also caused them to have negative associations with games and play them less.
“I feel like the game is a representation of that hate or representation of that ideology that these negative people play it and… if I play it, I'm the same thing with these people.”
Experiencing harassment has also led respondents to change how they play games. In one case a respondent said she would play the game offline to avoid harassment, despite offline gameplay being less fun.
“I usually just end up playing a game that's, like, offline or where I don't have to deal with people and stuff like that. So [the playing experience] kind of… it kind of gets ruined ‘cause [you’re] not playing online, you're playing offline, but you don't have to deal with people messing with you and stuff.”
Another respondent said that she questions whether it is worthwhile spending money on a game in which she experiences harassment.
“There's sometimes where it changes whether we see it as worthwhile or not. Yeah, that definitely will make an impact on what we want to spend. Sometimes we just don't want to throw our money into something that we don't enjoy as much.”
Reporting
This year we asked about players' experience of reporting hate and harassment to the game companies that own and operate various online multiplayer games: only a third of adult, teen and pre-teen gamers report the hate or harassment they experience or see in online multiplayer games. These results are virtually identical to those we saw in 2020, when we last asked this question.
Two factors support the observation that harassment has potentially become normalized and accepted in online game spaces: the low levels of reporting, and the fact that many adults, teens and pre-teens indicated they did not report because they felt harassment was part of the gaming experience. For young people especially, the difficulty of reporting harassment within a game also speaks to a need for game companies to reimagine how reporting should function, centering the experience of targets of hate and harassment.
Qualitative Insights
Many respondents consider it tedious to report other players for offensive behavior. A few said they considered the reporting of other players as “snitching.” Even when respondents did report other players, most found the process to be opaque and confusing. When they were told that action was taken, they were rarely told how or what the outcome was.
“They don't send you an email, they just send you a message to the things saying that they got your report and that they will look into it and stuff like that.”
“We've only [reported behavior] a couple times and both times [the company] just said that [the player] got a warning. Most of the time we haven't run into the person that second time, but we're not sure if it's because they, like, did something else and got banned or if it's just because we weren't logging on to that server at the same time.”
“It hasn't gotten so bad that I, like, ‘Oh my God, I have to have to report it.’ But I said I've been happy with, you know, when I've reported players in Hearthstone and I said in terms of the behaviors I've gotten, responses back from Blizzards [sic] that they've done investigations and . . . they've taken appropriate action.”
[1] See the entry on “88” as shorthand for “Heil Hitler” in ADL’s hate symbols database.
Forecast for 2024
The growth of “live service games” among PC and console publishers is a key trend highlighted in Newzoo’s annual game industry market report. “Live service games,” also referred to as “games as a service,” provide content on a continuous revenue model, e.g., via a game subscription service or a season/battle pass. Most of these games are also the type of online multiplayer games discussed in this report. Newzoo notes that while investment in these games is unmistakably growing, it is an open question how sustainable that growth is, given the expanding number of games in this space competing for a limited, if significant, pool of players.
Newzoo’s forecast does not, however, consider the degree to which keeping live service games free from hate and harassment will help determine the sustainability of this trend. At ADL, we believe this will be a key consideration. Investment in content moderation technologies, robust policies (including prohibitions on extremism), and inclusive design will all affect the degree to which players turn to (and spend their money on) one game as opposed to another, and perhaps even the degree to which a game company faces regulatory scrutiny.
Recommendations
Societal stakeholders must demonstrate their commitment to disrupting the intergenerational norms around hate and harassment in online multiplayer games. This will require substantial investment and active steps from policymakers, tech companies, educators, and families. While governments are necessarily focused on the dangers posed by social media and AI, they must also dedicate attention to the immediate threats pervasive in online gaming environments.
Some companies in the game industry are taking incremental steps to make their games safer; however, all major gaming companies should be driving efforts to combat the normalization of hate in their gaming platforms, as well as in the gaming industry at large. Academia and civil society must also support efforts to decrease hate and harassment in online multiplayer games by conducting research and disseminating their findings on the threat that extremism—and, in particular, potential extremist recruitment—poses in these spaces. Caretakers and educators—whether they were raised as gamers or not—must guide young gamers on best practices for safety in online multiplayer spaces and prioritize the safety of their children and students by implementing parental controls.
In the aftermath of the horrific 10/7 attack by Hamas in Israel and the ensuing war, and with the upcoming 2024 US election, the threat of hate, harassment, misinformation, and extremism online and offline will only become more dire globally. This is especially true for online gaming spaces.
ADL recommends different stakeholder groups take the following steps to play an active role in promoting online safety and mitigating the risk of these dangers and their impact on users:
For government
- Prioritize transparency legislation in digital spaces and include online multiplayer games. States are beginning to introduce, and some are successfully passing, legislation requiring major social media platforms to be transparent about their content policies and enforcement (see California’s AB 587). Legislators at the federal level must prioritize the passage of equivalent transparency legislation and include specific considerations for online gaming companies. Game-specific transparency laws will ensure that users and the public can better understand how gaming companies are enforcing their policies and promoting user safety.
- Enhance access to justice for victims of online abuse. Hate and harassment exist online and offline alike, but our laws have not kept up with increasing and worsening forms of digital abuse. Many forms of severe online misconduct, such as doxing and swatting, currently fall under the remit of extremely generalized cyberharassment and stalking laws. These laws often fall short of securing recourse for victims of these severe forms of online abuse. Policymakers must introduce and pass legislation that holds perpetrators of severe online abuse accountable for their offenses at both the state and federal levels. ADL’s Backspace Hate initiative works to bridge these legal gaps and has had numerous successes, especially at the state level.
- Establish a National Gaming Safety Task Force. As hate, harassment, and extremism pose an increasing threat to safety in online gaming spaces, federal administrations should establish and resource a national gaming safety task force dedicated to combating this pervasive issue. As with task forces dedicated to addressing doxing and swatting, this group could promote multistakeholder approaches to keeping online gaming spaces safe for all users.
- Resource research efforts. Securing federal funding that enables independent researchers and civil society to better analyze and disseminate findings on abuse is vitally important in the online multiplayer gaming landscape. These findings should further research-based policy approaches to addressing hate in online gaming.
For companies
- Implement industry-wide policy and design practices to better address white supremacy and extremism. All games companies should implement and enforce comprehensive policies prohibiting extremist and terrorist content and recruitment on their platforms. They should work with trusted industry partners and extremism experts, such as ADL, to ensure these policies’ effective design and implementation across their platforms.
- Improve reporting systems and support for targets of harassment. Games companies cannot stop at creating policies against online abuse: they must also enforce them comprehensively and at scale. Having effective, accessible reporting systems is a crucial part of this effort as it allows users to flag abusive behavior and seek speedy assistance. Companies must ensure they are equipping their users with the controls and resources necessary to prevent and combat online attacks. For instance, users should have the ability to block multiple perpetrators of online harassment at once, as opposed to having to undergo the burdensome process of blocking abusive users individually. Games companies must connect individuals reporting severe hate and harassment to human employees in real-time during urgent incidents.
- Strengthen content-moderation tools for in-game voice chat. Often, bad actors in gaming spaces can evade detection by using voice chat to target other players in online games. To date, the tools and techniques that detect hate and harassment in voice chat trail those that moderate text communication (though some companies have made progress in the last year). Although this requires an investment of resources, companies must address this glaring loophole to ensure their users’ safety.
- Release regular, consistent transparency reports on hate and harassment. Unlike mainstream social media companies, games companies—with few exceptions such as Xbox and Wildlife Studios—have not published reports about their policies and enforcement practices. Transparency reports should include and share data from user-generated, identity-based reporting, aggregating data around how identity-based hate and harassment manifest on their platforms and target users. For guidance, companies can consult the Disruption and Harms in Online Gaming Framework produced by ADL and the Fair Play Alliance, which provides a set of common definitions for hate and harassment in online games.
- Submit to regularly scheduled independent audits. Games companies should allow third-party audits of content-moderation policies and practices on their platforms so the public knows the extent of in-game hate and harassment. While transparency legislation efforts are still underway, games companies should voluntarily equip independent researchers and the public with access to these essential data. In addition to promoting accountability, independent audits pave the way for games companies to assess areas for improvement with data-based findings.
- Include metrics on online safety in the Entertainment Software Rating Board’s (ESRB) rating system of games. The ESRB provides information about a game’s content, allowing adults, parents, and caretakers to make informed decisions about the games that their children may play. Although game companies caution players that the gameplay experience may depend on the behavior of other players, the ESRB ratings currently include no component reflecting an online game’s efforts to keep its digital social spaces safe. The ESRB should devise and implement such ratings across all the online games it reviews.
For Civil Society
- Support the research of games scholars and practitioners. In much the same way that we encourage government to resource gaming research, we encourage civil society and other industry partners to keep abreast of these studies’ findings, to partner with research institutions on best practices, and to coalesce around the development and promotion of industry standards.
- Elevate and center the voices of impacted communities. In efforts to combat online hate, game design advocacy organizations should amplify the concerns of historically marginalized groups and the organizations that represent them. Companies must consider these groups’ experiences when designing policies and features that contribute to their gaming experience.
- Support educational efforts for young gamers and adults in their lives. Research on youth and social media offers models for addressing hate and toxicity online, including designing better tools and interventions. Peer education and support are critical to teaching youth to navigate digital tools and educating parents and other adults.
For Families and Educators
- Learn about and make deliberate, proactive choices about safety controls. Adults should make informed decisions about whether to allow young players to have in-game communications or block conversations with strangers via voice or text chat. They should also learn how to implement safety controls, especially for younger gamers.
- Pressure games companies to improve parental safety controls and education. Safety controls must be transparent and user-friendly for parents and other adults who may not know much about online games or toxicity. Young users’ accounts should be linked to parental accounts. Parents should also be able to block messages from strangers and access to chat unless an adult opts in, and to require approval before friends or contacts are added.
- Familiarize yourself with the games young people in your life play. Adults should have meaningful conversations with young people about their experiences in online games. Spending time playing games with young people can increase adults’ understanding of the enjoyment and harms of online games.
Appendix
Detailed Methodology
Newzoo recruited 1,971 respondents to be representative of the US gamer population aged 10-45 on a national level (by age, gender, income, education, and region). The sample, which took the survey between August 4 and August 17, 2023, was then validated against census data, ITU figures on internet penetration, and Newzoo’s Consumer Insights Data on gaming behavior in the US. These sources gave us insight into what the demographic distribution of the minority-group variables (ethnicity, religion, sexual orientation, and ability status) should look like in the final dataset.
Newzoo then oversampled on key interest groups, including Jewish, Muslim, LGBTQ+, Black / African American, and Hispanic / Latino, to ensure we had enough respondents belonging to these minority groups to be able to examine responses within these groups during analysis.
The final dataset (including the over-sampled population, n=2,006) was then weighted so that its demographic distribution matched that of our initial gamer sample. This involved applying weights based on five variables (age, gender, ethnicity, religion, and sexual orientation) using a rake algorithm to assign each respondent a weight. Newzoo combined data from the UN Census, the International Telecommunication Union (ITU), Newzoo’s Consumer Insights Database, the US Census Bureau, the Pew Research Center, and the CDC.
When weighting data, it is important not to give any respondent a weight that is too high (which would make the data rely too heavily on a single individual’s responses) or too low (which would effectively ignore that individual’s responses). Therefore, weights were trimmed to a minimum of 0.3 and a maximum of 3 (industry standard).
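The weighting procedure described above (iterative raking to population margins, followed by weight trimming) can be sketched as follows. This is a minimal illustration assuming a simple categorical representation of respondents; the exact implementation and targets Newzoo used are not published.

```python
import numpy as np

def rake_weights(sample, margins, max_iter=50, tol=1e-6,
                 trim_min=0.3, trim_max=3.0):
    """Iteratively adjust respondent weights so weighted category shares
    match population margins, then trim extreme weights.

    sample:  dict of variable -> array of category codes per respondent
    margins: dict of variable -> dict of category -> target share
    """
    n = len(next(iter(sample.values())))
    w = np.ones(n)
    for _ in range(max_iter):
        max_shift = 0.0
        for var, targets in margins.items():
            codes = sample[var]
            for cat, target in targets.items():
                mask = codes == cat
                current = w[mask].sum() / w.sum()
                if current > 0:
                    factor = target / current
                    w[mask] *= factor
                    max_shift = max(max_shift, abs(factor - 1))
        if max_shift < tol:
            break
    # Trim to the industry-standard bounds, then rescale to mean weight 1
    w = np.clip(w, trim_min, trim_max)
    return w * n / w.sum()

# Tiny illustration: 3 men and 1 woman, raked to a 50/50 target
sample = {"gender": np.array(["m", "m", "m", "f"])}
margins = {"gender": {"m": 0.5, "f": 0.5}}
w = rake_weights(sample, margins)
```

After raking, the single female respondent carries a larger weight so that the weighted gender split matches the target margin, while no weight falls outside the trimming bounds in this small example.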
Finally, key gaming behavior metrics were compared once more with secondary data sources to ensure that the weighted sample remained representative of the US gamer population aged 10-45 and that the observed gaming behavior was consistent with Newzoo’s Consumer Insights research.
Data were cleaned according to the following criteria. Approximately 5% of total responses were removed from the sample.
- Time: Respondents who completed the survey too quickly or too slowly were filtered out (separate brackets were used for online multiplayer and non-multiplayer gamers, and for children who preferred not to answer a few questions). Cutoffs were based on standard deviation from the average completion time.
- Repetitive/flatlining responses (grid validation): respondents who answer questions without reading them, usually by consistently selecting the same option throughout the survey.
- Inconsistent and incoherent answers: often expressed by providing responses that strongly mismatch with earlier answers, or by answering open-text responses in a nonsensical way.
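The time-based filter above amounts to a standard-deviation cutoff on completion times. A minimal sketch, assuming a hypothetical two-sigma threshold (the report does not state the exact cutoff used):

```python
import numpy as np

def filter_by_completion_time(times, k=2.0):
    """Return a keep-mask: True for respondents whose completion time is
    within k standard deviations of the mean completion time.
    The cutoff k=2 is an illustrative assumption, not the report's value."""
    times = np.asarray(times, dtype=float)
    mu, sigma = times.mean(), times.std()
    return np.abs(times - mu) <= k * sigma

# Five plausible completion times (minutes) plus one extreme outlier
keep = filter_by_completion_time([10, 11, 12, 11, 10, 200])
```

In practice separate cutoffs would be computed per respondent bracket (multiplayer vs. non-multiplayer, children with skipped questions), as the methodology describes.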
The estimated margin of error is +/- 2-3 percentage points at a 95% confidence level when using the full sample.
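The stated margin of error is consistent with the standard formula for a proportion at 95% confidence; the quick check below ignores any design effect introduced by the weighting, which would widen the interval somewhat.

```python
import math

def margin_of_error(p, n, z=1.96):
    # Half-width of a confidence interval for a proportion p with
    # sample size n; z = 1.96 corresponds to 95% confidence
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for the initial sample of n = 1,971
moe = margin_of_error(0.5, 1971)  # about 0.022, i.e. +/- 2.2 points
```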
Game Industry Business Model Described
The predominant business models for online multiplayer games can be categorized into two main types. The first is the Annual Release Live Service, where publishers launch a new full game each year, sustaining player engagement through frequent small content updates and continuous monetization via in-game spending. Some examples of this model are EA Sports FC 24, Call of Duty, and Battlefield.
On the other hand, the Long-Term Live Service model, seen in games like Fortnite, League of Legends, and Counter-Strike 2, generally offers free-to-play experiences with either no sequels or ones appearing after 5+ years. Player retention is achieved through regular small updates and one to three major annual updates, and monetization is through in-game items in perpetuity. Both models rely on the player or user as the primary customer to drive revenue for the game and game company, as opposed to the business models of, for example, traditional social media in which the advertisers and not users are the primary revenue-generating customers.
Additional Results
Youth disruptive behavior by game
Youth impact by game
Youth spending by game
Adult impact by game
Adult spending by game
Youth Location in-game
Adult Location in-game
Donor Recognition
This work is made possible in part by the generous support of:
The Robert Belfer Family
Craig Newmark Philanthropies
Crown Family Philanthropies
The Harry and Jeanette Weinberg Foundation
Righteous Persons Foundation
Walter and Elise Haas Fund
Modulate
Quadrivium Foundation
The Tepper Foundation