It took a few months for Cliff to realise that his video game chatroom was being taken over by the far-Right.
Since the spring of 2016 the 27-year-old had been hanging out with a few hundred like-minded people on Discord, a messaging app popular with gamers. Being in the group made it easy to find opponents, set up matches, and shoot the breeze in between. They played Hearts of Iron, an alternate history war game, and Team Fortress 2, a cartoonish competitive shooter.
Things began to change in April 2017, when the group's administrator cut back his duties, creating a power vacuum. Some of his “lieutenants”, who stepped up to fill the gap, appeared to have far-Right sympathies. They were lenient towards users who used racist language, and they had plenty of new friends who were eager to join the group. Far-Right memes, links and insults became more common.
Soon these players formed a distinctive faction. They would gang up in teams against everyone else, even where it made no sense; exploit glitches in the games in order to win; yell continuously on voice chat so that nobody else could hear each other; and campaign for arbitrary and restrictive new game rules which seemed calculated to exclude. Most of all they were pushing a political message, starting arguments and filling the chatroom with racist content.
“It caused a rift,” says Cliff, who lives on the east coast of the USA and who asked for his real name not to be used for fear of retaliation from his former group-mates. “The more far-Right people were accusing [others] of conspiring against them, pushing a narrative that they were trying to break up the group… making it some kind of loyalty test, that you had to only be in this group. And so they just started kicking people out.”
Cliff still doesn’t know whether the far-Right faction had been plotting this from the beginning. But, once the rift got wide enough, it didn't matter. The faction, he says, were bent on “dominance”, packing the group with new members who shared their sympathies. In September they took over completely, and the defeated moderates fled to set up a new group. As far as Cliff knows, the old one still exists, providing a safe space for extremists.
In the wake of the terrorist attack in El Paso, Texas this month, which left 22 people dead and 24 injured, and another deadly shooting in Dayton, Ohio, many American politicians turned their spotlights on violent video games. In a speech on Monday, President Donald Trump himself called for new restrictions on “gruesome and grisly video games” that “celebrate violence”.
For decades, parents have worried that the bloody, amoral mayhem of games such as Doom, Mortal Kombat and Grand Theft Auto might be making their children more aggressive and, in the words of one famous polemic, “teaching our kids to kill”. In the US and the UK, a series of political and legal battles led the games industry, and then the British government, to adopt age rating systems.
Today, the idea of a direct link between video games and heightened aggression looks shaky. An Oxford University study found violent games to be no more damaging than non-violent ones. The American Psychological Association has called the evidence “scant”. One psychologist told the New York Times: “The data on bananas causing suicide is about as conclusive.”
Yet as Cliff’s story shows, there is a more urgent danger lurking in plain sight – not in video games themselves, but in the communities that surround them.
Over the past five years, many of these communities have become incubators of – and gateways into – far-Right extremism. Some are echo chambers filled with propaganda which helps to radicalise susceptible individuals. Some have direct links to recent mass shootings. Others appear to have become recruiting grounds for extremists, who slowly lure young people into their ranks by appealing to their deepest needs and isolating them from their peers.
“Parents are so afraid of paedophiles online,” says Dana Coester, a professor at West Virginia University who researches digital extremism. “They need to be way more concerned about white supremacists online.”
‘The Nazi pipeline’
The Christchurch killer began his live stream at about 1:40pm on Friday, March 15. Over the next half hour he would give the internet a gunman’s-eye-view of his real-life mass murder, much as high-level video game players broadcast their victories and defeats on live video services such as Twitch. Before that, however, as he drove to the first mosque, he played a jaunty Serbian song known to the internet as “Remove Kebab”.
The Remove Kebab song originated as an anti-Muslim anthem during the Yugoslav wars. The internet gave it an afterlife, at first mostly by mocking it. It became particularly popular among players of strategy games such as the medieval Crusader Kings 2 and the Renaissance-set Europa Universalis series, where “removing kebab” became a euphemism for defeating the powerful Ottoman empire. Eventually those games’ developers banned the phrase from their official forums because it had become a rallying cry for racists.
Violent hatred of Muslims and veneration of the Crusades are common markers of far-Right extremism. Though the far Right is fragmented and comes in many forms, it is usually characterised by a belief in the superiority of some races over others, support for violence against non-white people, anti-Semitic conspiracy theories and, sometimes, by denying or even praising the Holocaust.
The killer’s manifesto was full of such ideas, referring to legal immigration as an “invasion” and a “genocide” of white people which could only be met with horrific force. It was also full of other nerdy memes and video game references.
“Yes, Spyro the Dragon 3 taught me ethno-nationalism”, he sarcastically wrote, before clearly stating that he had not been radicalised by video games. Later he urged viewers to “subscribe to PewDiePie”, a popular gaming YouTuber who has often been accused of anti-Semitism and has accordingly been adopted as a meme by the far Right (PewDiePie himself condemned the killing and said he was “sickened” that the killer said his name).
On one level, these inclusions were an obvious act of mockery, perhaps intended to incite discussion and help the killer's ideas spread. On another level, they do show that he was deeply immersed in gaming culture.
And, while there is no reason to think that the majority of gamers are racist, fascist or in any way inclined to commit violence, the Christchurch killer was not alone.
The alleged San Diego mosque shooter referred to his body count as a “high score” and also used the PewDiePie meme. Andrew Clark, the man arrested for the live-streamed killing of 17-year-old Bianca Devins, mentioned PewDiePie too, and was a member of a Facebook group called “Darkcel Gaming”.
Most famously, the San Diego, Christchurch and El Paso attacks were all preceded by manifestos posted on 8chan, a website that has become notorious as a “cesspit of hate” accused of “directly inspiring tragic events”.
The rise of 8chan is intimately linked to gaming culture. The site was founded in 2013, but it got its big break in 2014 when it became one of the key hubs for a sprawling movement known as Gamergate.
Gamergate was a backlash against feminism and “political correctness” in video games, triggered by accusations of corruption in video game journalism. Although describing itself as a “consumer rights” movement, it spawned organised harassment campaigns against female game developers, including death threats, data leaks and the spreading of nude photos.
At first much of this activity was coordinated on 4chan, another website famous for its anarchic, sometimes vicious trolling culture. But 4chan banned Gamergate discussion, leading 8chan to take its place.
Gamergate's political coalition was eclectic, incorporating libertarians, classical liberals and self-professed Leftists as well as die-hard conservatives. Nevertheless, it also had a strong strain of hard-Right politics, and many of its central figures later became part of the so-called alt-Right.
In a discussion thread on the white nationalist forum The Right Stuff, in which 74 users explained their pathways into neo-Nazism, the joint first place was taken by 4chan's politics board, which was cited by 14 users. Eight cited Milo Yiannopoulos, a far-Right journalist who played a central role in Gamergate and went on to become one of Donald Trump’s most famous online supporters (he worked for the Daily Telegraph before 2011). Five cited Gamergate itself, and three cited Carl Benjamin, another Gamergate alumnus who made YouTube videos under the name “Sargon of Akkad”.
“I fell into the Gamergate-to-Nazi pipeline,” said one 4chan user. “They shoulda left my vidya alone.”
Steve Bannon, who was Yiannopoulos’s boss at the time and later became an adviser to Donald Trump, also praised Gamergate’s movement-building capacity. “Milo could connect with those kids right away,” he told a journalist. “You can activate that army. They come in through Gamergate or whatever and then get turned onto politics and Trump.”
‘They catch you with community and belonging’
Something similar almost happened to Colin McGinn, a 20-year-old Harvard University student from rural Texas who narrowly avoided being radicalised through a videogame community when he was around 14 years old.
“I didn’t really know how to talk to people,” he says. “I was quiet, I was a white male, I played a lot of videogames, I didn’t talk a lot with my peers. I wasn’t a particularly socially successful person. I liked to play a lot of really intensive strategy games, and this naturally resulted in me joining the sort of communities these are associated with.”
In these spaces, including 4chan and sections of Reddit, there were some people who pushed far-Right ideas – but only under a thick veil of humour. The word he uses for their tactics is “entryism”. They started by finding things that fellow gamers were already angry about, such as radical feminists or hardcore Leftists – a common target for lonely young men with more libertarian leanings. Then they broadened those criticisms into wider political arguments, escalating towards claims of overwhelming conspiracy and urgent danger.
“They catch you with this idea of community and belonging,” says McGinn. “Once you get into one of these communities, it over time changes your social norms, so that you’re no longer able to really relate to people in the real world.
“Once you become isolated like this, it becomes easy for you to see your only real friends as those people within the community. And the people who are even deeper into it than you are, this is their life. People are so utterly convinced that this social community is their life that they can’t conceive of things outside of those ideas. That’s the point at which radicalisation really happens…
“Very few people start out committed to some ridiculous idea of German world conquest. But once they do, once they become part of this community, they start changing their personality to fit in, because they don’t know how else they can.”
Caleb Cain, a 27-year-old former believer who now tries to fight extremism on his YouTube channel and on Discord, says games did not play a large role in his radicalisation (although he is a keen gamer). But, he says, gaming communities are fertile for the far Right because of their “nihilism and depression”, as well as their “mischievous” and “cynical” sense of humour.
He says he has seen first hand how far-Right Discord groups “seek out new recruits and attempt to ‘redpill’ them” – a common term on the far-Right which refers to opening people’s eyes in a way they can never go back from. Their tactics, according to Cain, include “love bombing, humour, shaming and peer isolation”: offering a rare rush of consistent support and camaraderie, then sharing far-Right memes, videos, books and short stories. They focus on 4chan, 8chan and communities associated with Second World War strategy games.
“I would say that it’s almost impossible for young people to enter into gaming communities and not encounter this in some way or another,” says Dana Coester, who, in addition to researching extremism also helps parents and teachers intervene with young people at risk of radicalisation. “You actually have to expend a lot of effort to avoid encountering it. It’s not something that you have to seek out. It’s there.”
The word she uses to describe what happens to some young people in these communities is “grooming”.
The racist social club
But, Coester says, that is not actually the most common form of radicalisation. “There is not a cabal of white supremacists sitting in a room planning a recruitment strategy in gaming,” she says. Instead, there is a complex “ecosystem” of discussion, media, peer support and propaganda.
McGinn agrees: “There were recruiters, but I’d say for every one of those there are ten people who were just adding to the discourse. The most dangerous part of this is that it isn’t one party or one group doing this: it’s social groups as a whole who are performing this kind of collective isolation.”
That echoes the work of Kishonna Gray, a social scientist at the University of Illinois at Chicago who has spent hours sitting inside far-Right gaming spaces. In her experience, these groups are more about allowing people who already have extreme views to hang out with each other and receive validation than about proselytising or recruiting.
She is also clear that most gaming communities are “thriving, loving, helping and nurturing communities”, and that the “toxic” political ones are a minority. But they are often those that “speak the loudest”, meaning they make outsize waves.
Often, members of far-Right groups are not even playing violent games. She has listened to them enjoy peaceable titles such as Animal Crossing, a cutesy village life simulator, and The Sims – all while continuing to discuss their poisonous politics.
“[There is] this narrative that they’re isolated and lonely, when in fact they have an amazing community together,” says Gray. “What they are doing is coming together… and it was inside this private group chat where these really disturbing ideologies were fueling and feeding on themselves.”
Members will selectively share news stories from the outside world that confirm their prejudices, or vent about their personal experiences with ethnic minorities. “People just want their feelings to be validated and confirmed,” says Gray. “Somebody might call them a racist at school, but they can come home and play some Playstation or some Apex Legends or some Fortnite and people are like, ‘no man, you’re not racist, they’re the crazy ones’.”
Unlike a journalist, Gray never infiltrates these groups covertly; members always know that she is a researcher. Strangely, they go on regardless. “Usually if there is racist hate happening, it takes a minute for them to realise that I’m actually black,” says Gray. “A lot of them wouldn’t even believe me. I would tell them I’m a black woman and they would just continue along.”
I say her description reminds me of Nintendo voice actor Charles Martinet’s description of Wario and Waluigi, two frequent arch-enemies of Mario, as “just two... evil guys who found each other”. Gray laughs: “Exactly. That’s perfect.”
'No one calls these people out'
One gaming community that has been at the centre of all these conflicts is that of Paradox Games: a Swedish developer of deep, complex "grand strategy" games which allow players to orchestrate detailed alternate histories of the Crusades, European colonisation and the Second World War. Hearts of Iron, Europa Universalis and Crusader Kings are all Paradox games.
Paradox fans can glimpse fascinating historical possibilities: a medieval period dominated by China, an Enlightenment that begins in India, or Muslim empires taking over the world. But fans of Hitler and the SS can also play out their fantasies, reversing the outcome of 1945 and building a Reich that spans the globe.
As a result, Paradox games have attracted numerous players who flirt with or outright idolise National Socialism. Luke, a gamer who moderates a Reddit board devoted to mocking Nazi apologists, says many are just "ignorant" but others are genuine Holocaust deniers.
Some players create modifications which skew the game towards Nazism, adding modern-day far-Right leaders to the game, cutting the power of non-white factions and creating elaborate fantasies of Hitler enslaving the Saracens. Paradox sometimes bans the most objectionable add-ons, but finds the line between well-intentioned historical research and unsavoury obsession hard to police.
"This has been a struggle at Paradox for quite some time," says Loïc Fontaine, Paradox's head of community management. "It is one thing to monitor our own communities, and we do... it’s a far more difficult matter to moderate or even to stay alert for this activity when it happens on platforms and in spaces where we can’t be present, or where we don’t have the ability to monitor or moderate proactively.
"We know there will be people who use games to pursue these ends, and it would be naive of us to claim we have no responsibility at all for that, to just bury our heads in the sand, and continue business as usual. That’s why we do act on it, and it has become an increasingly important subject and priority for us in the past few years."
He said that Paradox takes a zero tolerance approach to racism, sexism, hate speech and harassment on its own services, and works with other companies when necessary.
Even so, players say that racial slurs and other toxic behaviour is endemic. "The big problem in the Paradox community is that no one calls these people out, and honestly for good reason," says Ben, a lifelong gamer from Iowa who preferred not to give his last name. "The people that host games are almost always edgy Right-wingers who say the n-word. If you ask them to maybe not say the n-word, they call you a 'libtard' and ban you from the server."
Of the 12 or so competitive Discord groups he has joined, only one actually banned neo-Nazis, and some were run by neo-Nazis who banned anyone who criticised their views. Accordingly, many ordinary players choose to stay quiet out of a desire to avoid being frozen out. In the end, Ben explosively quit the scene, posting screenshots of all the foul chat he had observed.
Rumi Khan, another Harvard student who moderates six Reddit boards devoted to Paradox games, tells the story of a mod under development which was slowly taken over by neo-Nazis, who first infiltrated the moderation team of the mod's Reddit board and then began changing the mod to make Asian and African nations weaker. "It was a legit operation," he says. "There was a hidden group of people waging a secret Discord war to turn it into a more Nazi-friendly mod."
But games are not the whole story
There are also reasons to be cautious when assessing gaming communities’ role in fomenting terrorist attacks.
One reason is that it’s hard to assess how much active recruitment is really going on. Everyone I spoke to says that it does happen, but there is little data to show us how common it is. The games journalist Cecilia D'Anastasio argued last year that many claims of widespread recruitment in games communities are not properly backed up, and that there was no evidence it was happening widely.
Another reason is that in 2019, it is more unusual for a young person to never play videogames than for them to be a gamer of some sort. In Britain, 62pc of children play video games where they directly interact with other players. The average British child plays games for 2.1 hours a day. Evidence from the US suggests that the proportion of kids who play video games is above 90pc.
In other words, it isn’t surprising that mass killers would also be video game fans, especially because most of them are male. There is a wide gap between simply playing video games and being a regular of 4chan – between the average child and the most committed, hardcore players. As James Ivory, an academic at Virginia Tech University, puts it, saying that young killers play games is much like saying that they wear shoes.
Kishonna Gray also cautions against regarding gaming communities as the sole or even the primary cause of violence. “I know it’s a sexy story, that people really want there to be this direct pipeline from video games to neo-Nazi groups,” she says. “We are so simple in our thinking that we want it to be nice and clean.”
Instead, she argues, while gaming communities can play a role, terrorism usually has many causes, and white supremacist ideology is deeply rooted in history. Personal events – such as having a family member attacked by a non-white person – and family history – such as receiving racist ideas from one’s parents – are far more influential than any 4chan troll could ever be.
Rather than inducting new people into the far Right, gaming communities may simply help to gather together people who would already be inclined to it. Dylann Roof, who murdered nine black Americans at a church in South Carolina in 2015, claimed to have “never been the same” since typing the words “black on white crime” into Google. But Gray says: “If he had not had access to Google search, would he have been radicalised? My answer is yes.”
Still, Gray does believe that digital technology increases the speed of white supremacists finding each other. It is easier and faster to throw out a few racist jokes and check whether a given community rejects them or goes along with them. Plus, she says, the internet reduces people’s inhibitions, making them more comfortable expressing racist views in the first place.
It takes a village to make a terrorist
And there is another danger beyond active recruitment – a type of “grooming” that requires no puppet master.
“You cannot minimise the potential damage of long-term exposure to this content, even when it’s background noise, or something you’re scrolling past in a game,” says Coester. “Just because that individual doesn’t take action on that, or doesn’t engage with that, I think we’re only beginning to come to terms with what the long term damage of that is…
“If you’re a gamer, and you don’t have any ideological inclination towards that, but it becomes part of the background noise of your experience, that still has an effect of normalising it over time. And I think that’s dangerous.”
Terrorists without any connection to terror groups are often referred to as “lone wolves”. But many terrorism experts believe this is a misnomer. “Lone wolves” are usually part of extreme communities which may not specifically help them plan their acts, but which nevertheless encourage them and solidify the beliefs that drive them.
A 2013 study of 1,993 “lone-actor” terrorists found that 48pc were interacting in-person with fellow extremists, 35pc were doing so online and 68pc were probably consuming literature and propaganda produced by those movements. In a majority of cases, somebody knew about their plan before they executed it.
The theory is that most people in an extremist space will never actually commit any violence, but that they create the conditions which tip a small number of individuals over the edge. “I have never been this happy,” posted one 8chan user after the Christchurch attack. “I am ready. I want to fight.” Another put it more simply: “Who should I kill?”
Colin McGinn agrees with that. Most people he encountered, he says, were “do-nothings”, living “stagnant” and unhappy lives but not likely to attack anyone directly. Yet those people were egging on the “legitimate psychos”.
“When someone who’s already isolated, already violent, already dangerous, walks into these communities, and they hear people cheering mass shooters, taking these ridiculous ideas of purifying the world with blood and violence, a few of them might actually do it,” he says.
“In any large representative sample you’re going to get a few psychopaths. But the alt-Right community is unique in that it doesn’t really shun these psychopaths. It might pretend to dislike them… but they cheer them on because it’s interesting. And at the end of the day, they kind of agree with them.”
That seems to be what happened to Lane Davis, a young man from Seattle who stabbed his father to death in the summer of 2017 after accusing his parents of being “Leftists” and “paedophiles”. In a long investigation into his life, Buzzfeed News found that he had drifted through various levels of the far-Right media, working briefly for Milo Yiannopoulos and other Gamergate luminaries.
Some people actually rejected him for being too extreme, for taking it too seriously, but others thought he was funny and encouraged him. Few believed he would commit actual murder – until he did.
How to deprogram a fascist
There is no silver bullet for extremist radicalisation. Just as people fall into it for varied and complex reasons, there is no single route out.
McGinn says that what helped him escape the downward spiral was making more friends and gaining more social acceptance outside internet communities. He never quite "drank the Kool-Aid", but remembers the moment when he realised he was "going down a dark path".
"I read this treatise, some bizarre white supremacist stuff about the inherent dignity of the white family. And I had this weird thing where I was like 'that doesn't sound right at all, but everyone else believes it so I guess I have to believe it to. That's the redpilled way to talk about it.' And then I got home and I thought about it and said: 'Jesus Christ, that's insane!
"I started to deradicalise myself, to realise that the world wasn't as hopeless as I made it out to be. And ultimately these people aren't my friends. They're a bunch of psychopaths." He says far-Right believers are often locked into a "cult of death", believing that the world has no possible place for them.
Almost every person I spoke to for this feature also cited the role of YouTube and its infamous recommendation algorithms, which researchers say systematically direct people from ordinary political content towards extremist content. "Part of the grooming is happening by algorithm," says Coester.
Caleb Cain was primarily radicalised on YouTube, watching more than 12,000 videos between 2015 and 2019. As he clicked through YouTube's recommendations, he felt like he was "chasing uncomfortable truths". Far-Right YouTubers exploited common search terms to make sure their videos popped up at the end of non-political content, making videos about popular films in order to reach movie buffs.
YouTube denies that its recommendation system leads people into extremism. It says that it has data showing the opposite direction of travel is more common, though it has not shared that data publicly. Even so, it has promised to improve the system to reduce traffic to conspiracy theories and partisan content. Such changes, if seriously executed, would go a long way.
Beyond that, Cain warns parents to watch out for their children suddenly becoming very critical of minorities, suddenly becoming very interested in and argumentative about politics, developing a cruel sense of humour and talking incessantly about "white issues". He recommends being attentive to children's mental health and maintaining a close bond of trust.
He has also set up a Discord channel where people who are leaving the far Right or who worry about how far they have fallen can find support and discuss their experiences openly.
And if community is the most powerful weapon the far Right has, it may also be the best way to fight them. Recall Ben, the Paradox Games fan, who lamented that nobody was standing up to the fascists.
After Cliff's Discord group got taken over by the far-Right, he secretly stayed inside it for a while in order to observe what happened next. It became, he says, a whirlpool of hatred: constant derogatory references to Jews, constant racial slurs and constant glorification of Nazis and American Confederates. By the time he finally quit in disgust, the leaders of the new regime had begun to fight among themselves.
But Christchurch, he says, sparked a reckoning within the Paradox community. Many far-Right members were purged and their codewords banned. "Once people realised what was happening, we all decided that it was something that should not continue," he says. "So I think that there is room for having more moderation, and not have it be at the expense of free speech. It's a very fine line to walk."