If you spend enough time on Twitch, odds are you’ll stumble upon someone blasting a white nationalist speech in the background while streaming gameplay of Monster Hunter. Or they’re bringing up the idea that white people are being systematically “replaced.” Or they’re using open-world games like Minecraft and Roblox to roleplay Nazi concentration camps. Or, far more likely, they’re spewing QAnon conspiracies and misogynist talking points, all under the auspices of open debate.
Ciarán O’Connor, a disinformation analyst at the Institute for Strategic Dialogue (ISD), a London-based think tank, has spent the last six months researching how extremist ideologies spread on TikTok and across platforms like Twitch, Steam, Discord and DLive. This year, ISD spearheaded a multi-part investigation dubbed “Gamers Who Hate,” digging into the behavior of far-right figures and how much freedom they have on streaming and chat platforms.
Some research has already suggested that gaming environments are ripe for exploitation by the alt-right, but the findings from O’Connor and his peers are a little more nuanced than that. O’Connor suggests that rather than leveraging a site like Twitch simply to recruit for a group or broadcast explicitly offensive gameplay, far-right users are using streaming platforms to bond and create opportunities for action. It fits neatly into an ecosystem of extremist posters who find ways to get around scrutiny and shutdowns, even as sites like DLive and Discord maintain policies against hate and violence.
“Before January 6th, DLive was a live-streaming platform that offered alt-right users a chance to monetize their streams. That changed after pressure following January 6th, but these users just regrouped. They’re used to it. There’s a tacit acceptance that they will be removed from a platform at some point,” O’Connor explains. “Extremist groups and communities will continue to mobilize and organize in smaller spaces, smaller platforms and the less mainstream spaces. But they will still continue to try and opportunistically use the larger platforms, too, because these platforms offer a scale that your alt sites cannot.”
I recently spoke to O’Connor about how, and why, streaming platforms are such interesting hubs of organizing and speech, as well as why it’s so easy to stumble into racist and misogynistic takes there.
Functionally speaking, how does a livestream platform differ from previous misinfo and disinfo hubs, like Facebook, or even Twitter?
The central part is that because it’s live, it’s incredibly difficult to catch this stuff in real time. Lengthy broadcasts do, from the perspective of platforms, make it easier because there are more opportunities to catch [extreme or offensive speech]. But from the extremist point of view, live-streaming platforms offer you a chance to circumvent potential content moderation and takedowns. It allows you to grow your audience. It allows you to spread ideological material, have clips get shared online elsewhere and bring in more people.
It also comes with the potential to monetize your content. Whether it’s live or prerecorded content, video platforms that offer monetization options are very, very appealing for extremist users because they offer a revenue stream. Even if the platform itself won’t allow your stream to use its monetization tools, you can plug in third-party apps that function in the same way.
When it comes to hateful speech around games, I saw that harassment a lot as a teenager, just while playing online and talking to people. There’s a long arc from the gaming communities of the aughts to how things operate today, with the growth of a streaming culture that has its door open to far more extreme forms of speech. Do you see a relationship there?
The culture that exists in these extremist communities, we’ve seen elements of it in the past. I mean, when I was on MSN Messenger as a teen, there was all manner of irony, transgressive jokes, offensive speech, everything. But what we’ve really seen with the maturation of livestreams and gaming platforms is the ability for this to operate at scale, and for it to move beyond the “garden walls” of gaming in the noughties, for example.
Things like Gamergate have fueled, in part, toxic cultures in comment sections and forums and the like. There’s an awareness amongst extremist users — or anyone who wants to troll, to offend and to use racist terminologies — that there’s a way to conduct raids on whoever their perceived enemies may be. Usually it’s a small number of subversive people who can cause chaos.
So you see communities using something like a Discord server or a Telegram channel as a “staging area,” then moving into the stream of, say, an LGBTQ streamer on Twitch or whomever that perceived enemy might be. With the network these streaming platforms are built on, they can move rapidly, raid, cause some damage and move out. So there definitely is a tie to the toxic gaming communities of 10 or 15 years ago, but I do think that the current phenomenon amongst these communities is a more recent development.
To that point, how have things specifically changed in the last five years?
There’s been a realization that you can really have influence by banding together a small cell of ideologically similar people — who may have connected originally on 4chan or something like that — and organizing an attack. Gamergate was influential for that, but so was the way 2016 and the rise of Donald Trump opened the door for chaos and the chipping away of established political discourse. It was like a grenade was thrown into that discourse, not just with toxic offensive comments on chan boards but with the rise of harmful conspiracies through QAnon and the weaponization of disbelief.
[Extreme actors] are then able to use these less-viewed platforms almost like a staging area to move off of and try and create, in their own words, “a happening.”
Your report highlights the phenomenon of “Omegle redpilling,” and how right-wing extremists like Paul Miller use Twitch to livestream their trolling of strangers with racist, misogynistic and just plain extreme talking points. Why did this case interest you?
Before I was actually in the weeds on this research regarding Twitch, I’d already seen this kind of live performative racist trolling. I was amused when I first saw it, because I remembered growing up with Omegle and Chatroulette and these sites to talk with strangers. It felt like old news to me [Laughs]. But what gave Omegle a new form of currency for these extremists was the potential to simultaneously broadcast your interaction to a site like Twitch.
The fact that there was an Omegle stream running on a live-streaming platform added another dimension to it. It’s what allowed footage to be clipped by Paul Miller’s followers, who then spread it all over BitChute, TikTok and Telegram. This content really travels, and that’s quite concerning, because Paul Miller, [under his alias] GypsyCrusader, isn’t popular on Twitch now. He’s popular on TikTok. And he didn’t upload those clips.
So this stuff has incredible repost value for communities that are inspired to share offensive material they had no part in creating themselves.
And then it creates a kind of content cycle, I imagine, that becomes harder to control.
Yeah, it’s supplementary material that furthers this kind of reactionary trolling. So platforms need to be aware that this is now an active subculture among far-right extremists. There’s a danger of it leading young people, or potentially impressionable or vulnerable people who may see humor in the offensiveness of it, down the garden path to more explicit material that’s actually ideologically or politically motivated.
The report notes, “Support for extreme right-wing ideologies can be discovered on Twitch with relative ease,” but that this content isn’t necessarily indicative of the systemic use of Twitch or other streaming platforms by the extreme right. What do you mean?
What we were really saying with this point was that there were no organized extreme-right groups that we could find using Twitch, whether for gaming or even just to host conversations. There wasn’t that presence. Some of my colleagues found equivalents on DLive. The U.K. white nationalist group Patriotic Alternative, or something like that, was streaming games and chatting over the gameplay. We didn’t find that kind of organized ecosystem on Twitch, other than Paul Miller, who is more or less no longer there. So really, it’s not as simple as saying that violent video games inspire would-be extremists or terrorists.
It’s more that communities based around gaming, on Twitch and other platforms like it, create the space for extremist conversations to happen, and for communities to develop and move to perhaps more protected spaces like Telegram, where more explicit action can take place. Games are a venue for like-minded people to meet and chat about how to win, while also sharing their ultranationalist or extremist fantasies.
And of course, once you’re in that world, radicalization can happen through an informal, casual process: conversations based around memes, irony, shitposting or offensive jokes. That’s really at the heart of it. That’s perhaps the greatest threat from far-right communities on gaming platforms: a new way to reach young and potentially vulnerable people with violent and harmful ideologies.
Is the fact that this is less prevalent on Twitch a testament to anything they’ve done in terms of policy or moderation?
To use TikTok as an example: TikTok has a great bottom-up culture of users flagging conspiracy content and extremist stuff. But from the top down, they’re a bit lacking, and TikTok users are doing a lot of the work that the company should be doing. On Twitch, though, you actually do see top-down leadership, with sweeping action against offenders. So yes, there’s still extremist content, and it’s easily discoverable. But Twitch does deserve some credit for [being proactive].
The flip side is that it’s very hard to catch. It does require constant and active effort by Twitch to monitor this stuff. And it’s key that when there are extremist figures like Paul Miller, the platforms take sweeping action themselves and remove the content at the root level, rather than a video here and there.