
“Subscribe to PewDiePie” - How Internet Humour Is Leveraged to Encourage Fascism & Violence 

  • Ahmad J
  • Jul 29
  • 9 min read

Updated: Jul 31


In 2019, Brenton Tarrant - the Christchurch Killer - slaughtered 51 innocents and injured 89 others at two mosques in Christchurch, New Zealand, on a Friday afternoon. 



Image taken from Tarrant's livestream broadcasted on Facebook and Twitch; bearing a haunting resemblance to first-person shooter games.

 

Using a GoPro camera, Tarrant livestreamed his massacre on Facebook and the streaming service Twitch. Together with the manifesto he published online, it became clear that Tarrant was no “lone wolf”: he was a member of an active online community, one steeped in fascist ideologies and Neo-Nazi beliefs.  

 

Tarrant’s manifesto was full of memes and references to in-jokes and ideologies circulating within these online communities and echo chambers. His livestream likewise featured several memes popular in the racist and Islamophobic corners of the internet: from the ‘kebab killer’ song he played – a reference to the killing of Muslims during the Bosnian war and genocide (the largest genocide in Europe since the Holocaust) from which the song emerged – to the decorations adorning his weaponry, which referenced historic and contemporary instances of violence against Muslims, from Charles Martel at the Battle of Tours in 732 to the Quebec Mosque Killer Alexandre Bissonnette’s rampage in 2017.  


Tarrant's gun. His armor, ammunition, and other weaponry all had similar symbolism connected to instances of violence between Muslims and the West. TRTWorld

 

In particular, Tarrant referenced PewDiePie, the world-famous gaming streamer and content creator who at the time had one of the largest followings on YouTube. Just as he exited his vehicle to begin his rampage, Tarrant turned to the camera and said, “Subscribe to PewDiePie.” 

 

Subscribe to PewDiePie 

On the surface, this may seem out of place, yet this is how these extremist online communities operate: through inside jokes and subversive ‘dog whistles’ which, to the unknowing public, ring no alarm bells. To those who understand the connotations, however, the dark meaning truly being communicated is unmistakable.  

 


The framing of racist memes and genocidal humour as “satire” or “dark comedy” undermines accountability and reframes criticism as a failure to appreciate humour. 

 

For instance, the phrase “subscribe to PewDiePie” achieved cult status as a meme in 2018, when PewDiePie’s channel was being rivalled for the number-one spot on YouTube by T-Series, a Bollywood music label. Given the size of the Indian diaspora around the world and the popularity of Bollywood beyond the Indian community, it is no surprise that T-Series had amassed such a large following. This rivalry, however, spurred fans of PewDiePie to launch a harassment campaign (veiled as internet ‘activism’) against T-Series. 

 


This ranged from meme-making to inspire a movement, to the hacking of other accounts, and even the use of anti-Indian slurs by PewDiePie fans and trolls. The ‘activism’ on behalf of PewDiePie even spilled into the real world through marches and protests, characterizing T-Series as a massive Indian corporate entity stepping over the everyman. 

 

What began as a fandom phenomenon quickly turned into a racist campaign, as Indians and Indian culture became the targets of harassment, prompting other like-minded (racist) groups to jump on the bandwagon.  

 


Parties like UKIP (the UK Independence Party) saw parallels to their own rhetoric in the PewDiePie vs T-Series movement, issuing messages of support and encouraging their own followers to ‘Subscribe to PewDiePie’ as a way of “championing freedom of speech” and representing the “everyman” against the giant corporation.    

 


According to the official UKIP party account on Twitter/X: “Help protect freedom of speech by sharing and signing this petition against the EU Copyright Directive. Also, don’t forget to subscribe to @pewdiepie on YouTube and keep the corporate @TSeries from the top.” This was quickly followed by another tweet saying: "PATRIOTS SUBSCRIBE TO @PEWDIEPIE."


Though PewDiePie distanced himself from the phrase following Tarrant’s massacre, he nonetheless released a “congratulatory diss track” against T-Series that further encouraged the fan base’s ‘activism’. The track, which contained both overt and subversive racist connotations, was filmed before the Christchurch attacks but released only afterwards, after PewDiePie had claimed to distance himself from the meme and even to want to bring the “Subscribe to PewDiePie” movement to an end.


Still from PewDiePie's “Congratulations” music video; YouTube

  

The lyrics claim T-Series is “corrupt” and sarcastically “congratulate” the label while referencing alleged tax evasion, perpetuating stereotypical Western views of Indian institutions as inherently dishonest, without nuance or fairness. 


The song also includes generalizations about Indian practices, such as references to caste and poverty, deployed for comic effect or critique without meaningful context. This borders on Orientalist framing, a tradition rooted in imperialist propaganda. 


This use of humour to veil racist ideologies has become a pattern not only amongst these online groups, but also amongst the politicians and thought-leaders who market themselves to these audiences. By using memes and subversive references, they retain plausible deniability against charges of hate speech or incitement to violence, while simultaneously positioning themselves as political leaders for these online communities. 

 

Donald Trump himself has repeatedly made overtures to these groups, even encouraging the QAnon conspiracy and its adherents, along with the stolen election conspiracy – resulting in the violence of January 6th.  

 

The “PewDiePipeline” 

Moreover, as these edgy ‘dark humour’ jokes and memes circulate, they desensitize the audiences consuming them. Vulnerable individuals spending increasing amounts of time within these echo chambers rapidly become inured to these ideas; humour and the veneer of innocent joke-making provide a cover of plausible deniability against charges of hate speech.  

 

This is where content creators like PewDiePie – and many others – are at fault. PewDiePie has repeatedly made headlines for edgy jokes that go too far. 

 

He has used the ‘N’ word during a livestream, paid individuals to hold up signs saying “death to all Jews” as part of a ‘prank’, and claimed to have joined ISIS.  

PewDiePie watching two Indian men he paid through the app Fiverr to hold up a sign saying "Death to all Jews". He claimed the prank showed how far some people online would go for money; this only heightens the insensitivity, as he was likely taking advantage of the poor and desperate to make his point.

 

The campaign against T-Series was framed the same way: as a series of harmless jokes, including the two diss tracks, “Bitch Lasagna” and “Congratulations”. Yet the subversive racist elements of these joke-tracks retain the same dangerous psychological potential for desensitization.

 

 

Desensitization & Radicalization 

According to many psychologists, the psychological process of radicalization begins with desensitization; a gradual acculturation to ideas of violence and dehumanization. Memes and jokes, particularly those shared within echo chambers with the safeguards of anonymity where users can freely indulge in racist discourses, become the perfect vehicle for desensitization – whether intentional or not. This has very credible potential to desensitize users and draw them into radical ideologies (Wolfowicz, Weisburd & Hasisi, 2023). 

 


This places great significance on the pathways that lead users to these types of echo chambers – most of which exist in fringe areas of the internet rather than on mainstream networks like Facebook (though fringe communities have grown on Facebook too, and Meta has not taken sufficient action to curtail them). 

 

For many children seeking entertainment, famous creators and streamers like PewDiePie become their first stop; often the result of automatic algorithmic sorting rather than deliberate search and selection. This is where the “pipeline” comes in. When creators like PewDiePie make jokes that leverage ‘dark humour’ for entertainment, or when they make references to things that are popular amongst fringe communities, they create a pathway or pipeline between their online spaces and these fringe spaces. 

 


Furthermore, many of these echo chambers provide a sense of community, and through their theories and ideologies, give meaning to a world that is uncertain. This is especially potent for young audiences who come down these pipelines full of questions and uncertainty. Extremist communities provide validation, emotional support, and narrative coherence that mirrors cultic psychology. 

 

This pattern of migration can also occur as the result of algorithms and recommendation engines recognizing that an individual engages with related content or creators. Recommendation engines reinforce confirmation bias, gradually narrowing worldviews until alternative perspectives are seen as hostile. 

 

Irony Poisoning 

From creators like PewDiePie, it may be only a short while before users are exposed to creators who are more openly racist and ideological, and find themselves in echo chambers where these jokes and memes are made and distributed. Younger minds drawn to gaming and meme cultures often lack the critical faculties to decode dog whistles; irony becomes both armor and indoctrination. ‘Irony poisoning’ is itself a developing concept in the study of internet subcultures; it refers to the way memes and jokes made ironically come to be misunderstood (Varis, 2019). Through repeated sharing online, the ironic meaning behind a joke is lost, and those who circulate it do so based on their own interpretation, absent the irony that motivated the meme in the first place. This has led to memes about dark subjects – from rape to the Holocaust – being shared openly and consumed unironically.  

 

It is not surprising, then, that so many of the incel killers, supremacist mass shooters, and domestic terrorists have been members of these echo chambers – or that these killers engage in the same forms of humour and subversive communication.  

 

Elliot Rodger, better known as the Incel Killer or the Isla Vista Killer

Elliot Rodger, the Incel Killer, was a member of incel forums like PUAHate.

 

As was Alek Minassian, the Toronto Van Killer. 


Brenton Tarrant was a member of 8chan, with his manifesto referencing the notorious /pol/ board.


The Poway Synagogue Killer – John Earnest – was a chan member too, describing in his manifesto how 8chan’s /pol/ board had exposed him to the ‘truth’ of the world.  


The El Paso Shooter – Patrick Crusius – was a prolific poster on 8chan.  



The Tree of Life Killer – Robert Bowers – was, interestingly, a member not of the chan sites but of Gab, the far-right alternative to platforms like Twitter/X and Facebook. Perhaps owing to the nature of that platform, his rhetoric was overt and open, not coded with jokes or dog-whistles. 

 

Setting the killers aside, the audiences and communities that form in these echo chambers also reflect the desensitization towards violence and human life that occurs within these spaces. Tarrant’s stream was met with a surge of responses encouraging his violence, cheering him on to a “high score”, and even joining in on the jokes and references being made. 

 

This performative violence, especially livestreamed, transforms mass murder into an interactive spectacle.  

  

In the aftermath of his attacks, Tarrant was canonized as ‘Saint Tarrant’, immortalized as a crusading Templar Knight, and made into an online mythic figure, with memes, video games, and even pornography celebrating his violence. The violence itself became a meme: something to be copied, adapted, improved, replicated. Killers following Tarrant’s ‘tradition’ would also stream their killings and add soundtracks (the Poway Synagogue killer posted a soundtrack online after he failed to upload his stream). Similarly, Tarrant’s inscriptions on his weaponry have been meme-fied; these inscriptions act as metonymic talismans, each name not just a call to arms but a mythic invocation. 

 


Where to From Here? 

To parents: monitor your children’s streams, even when the content appears innocent. 


To everyone else: be wary of the content you consume, as your behaviour is being read by recommendation engines and sorting algorithms designed to prioritize engagement. Engaging with mainstream content that contains subversive ‘dark humour’ is often followed by those algorithms surfacing increasingly fringe forms of the same content in order to provoke further engagement.  

 

This is how digital-age surveillance capitalism works: algorithms reward engagement, not ethics. The more polarizing the content, the more profitable. Radical creators benefit from virality, while platforms profit from radicalized viewers. Section 230 of the Communications Decency Act grants U.S. platforms broad legal immunity, leaving them with little liability for user-generated content; this legal architecture fuels under-moderation and incentivizes risky content. Lastly, the governing myth of neutrality under which these social networks operate claims that their algorithms function neutrally. This myth masks the ideological bias baked into the code: values of techno-libertarianism, deregulation, and free speech absolutism become structural norms. 

 

References 

 

Cuthbertson, A. (2019, August 14). PewDiePie and T-Series settle legal fight over ‘racist’ diss track. The Independent. 

 

New Zealand Police. (n.d.). 2019 – Operation Deans targeted terrorist attacks, Christchurch, 15 March. NZ Police Museum exhibition: The place of many brave deeds. Retrieved July 29, 2025, from the New Zealand Police website.



Varis, P. (2019, March 6). On being diagnosed with irony poisoning. Diggit Magazine. 

 

Wolfowicz, M., Weisburd, D., & Hasisi, B. (2023). Examining the interactive effects of the filter bubble and the echo chamber on radicalization. Journal of Experimental Criminology, 19(1), 119–141. https://doi.org/10.1007/s11292-021-09471-0 

 

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs. 

 

 

  
