The Alt-Right and Digital Radicalization

This is a paper I wrote for my Political Science class in 2021, slightly adapted for the Web.

The alt-right is an amorphous, slippery beast, forming and reforming around the definition of a hate group, always dancing around the border but never crossing the line into issuing the direct, formalized orders typically associated with groups like the Ku Klux Klan or ISIS. It prefers instead to remain decentralized and insulated from the consequences of radicalization, producing lone wolves for whom it takes no responsibility. In this essay, I will examine the systems the alt-right utilizes to produce such "stochastic terrorists" through an analysis of the information presented in Mike Wendling's "Alt-Right: From 4chan to the White House", Luke Munn's "Alt-right pipeline: Individual journeys to extremism online", and Mark Alfano et al.'s "Technological Seduction and Self-Radicalization". I will first outline a brief history of the alt-right and its founders and leaders, followed by an examination of the heuristic systems behind technological seduction, from the perspective of a computer science major who has studied human-computer interaction. I will then elaborate on how the alt-right specifically utilizes these systems to radicalize angry white men and produce stochastic terrorists who pick their own targets, methods, and motives for their atrocities.

While white supremacy as a system has existed for centuries, the specific iteration of neo-Nazism known as the alt-right was first formed in November 2008, shortly after the election of Barack Obama, by Pennsylvanian philosophy professor Paul Gottfried, in the form of a small campus organization called the H.L. Mencken Club (Wendling, 17). Named for notorious antisemite and racist H.L. Mencken, the group coined the phrase "alternative right", and contained future alt-right figureheads Jared Taylor, Peter Brimelow (Wendling, 18), and Gottfried's protégé, well-known neo-Nazi Richard Spencer (Wendling, 20). Spencer would go on to form the intellectual backbone of the alt-right, with his theories of "human biodiversity" espousing scientific racism (Wendling, 22), and calling for a pan-European white ethno-state involving the reconquest of Istanbul from the Turks (Wendling, 25). However, the alt-right remained largely on the fringe of the extremist movement, as the H.L. Mencken Club was small and based primarily in Pennsylvania.

The real shift would come in 2011, when the anime imageboard 4chan, long a haven for neo-Nazis due to its complete anonymity, lack of continuity between messages, and famously lax moderation, announced the creation of /pol/, a board made specifically for discussing politics that would become a nexus of extremist planning and activity (Wendling, 53). A notable proof of concept for the alt-right on 4chan occurred in 2014, when the /pol/ board served as a headquarters for Gamergate, a massive, concerted campaign of targeted harassment towards prominent feminists in the video game journalism industry, utilizing bots, doxing (the divulging of targets' real-world addresses), and even a bomb threat (Wendling, 66). Gamergate was a testing ground for the rhetoric of the alt-right, and caught the attention of Milo Yiannopoulos and Stephen Bannon, both of Breitbart News, another far-right institution that would go on to be a cornerstone of the alt-right (Wendling, 68). Bannon would even rise to serve for roughly seven months as a chief advisor to Donald Trump before resigning in disgrace.

Suffice it to say, the alt-right exploded, going on to inspire the Pittsburgh Tree of Life Synagogue shooting (Munn, 12), the Christchurch mosque shootings (Munn, 3), the Charleston church shooting (Wendling, 45), and many other hate crimes. Notably, however, unlike attacks carried out by a conventional hate group, in all three of these shootings nobody explicitly ordered the gunmen to act, and nobody was legally culpable save the shooters themselves. To understand this, we must peel back the curtain of the Internet and observe the gears of social conditioning in action.

Alfano et al. go into detail regarding the heuristic mechanisms behind what they call "technological seduction": the manipulation of choice by presuming to know what the target thinks and wants, and getting the target to agree with that presumption, specifically by means of technology (Alfano et al., 300). They divide such seduction into two varieties: top-down and bottom-up. Top-down seduction refers to the architecture of a space, physical or digital, arranged in such a way as to manipulate patrons' choices via "nudging", that is, encouraging them towards specific options. This nudging can be morally neutral, as in the case of a supermarket whose physical structure necessitates some items being closer to the cash register than others (Alfano et al., 301); it can even be benevolent, as in the placement of tobacco products at the back of the store or behind the counter, increasing the effort required to reach them (Alfano et al., 302). However, such top-down nudges can also be malevolent, such as the New York Times only allowing subscribers to cancel their subscriptions by calling during office hours (Alfano et al., 303).

This type of malevolent nudging is especially effective in web design. Alfano et al. take Breitbart News as a case study. Breitbart divides its news into categories, as most news sites do, declaring which topics are subsets of other topics and which are of importance. However, unlike a traditional news site, at the time of Alfano et al.'s writing Breitbart divided its news into the categories of "Big Government", "Big Journalism", "Big Hollywood", "National Security", "Tech", and "Sports". These categories presuppose that "Big Government" and "National Security" are especially newsworthy topics and that they are issues the reader should care about (Alfano et al., 305). As previously mentioned, Breitbart is a cornerstone of the alt-right, and its site structure invites readers to accept that design as natural, thereby legitimizing its prejudices as a valid mode of thought.

Perhaps even more dangerous, however, is bottom-up seduction, for which there is no real-life analogue, and which is never morally neutral or benevolent. Bottom-up seduction is what people are referring to when they talk about "the algorithm": a recommender system, typically a machine-learning model, that aggregates individual users' preferences, past content consumption, age, gender, location, and other personal information, and uses them to find more content to show the user. In essence, the algorithm decides for the user what they want to see, based on what people like them have chosen to view in the past (Alfano et al., 308). This process bypasses human reasoning and provides viewers with what the algorithm has determined they are most likely to want, aiming to take away their decision-making capacity (Alfano et al., 310). Speaking as a computer science major who has studied human-computer interaction, I would argue that this kind of hostile digital architecture violates the heuristic standards for ethical human-use software: it produces opaque programming that aims to use, rather than be used by, humans, typically for the sake of profit.
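To make this concrete, the sketch below shows the simplest possible version of that logic: a user-based collaborative filter that recommends whatever the people most similar to you have already watched. All of the data and names here are invented for illustration; real platforms rely on far larger and more opaque models, but the core move of substituting other people's behaviour for your own choice is the same.

```python
# A deliberately toy sketch of "bottom-up" recommendation: the system infers
# what a user "wants" from what similar users have already watched.
# Every user, video, and number here is hypothetical.
from collections import Counter
from math import sqrt

# Toy watch histories: user -> set of videos they have already viewed.
HISTORIES = {
    "alice": {"lifting_tips", "stoicism_101", "debate_compilation"},
    "bob":   {"stoicism_101", "debate_compilation", "anti_feminism_rant"},
    "carol": {"debate_compilation", "anti_feminism_rant", "great_replacement"},
}

def similarity(a: set, b: set) -> float:
    """Cosine similarity between two users' watch histories."""
    if not a or not b:
        return 0.0
    return len(a & b) / (sqrt(len(a)) * sqrt(len(b)))

def recommend(user: str, top_n: int = 3) -> list[str]:
    """Suggest unseen videos, weighted by how similar their viewers are to this user."""
    seen = HISTORIES[user]
    scores = Counter()
    for other, history in HISTORIES.items():
        if other == user:
            continue
        weight = similarity(seen, history)
        for video in history - seen:
            scores[video] += weight
    return [video for video, _ in scores.most_common(top_n)]

print(recommend("alice"))  # ['anti_feminism_rant', 'great_replacement']
```

Note that "alice" never asks for extremist content; even this tiny system surfaces it purely because her watch history overlaps with users who already consumed it.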

Occasionally, however, the goal is not profit. The alt-right has learned how to manipulate this system to work in their favour. Munn speaks at length about the "Alt-Right Pipeline", a term referring to a process of gradual, step-by-step radicalization that takes advantage of the bottom-up seduction of sites like YouTube to form parasocial relationships between troubled white men and alt-right thought leaders such as Jordan Peterson, Ben Shapiro, and Richard Spencer (Munn, 5). Potential alt-right recruits do not immediately brandish tiki torches and swastikas; they begin their unwitting radicalization by viewing the works of broadly appealing, mildly conservative creators such as Stefan Molyneux, who provide, to quote Ian Danskin's "How to Radicalize a Normie", "a shot of life advice, with politics as the chaser" (Danskin, 7:50).

After months of watching such a figure, the creator becomes trusted (Munn, 6) as a parasocial relationship forms between the recruit and the creator (who has, after all, been dispensing helpful advice promising to cure their ennui). These creators will then often hold guest interviews and collaborate with other creators who are only slightly further to the right. Because the creator is trusted, that trust is transitively extended to the collaborator, and the recruit's belief system recalibrates to accommodate this new figure's views and opinions (Munn, 7). As this collaborative chain extends, the bottom-up seduction algorithms of content platforms such as YouTube take note and begin to curate the content recommended to the user. The user's feed begins to include content related to what they are currently consuming, eventually to the exclusion of all else. The more a recruit watches, the more they are given (Munn, 6). Furthermore, as these alt-right content creators are constantly featuring one another, the algorithm ultimately makes a strong connection between moderates such as Molyneux and openly white supremacist extremists, no matter how much they disavow each other's views on paper.
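The structural effect of all this cross-featuring can be illustrated with a second toy sketch: if a recommender treats repeated co-appearance as a link between channels, an entry-level self-help channel ends up only a few recommendation hops away from open extremism. The channel names and graph below are invented; this is a sketch of the mechanism Munn describes, not a model of any platform's actual system.

```python
# Hypothetical co-appearance graph: channel -> channels it has featured or
# been featured by. Each edge looks innocuous on its own.
CO_APPEARANCES = {
    "self_help_channel":      ["anti_sjw_commentary"],
    "anti_sjw_commentary":    ["self_help_channel", "race_realism_podcast"],
    "race_realism_podcast":   ["anti_sjw_commentary", "white_nationalist_show"],
    "white_nationalist_show": ["race_realism_podcast"],
}

def hops_between(start: str, goal: str) -> int:
    """Breadth-first search: how many recommendation hops separate two channels."""
    frontier, seen, depth = {start}, {start}, 0
    while frontier:
        if goal in frontier:
            return depth
        frontier = {n for c in frontier for n in CO_APPEARANCES[c]} - seen
        seen |= frontier
        depth += 1
    return -1  # unreachable

print(hops_between("self_help_channel", "white_nationalist_show"))  # 3
```

No single edge in this graph is alarming in isolation; it is the short path across the whole graph, traversed one plausible recommendation at a time, that carries the recruit from life advice to white nationalism.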

Recruits are pushed gently along the pipeline towards ever more extreme content, but they are moved by many tiny nudges rather than one large shove, and progress at their own pace (Munn, 8), eventually transitioning from mainstream platforms to places such as 4chan, or the even more extreme 8chan: they make a seamless transition into hate with little cognitive dissonance (Munn, 8). Munn defines three phases of this transition: normalization, acclimation, and dehumanization (Munn, 8). In the normalization phase, white supremacy is only hinted at in its most plausibly deniable form, veiled beneath many layers of irony and humour, with the expectation that the viewer will not take it seriously and will find it amusing. PewDiePie, an especially well-known YouTube celebrity, provides this phase of the pipeline for many (Munn, 8), making edgy "jokes", masked as irreverence or satire, that push towards sincere belief while maintaining plausible deniability through irony (Munn, 9).

The second phase, acclimation, refers to that slow recalibration of the recruit's belief system, whereby a belief that was previously just beyond the edge of acceptability can now be safely contemplated and accepted, much as a mountain climber slowly becomes used to low oxygen levels (Munn, 9-10). When fully acclimated, the recruit moves on to the next "red pill", and the process of normalization and acclimation begins anew, moving the recruit yet another step down the pipeline; the recruit now considers their previous beliefs too weak and insufficiently outspoken, typically dismissing those who still hold them as "alt-lite" (Munn, 10).

The final phase, dehumanization, is the most dangerous. The recruit has swallowed all the "red pills" they desire and has become a fully-fledged member of the alt-right. They begin to dehumanize the groups the alt-right declares to be enemies, using degrading terms such as "a transgender" to refer to a trans person, or referring to non-extremists as "non-player characters": mindless, generic sheep with no agency, asleep where the extremist is now purposeful and awake (Munn, 10-11). This dehumanization destroys the extremist's ability to form meaningful human relationships with non-extremists, ensuring the extremist never sees their victims as people, only as homogenous groups to be blamed for the world's problems (Munn, 11).

Through a mixture of seduction and parasociality, the victim has self-radicalized, often without alt-right thought leaders even knowing they exist. But because the relationship is parasocial, the extremist has no leader to give them direction, and no clear instruction as to what they should do with the hate frothing inside them. Thus, with no other option apparent to their heavily distorted worldview, they take matters into their own hands, becoming "lone wolves" who commit atrocities of their own planning, against their own targets, in the name of a movement that does not know or care about them personally, and which, crucially, cannot be held legally accountable for the actions of such lone wolves, as it ordered nothing, only encouraged and rewarded extremism. Could a leader emerge to unite this fragmented mess into a cohesive army, as was attempted in Charlottesville? Only time will tell.

Works Cited

  • Wendling, Mike. "Alt-Right: From 4chan to the White House". Fernwood Publishing, 2018.
  • Munn, Luke. "Alt-Right Pipeline: Individual Journeys to Extremism Online". First Monday, vol. 24, no. 6, June 2019.
  • Alfano, Mark, et al. "Technological Seduction and Self-Radicalization." Journal of the American Philosophical Association, vol. 4, no. 3, 2018, pp. 298-322.
  • Danskin, Ian. "The Alt-Right Playbook: How to Radicalize a Normie". YouTube, uploaded by Innuendo Studios, 21 October 2019, https://www.youtube.com/watch?v=P55t6eryY3g.