The Storm and the Web: Communication Technology and the Ecumenical Far Right
As seen on January 6th, 2021, once-disparate tendencies within the radical right are mixing and collaborating as never before. Core technological features of the internet and World Wide Web have played a crucial role in this process of integration.
Washington, DC - January 06: Storming of the Capitol. Image: Tyler Merbler/CC
Observers of the January 6th assault on the Capitol witnessed troubling signs of fraternizing and even cooperation between “ordinary” Trump voters and a variety of extremist groups. This is a significant development that warrants close analysis. It is indicative of larger trends on the radical right, which has integrated and gained currency within the American conservative movement. And while Trump himself has been a lodestone, drawing together once disparate and even antagonistic right-wing tendencies, the personality cult of our one-term president is merely one element in the deepening integration of extreme and mainstream. To be sure, this trend is overdetermined—by local and global economic conditions, electoral polarization, ecological pressures and human migration, and the USA’s centuries-old legacy of white supremacy. However, while these factors may empower the radical right, they do not account for the new style of intermingling and cooperation among its various tendencies. Here, one factor stands out: the role of communication technology. The Internet offers essential technological affordances that enable and even promote this integration.
There are three key technological affordances contributing to this emerging integration: the hyperlink structure of the Web, network dynamics governing digital platform growth, and the role of algorithmic automation in serving content to internet users.
The Web and “The Storm”
Hyperlinking has been essential to the operation of the World Wide Web since its inception at the CERN lab in 1989. Hyperlinks allow any two data points to be connected with trivial ease. This allows Web users to become, in the words of hypertext visionary Vannevar Bush, “trail blazers” through pathways of information, creating chains of logic and association, which others may follow and learn from. However, these pathways are only as logical or accurate as their creators. It is just as easy to blaze a trail of illogic and disinformation. We now live in an age of argumentum ad hyperlink. Readers rarely click hyperlinks to investigate even the most basic assertions of a headline. In one study, hyperlinks within news stories linked to explicitly mentioned sources less than 50% of the time, and less than one third of all sources were hyperlinked at all. Readers seem to take the mere presence of a hyperlink as adequate support for key claims.
By the same token, the structure of hyperlinking can place disparate political factions side by side. Click a link in your natural health Facebook group and you may hop to a QAnon thread. Click a link on the QAnon thread about “The Storm” (QAnon’s imagined day of reckoning) and you may just as easily find an imageboard dedicated to violent insurrection. In the mind of a vulnerable user, these hyperlinked connections imply legitimate affinity between the groups. Conspiracy theories thrive under these conditions, as vague associations and innuendo weave paranoid stories based on a digitally falsified sense of cause-and-effect. The movements which intermingled so freely on January 6th demonstrate precisely this ideological eclecticism and epistemic nihilism.
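The ease of these hop-by-hop trails can be illustrated with a toy link graph (the page names below are invented for illustration, not drawn from any real platform): a breadth-first search shows how few clicks separate a benign community from an extremist one.

```python
from collections import deque

# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "natural_health_group": ["qanon_thread"],
    "qanon_thread": ["storm_discussion", "natural_health_group"],
    "storm_discussion": ["insurrection_board"],
    "insurrection_board": [],
}

def shortest_trail(start, goal):
    """Breadth-first search for the shortest chain of hyperlinks
    from one page to another; returns None if no trail exists."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        trail = queue.popleft()
        if trail[-1] == goal:
            return trail
        for nxt in links.get(trail[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(trail + [nxt])
    return None

print(shortest_trail("natural_health_group", "insurrection_board"))
```

In this sketch, three clicks suffice to travel from a wellness community to an insurrectionist imageboard, and each intermediate hop lends the next an unearned air of affinity.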
Power Laws, Network Effects and the Digital Masses
Digital networks do not grow in the same way that offline networks do. Thanks to what is known as the power law dynamic, a handful of nodes in any digital network tend to receive the lion’s share of traffic and attention. Power law distributions are not universal online, but where they emerge, this winner-takes-all arrangement holds no less for extremist channels than for mainstream blogs or online bookstores. Extremists flocked to Twitter, Facebook, YouTube and other platform monopolies for the same reason that brands and would-be influencers did: no other online audience came close in terms of size. Trump, himself a political creature seemingly made for Twitter, was until recently a rare two-time winner of the power law dynamic, a discourse-dominating voice on a market-dominating platform. In that role, he took many disparate far-right tendencies under his wing. The events of January 6th show how this digitally connected mass could be mobilized.
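The rich-get-richer logic behind power law distributions can be shown with a minimal simulation, offered as an illustrative sketch rather than a model of any real platform: each new node links to an existing node in proportion to the links that node already has, and a small elite ends up holding far more than its share.

```python
import random

def preferential_attachment(num_nodes=10_000, seed=42):
    """Toy rich-get-richer growth: each new node links to one existing
    node chosen with probability proportional to that node's current
    number of links (preferential attachment)."""
    random.seed(seed)
    targets = [0, 1]            # multiset: each node appears once per link end
    degree = {0: 1, 1: 1}
    for new in range(2, num_nodes):
        old = random.choice(targets)   # well-linked nodes are picked more often
        degree[old] += 1
        degree[new] = 1
        targets += [old, new]
    return degree

deg = preferential_attachment()
total = sum(deg.values())               # every link counted at both ends
top_share = sum(sorted(deg.values(), reverse=True)[:100]) / total
print(f"top 1% of nodes hold {top_share:.0%} of all link ends")
```

Under uniform attachment, the top 1% of nodes would hold close to 1% of the links; under preferential attachment they hold several times that, which is the winner-takes-all pattern in miniature.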
The power law dynamic is aided by another dynamic known as the network effect. As an online network grows, its value to users increases much faster than its size. This attracts even more users at an ever-increasing rate, until the platform comes to dominate its niche in the digital ecosystem. This is one reason why, for example, the far right has swarmed specifically to Telegram following a rash of deplatforming on Twitter and Facebook. Telegram already offers a rich network of radical and extremist users, making it a valuable communication resource. As more users flock there, Telegram’s value as a communication tool increases, which further incentivizes potential users to create accounts. This may even prove a moment of reckoning for Alt-Tech, in which platforms like Parler and Gab are surpassed by the richer and more robust Telegram network.
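The network effect can be made concrete with a back-of-the-envelope calculation based on Metcalfe’s familiar heuristic (an illustrative assumption, not a claim specific to any platform named here): if a platform’s value tracks the number of possible user-to-user connections, then doubling the user base roughly quadruples the value.

```python
def potential_connections(n_users):
    """Metcalfe-style heuristic: a network's value tracks the number of
    possible pairwise connections among n users, n * (n - 1) / 2."""
    return n_users * (n_users - 1) // 2

for n in (1_000, 2_000, 4_000):
    print(f"{n:>5} users -> {potential_connections(n):>10,} possible connections")
```

Each doubling of the user base roughly quadruples the count of possible connections, which is why growth feeds on itself until one platform dominates its niche.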
Algorithmic Automation and the Power of Suggestion
Automation is the affordance that enables digital media to be altered, disseminated, and even created through the use of templates and algorithms. It makes possible everything from WordPress sites to Instagram filters, your Netflix recommendations and the bot accounts that plague social media. Automation is what enables the Internet to operate at its current scale. It would be simply impossible to manually design—much less program—our present volume of digital content.
But by now the dangers of algorithmic recommendation are well known. Platforms like Facebook are notorious for introducing users to radical and extremist pages. YouTube’s role in providing content “rabbit holes” to extremism is also well documented. Even when automated recommendations do not favor outrage, fear, and loathing, their tendency is to aggregate and segregate people of similar persuasions. This encourages radicalization through processes of outbidding, the “risky shift,” and other dynamics related to what Sunstein calls the “Law of Group Polarization.” Hence, extremist tendencies become integrated with one another and isolated from mainstream discourse.
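A toy collaborative-filtering sketch (user names and channel data invented for illustration) shows this aggregating-and-segregating tendency: recommending whatever one’s nearest neighbors follow steers users toward the interests of whoever already resembles them.

```python
from collections import Counter

# Hypothetical engagement data: which channels each user follows.
follows = {
    "alice": {"wellness", "parenting", "qanon"},
    "bob":   {"qanon", "militia"},
    "carol": {"wellness", "gardening"},
    "dave":  {"militia", "gun_rights", "qanon"},
}

def recommend(user, k=2):
    """Naive collaborative filtering: tally the channels followed by
    users who overlap with `user`, and suggest the most common ones
    the user does not already follow."""
    mine = follows[user]
    tally = Counter()
    for other, theirs in follows.items():
        if other != user and mine & theirs:   # any overlap counts as "similar"
            tally.update(theirs - mine)
    return [channel for channel, _ in tally.most_common(k)]

print(recommend("alice"))
```

Because alice overlaps with bob and dave through a single shared channel, their other interests dominate her suggestions: similarity-based recommendation pulls users toward the clusters that already surround them, regardless of what those clusters contain.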
Automation also facilitates the growth of political extremism via the professional gloss it grants extremist media. Automated web design and photo editing enable the fringes to mirror the mainstream. In the pre-digital age, extremist content often came packaged in amateurish design: the photocopied ‘zines of 1980s skinhead culture or the recognizably self-published appearance of militia manuals. Today, these visual cues are increasingly rare, as our neighborhood boutiques use the same web design templates as white supremacist blogs and militia outfitters. This dynamic only helps to further normalize extremism within the fabric of our society.
It is not clear that these affordances can be altered without changing the fundamental operations of the internet beyond recognition. It might be impossible to build an internet that does not lend itself to ecumenical extremism and the dream logic of conspiracy theory. Media literacy training is critical to ameliorating the effects of these seemingly unavoidable technological affordances. So, too, is attitudinal inoculation against the rhetoric of extremist groups. But these alone cannot solve the problem of ecumenical extremism. By now, many applications of digital technology have demonstrated negative impacts on cognitive and emotional functioning. These detriments bear on the spread of extremist politics and its integration into mainstream spaces. Of course, countless economic and infrastructural applications rely on cybernetworks to function, and our world is much the better for it. However, the human, social dimensions of our world have not always been so improved by their incorporation into digital networks. It may be time for a concerted social movement aimed at reducing elective Internet use. In the same way that public awareness campaigns launched cultural movements against smoking, drunk driving, and sexual harassment, it may be time for a cultural movement to unplug from the radicalizing architecture of the Internet.
Welcome to the “RightNow!” blog where you will find commentary, analysis and reflection by C-REX’s researchers and affiliates on topics related to contemporary far right politics, including party politics, subcultural trends, militancy, violence, and terrorism.
“RightNow!” also provides a platform for republishing op-eds by our core team of experts (with due acknowledgement of course) which have been published by newspapers and on other blogs in order to further highlight the breadth of our work here at C-REX. The articles give the views of the authors, not the position of the Centre for Research on Extremism.
To submit proposals and comments, contact the RightNow! editor Iris Beau Segers