US Extremism on Telegram: Fueling Disinformation, Conspiracy Theories, and Accelerationism
by Samantha Walther and Andrew McCoy

Source: Perspectives on Terrorism, Vol. 15, No. 2 (April 2021), pp. 100-124. Published by the Terrorism Research Initiative. Stable URL: https://www.jstor.org/stable/10.2307/27007298

Abstract
Several alternative social media platforms have emerged in response to perceptions that mainstream platforms
are censoring traditional conservative ideologies. However, many of these alternative social media platforms have
evolved to be outlets for hate speech and violent extremism. This study examines hate-based channels on Telegram
from a US perspective. While Telegram has often been studied in relation to ISIS, less is known about its usage
by US extremist users and movements. The authors used OSINT and observational methods on a sample of 125
Telegram channels containing hate speech and violent extremist content from far-right and far-left perspectives.
The authors hypothesized that there would be a greater and growing presence of far-right activity compared to far-
left activity due to current migration trends away from mainstream social media by the far-right. The authors also
sought to observe the presence of disinformation campaigns, conspiracy theories, and accelerationism on Telegram.
This study had four major findings: (1) consistent with the hypothesis, most channels hosted far-right dialogue, although several far-left channels were present, (2) 64.8% of the channels grew in size over a one-
week period, (3) 47 of the 125 channels were connected to well-known violent extremist movements or hate groups,
and (4) QAnon and the COVID-19 pandemic were the most prominent sources of disinformation and conspiracy
theories on Telegram. The findings of this study highlight that alternative social media platforms are a growing
environment for a range of hateful ideologies and are aiding the spread of disinformation campaigns. This study
concludes with a discussion on future strategies to combat the influence of the Internet on radicalization outcomes.
Keywords: accelerationism, alternative media, conspiracy theories, disinformation, Telegram, US domestic
extremism.

Introduction
Alternative media is theorized as those sources and platforms that challenge traditional media, driven by perceptions that traditional media has become biased or conceals and distorts the information it disseminates.[1] Alternative media producers, at their core, seek to give a voice to groups who feel marginalized
in the political landscape.[2] As a result, alternative media platforms become a “self-perceived corrective”
to traditional media and often become biased themselves.[3]
Social media platforms such as BitChute, Gab, Parler, and Telegram were largely created due to grievances felt
by conservative users on mainstream platforms such as Twitter, YouTube, Facebook, and Instagram. A recent
report by Vogels, Perrin & Anderson (2020) found that US Republicans increasingly feel, and more
so than liberals, that large social media companies are censoring political dialogue.[4] The study found that
69% of Republicans feel that these same companies hold a liberal bias.[5] Facebook, Twitter, and Instagram
have also undertaken significant efforts to remove users and content that promotes white nationalism, anti-
Semitism, neo-Nazism, hate groups, or other alt-right ideologies.[6] 
Consequently, conservative social media users have shifted away from mainstream platforms to sites that
promote greater free speech policies and do not enforce extensive content removal guidelines.[7] Many US
conservative activists, politicians, and celebrities have recently endorsed the migration of their supporters to
alternative platforms.[8] For example, a report by Politico found that over the summer of 2020, at least 23 GOP
members of Congress had moved to Parler in protest of Twitter takedown policies.[9] California Congressman
Devin Nunes has been particularly outspoken about Parler and Rumble, often tweeting to his 1.3 million Twitter
followers, encouraging them to move to those spaces.[10] Other influential figures, such as Laura Loomer, who
endorsed her own Telegram channel after having content removed on other mainstream sites, have encouraged users to do the same.[11]


Research shows that online communities often take on characteristics similar to offline communities—
they become strong and increasingly supportive of each other’s views.[12] Thus, the spaces become oriented
toward that particular community, sharing information on topics that target the interest of the specific
audience.[13] Alternative social media platforms enable the building of strong and nearly impenetrable virtual
communities that can produce echo chambers of hate speech and violent extremist dialogue that would
otherwise be removed by mainstream platforms. Further, de la Brosse, Holt & Haller (2019) argued that research
on alternative conservative media is becoming increasingly necessary as there is a growing interdependent
relationship between right-wing populist politicians and conservative media sources, which are becoming
more professionalized overall.[14] This professionalism and support by politicians can further legitimize
biased media sources which are then spread to social media and influence the masses. Continued spread and
acceptance of disinformation campaigns, conspiracies, hate, and extremism threaten to de-legitimize and build
distrust of mainstream media and democratic institutions.[15] 
Telegram specifically has been identified as one of the most influential recruitment and planning tools used by
ISIS.[16] However, less is known about Telegram activity and usage by US-based users and movements. Thus,
this article seeks to examine the nature of Telegram as an alternative social media platform and outlet for hate
speech and violent extremism for far-right movements in the US. With a rise in platform migration and far-
right activity globally, the authors hypothesized that far-right activity would be more prominent than far-left
on Telegram. Further, it is hypothesized that those channels dedicated to current political trends, such as the
election or COVID-19, would be actively attracting new followers. The authors also utilized observational
methods to examine the types of communication occurring on the Telegram channels, including disinformation
campaigns, conspiracy theories, and accelerationism. It is hypothesized that these themes would be significant
facilitators of activity on Telegram, considering current extremist trends and political polarization in the United
States.

Literature Review

The Role of the Internet in Radicalization Outcomes


There is scholarly debate over the true effect of the Internet, and specifically social media, on radicalization
outcomes. As stated by Conway (2016), no single article, including the one at hand, has the capacity to
determine the effect of the Internet on violent extremism and terrorism as a whole. However, this current
article attempts to address some of the scholarly concerns about research on the Internet and radicalization
presented by previous studies.[17] Particularly, Conway (2016) and Scrivens (2020) noted that in terms of the
Internet as a facilitator of radicalization, research has to extend beyond the scope of violent jihadist trends and
analyze data across multiple ideologies. Further, researchers must engage in more descriptive and comparative
analysis, as opposed to explanatory research. Before scholars can determine why the Internet is facilitating
radicalization in contemporary terrorism, there needs to be a larger focus on descriptive data that could serve
to inform scholars on what type of role the Internet is playing.[18] Scrivens (2020) also contended that much
of terrorism research tends to focus on individual-level risk factors and there is a need to step outside this trend
to collect more data across different types of frameworks and populations, as well as seek different scopes of
studies. While the Internet's influence as a factor in radicalization plays out at the individual level, the main aim of this study is to take a broader look at which parts of Telegram, and which content on Telegram, are attractive to a wider audience.
The literature is particularly thin when it comes to engaging with former right-wing extremists’ firsthand
accounts regarding their radicalization pathways and processes. However, some recent findings have shed light
on how former extremists feel about the Internet as a factor in their radicalization process. Research by Koehler
(2014) found during in-depth interviews with former right-wing extremists in Germany that they personally felt that the role of the Internet was the single most important factor in their radicalization process.[19] These
firsthand insights are important as they allow researchers to gain an introspective analysis that can only be
understood through someone who has radicalized. Similar findings were repeated in a study by Sieckelinck et
al. (2019) who interviewed former extremists from Denmark and the Netherlands. The subjects of this study
also highlighted the importance of the mass exposure to online propaganda in their radicalization process.[20]
Gaudette, Scrivens & Venkatesh (2020) interviewed 10 former right-wing extremists; the majority of participants pointed heavily to the important role of the Internet in their radicalization toward violence. They
largely acknowledged that the Internet allowed constant and mass exposure to violent extremist content at any
time and allowed them to create a community of like-minded individuals.[21] With this type of information in
mind, it is important to look more analytically at specific online communities to further understand how each
platform can uniquely drive engagement and influence radicalization.
Research across different social media platforms outside of Twitter and Facebook is key to understanding
the comparative differences between how different platforms are used.[22] The current article seeks to add
to the foundation of understanding how smaller social media platforms may be playing a role in far-right
radicalization. Specifically, Telegram can be a rich and accessible resource for researchers who seek
to collect big data, as well as conduct more in-depth analyses of extreme online behavior. By engaging on a
smaller platform, the current study also could encourage future research to utilize Telegram, as it is a relatively
user-friendly platform with relevant data access.[23] The far-right, in the United States and abroad, is rising in prominence in contemporary politics and everyday society. Given this rise, and the far-right's long history of Internet usage dating back to the creation of Stormfront in the early 1990s,[24] it is worthwhile to understand how this particular set of ideologies utilizes the Internet as a modern tool. While this article will
not single-handedly explain all the factors as to why specific aspects of Telegram are able to radicalize or engage
individuals in extremist content, it can add to the literature about the basic nature of user online experiences on
Telegram. By understanding the type of content that is most engaging for users within the far-right community,
scholars can begin to understand the foundation of how the Internet is a factor in contemporary radicalization.

Telegram’s Role as Alternative Social Media


Initially developed in 2013, Telegram is a cloud-based messaging application accessible by computer, tablet, and
mobile device. Telegram offers several options for engagement, including private one-on-one conversations,
group chats, and both private and public channels controlled by admins. Telegram does not partake in
extensive content takedown policies compared to apps like Facebook, Instagram, and Twitter, only removing
pornographic material and some violent rhetoric on its public channels.[25] Telegram creators boast about the
multiplicity of security measures taken to protect user data and conversations, including multiple layers of encryption and optional secret chats in which all content automatically self-destructs.[26] While the company has maintained a strict stance of not sharing secret-chat information with any third parties, including government and police officials, it did change its terms of service in 2018.[27] The update stated that if the company is served with a court order establishing that a user is a terror suspect, it may release that user's IP address and phone number.[28]
Mazzoni (2019) determined that there are three main categories of Telegram channels: image channels, news
channels, and discussion channels.[29] Image channels often feature video propaganda, memes, and infographics
to generally inspire viewers.[30] News channels may often be closed for public discussion, but allow admins
to provide real-time updates on current events by providing links to other sources and captions that fit their
narrative.[31] The discussion channels are open groups where all can participate in open conversation.[32] The
types of channels being utilized are important data for understanding how information and hate spread and
influence users. Similarly, there is also a range of user types on Telegram. There are many online users who are
simply seeking information and passively engaging with the content on the channels.[33] Then, there are two
more actively involved groups: 1) those who want to engage more fully with militant and violent groups and the
other users on the channels and 2) propagandists who are both seeking information and engaging with other
users on channels.[34]

The privacy guidelines continue to attract extreme users across a range of hate-based and violent extremist
ideologies that are otherwise being banned from mainstream platforms.[35] Much like the migration of
conservative Internet users in the United States to alternative social media, ISIS operatives initiated a strategic
shift to less public-facing online spaces to conduct information dissemination—and Telegram became the
standard.[36] Research by Shehabat, Mitew & Alzoubi (2017) found that increased Telegram communications
by ISIS, particularly by spreading propaganda, played a role in an uptick in lone-actor-inspired attacks in
Europe between 2015 and 2016.[37] ISIS’ ‘virtual entrepreneurs’ were also responsible for “directing 19 out
of the 38 IS-related attacks in Western Europe from 2014 to 2016” by hosting conversations with recruits on
alternative social media including Telegram.[38]
Far-right movements have also expanded across Telegram channels, notably spiking after the 2019 Christchurch
Attack in New Zealand.[39] Research by the SITE Intelligence Group found that following the Christchurch
shootings, far-right channels experienced a 117% increase in membership by October.[40] Further, of the 374
far-right channels identified by SITE Intelligence Group, 80% of them had been formed after the Christchurch
attack. Unlike ISIS channels, the far-right channels have been largely public facing, granting access to any
users seeking hate-based and violent extremist conversations.[41] Also, unlike the highly organized ISIS
networks that have both an online and offline presence, white supremacist networks online are often much
more loosely connected and highly decentralized networks.[42] Further, Guhl & Davey (2020) analyzed 208
white supremacist Telegram channels and found that the platform was largely being used to glorify terrorists
and past lone actors, call for violence, spread white supremacist material, and degrade minority groups.[43]
In addition to the security protections being particularly attractive, other research has contended that Telegram
has some unique and inherently addictive qualities as well.[44] Telegram chats appeal to individuals who
want to engage with like-minded individuals. Research on ISIS-Telegram channels found that recruiters and
propagandists can function similarly to a “seller of a product,” with the product in this case being a photo, video, link, or other propaganda.[45] With links expiring quickly and channels rapidly filling with discussion, users have to be present and logged on so as not to miss content or the opportunity to engage with others.[46]
Engagement with the content and the group can lead to feelings of fulfillment or of being part of a community, increasing the likelihood that vulnerable viewers will keep coming back.[47]
Holbrook (2019) also found that there are different types of material and differences in severity of extremism
in terms of media material being disseminated online.[48] Holbrook analyzed the types of media material
related to 50 individuals and groups connected to plans or acts of terrorism in the UK between 2004 and 2017.
He concluded that the majority of content was ideological in nature, such as religious materials, as opposed to
facilitative, such as instructions on how to build an Improvised Explosive Device (IED).[49] While his study
analyzed Islamist-inspired extremists and did not specify any specific platforms that the media content was
found on, it highlights the importance of understanding the different types of material online and what is most
influential. Holbrook defined the material as moderate, fringe, and extreme, suggesting a scaled and a nuanced
ecosystem of extreme information being distributed. Further, as suggested by Guhl & Davey (2020), there is
evidence of a shift in the organizational paradigm, whereby online connection to even loose extreme-right
culture and ideology can be equally influential and inspirational for violence as on-the-ground operations and
groups.[50] While most online users may be passively engaging with this content and will never act off-line,
increased exposure to pro-terrorist channels and the regular calls for violence made to these large audiences
increases the risk that one or more of the viewers may eventually act off-line.[51] With rates of engagement
increasing on Telegram, it is timely to understand how Telegram is being used as a tool for furthering hate,
radicalization, and potentially influencing off-line violence. 

Far-Right and Far-Left Online Communities


The theoretical framework on alternative media suggests that it aims to serve a specific community while at
the same time forming an antagonistic relationship with traditional media sources.[52] Online communities are
formed and strengthened through shared opinions, creating a collective identity that is often strengthened in
the face of intergroup conflict.[53] Online hate is typically targeted toward different social groups.[54] There
have been both formal and informal hate groups disseminating hateful speech or ideology online with a variety
of targets.[55] Far-right communities largely derive from shared ideals about national identity that justify white
privilege and racism.[56] Far-left communities derive from a range of ideals, including opposition to capitalism,
imperialism, and colonialism, or support for decentralized forms of government, including anarchy.[57] Far-
left ideologies have also included animal rights and environmental extremism.[58]
When discussing far-right and far-left extremist movements, it should be noted that these do not align with
traditional scopes of political parties but comprise a fraction of individuals that espouse extreme and sometimes
violent beliefs. For the purposes of this article, the authors used the terms “far-right” and “far-left” to identify
those individuals and groups who are espousing ideologies that justify hate speech and violence. The terms
“conservative” and “traditionally liberal” or “liberal” are used when referring to mainstream political parties
that are not extreme or violent in their nature. We often differentiate between hate speech and violent extremist
content in the results section, as the two terms imply different levels of extreme thought and speech. While hate
speech is pejorative and discriminatory in its manner, it does not justify violence in the same way that violent
extremist speech does.[59] Hate speech does not reach the threshold of inciting hostility and violence, while
violent extremist speech does.[60]
Both far-right and far-left extremists are similar to one another in that they often hold a simplistic “black-
and-white” perception of the world, are overconfident in their judgments, and are less tolerant of out-group
opinions.[61] While they are based on similar psychological frameworks, far-right and far-left movements may
diverge in the spaces they choose to be active online. Research by Freelon, Marwick & Kreiss (2020) found that
popular social media platforms such as Twitter and Instagram are most often used for hashtag activism by traditionally liberal activists.[62] Traditional liberal activists are also more likely to take to the streets in protest for very public displays of off-line activity. Conversely, conservative users increasingly feel that
their beliefs and rights to free speech are being interfered with by content removal policies on popular platforms.
[63] Conservative users feel platforms like Twitter and Instagram remove conservative-leaning content at
greater rates than liberal-leaning content.[64] Freelon, Marwick, and Kreiss (2020) also found that conservative
activists and the far-right are more likely to manipulate traditional media and migrate to alternative platforms,
as well as work with partisan media sources to spread their message.[65] Further, Krzysztof Wasilewski (2019)
argued that far-right media manipulates traditional liberal language to form an “ethnically exclusive collective
memory” that presents a version of history that counters the current social-political state.[66] This collective
memory is “exclusive for white Americans leaning toward the alt-right or far-right movement.”[67]
Although many alternative platforms have small followings, some have had a larger influence, reach, and
circulation.[68] While public-facing platforms like Twitter attract and allow individuals to counter hateful and
violent rhetoric, the same pattern may not hold true for alternative platforms with more privatized channels
of dialogue. Alternative platforms are not as public facing, popular, or visible to the majority of Americans. In
regard to Telegram, a recent study by Urman & Katz (2020) found that Telegram activity began to proliferate
with mass bans of far-right actors on other mainstream social media platforms.[69] This trend isolates the far-
right dialogue from the rest of the political discourse, forming an asymmetric polarization online.[70] These
trends set the framework for the first hypothesis of this article. As conservative social media users shift to alternative social media more readily than liberal users, the authors of this study hypothesized that:
H1: Due to current grievances, the Telegram channels in this study would largely stem from far-right extremism.
While there may be evidence of far-left channels, they may be much harder to access than far-right spaces or may
be more prevalent on traditional social media platforms, which is outside the scope of this study. 

Disinformation and Conspiracies 


Previous studies have also found that the far-right uses disinformation campaigns and conspiracy theories
more than the far-left.[71] Social media platforms are used to react to real-world events in a multitude of ways.
Social media can help mobilize support and assistance in the aftermath of tragedy and share truthful information, as well as serve as a platform for the “socially disruptive” to spread misinformation and antagonistic commentary.[72] Research on media manipulation by Marwick & Lewis (2017) contended that the far-right has become adept
at exploiting vulnerable young men who have an inclination for rebellion and dislike of political correctness
in order to spread violent ideologies, create distrust of legitimate media sources, and further help to radicalize
individuals.[73] Large and influential Telegram channels can quickly spread false information to hundreds if
not thousands of vulnerable followers, thereby increasing the influence that disinformation and conspiracies
have on online users.
Conspiracy theories are often developed in response to major events in order to allow people to better accept
the unpredictability of these events.[74] Furthermore, conspiracies are more likely to circulate in reaction to government mistrust and weakening institutions, and when groups feel they are excluded from the democratic process.[75] Presently, the United States and the global community are facing two of the most dangerous and
adaptive conspiracy sources and disinformation campaigns in history: the QAnon conspiracy and another
referring to the origins and nature of COVID-19. In the wake of these two societal factors, the authors’ second
hypothesis in this study regarded the content within the Telegram channels, specifically:
H2: The public-facing Telegram channels the authors could access in this study would not necessarily be used for
planning future attacks or specific violent encounters, but would be filled with general dialogue of disinformation
and conspiracies, especially considering the current environment in the US. With the 2020 Presidential Election,
COVID-19, and grievances, real or perceived, over police brutality, gun rights, and race at the forefront of the national conversation, tensions are high, and individuals will use disinformation and conspiracies not only to cope with but
also to further their indoctrinated beliefs. 

A New “Accelerationism”
The term accelerationism is largely understood as a fringe philosophy relating to Marxist views on capitalism.
[76] It is meant to suggest that the intensification of capitalism will eventually lead to its collapse. However, the term's use, having been only recently adopted by members of the far-right, has not been studied in depth. In the far-
right movement, accelerationism has come to mean that the far-right must increase civil disorder, or accelerate
violence and aggression, in order to create further polarization and bring about a societal collapse that fits their
agenda.[77] Several high-profile white supremacist perpetrators have used the term in their manifestos and
other writings, including Brenton Tarrant, John Earnest, and neo-Nazi James Mason.[78] Accelerationists see
violence as a chain reaction that can ignite, or “fan the fire,” creating chaos, collapse, and revolutionary change
that promotes white power.[79] 
Telegram recently came under scrutiny as Black Lives Matter protests erupted throughout the United States
in the summer of 2020. The Department of Homeland Security (DHS) warned intelligence officials of a white
supremacist Telegram channel that was encouraging its followers to infiltrate and start a “boogaloo” at the
protests.[80] Analysis of far-right social media activity reveals that many white supremacists also believe a
civil war is inevitable and that individuals should train, arm themselves, and prepare to incite violence.[81]
Accelerationism has also expanded out of the COVID-19 pandemic, with some white supremacists accusing
Jews and migrants of creating and spreading the virus.[82] 
H3: The most extreme users would argue for the effectiveness of accelerationist tactics. The authors strove to discern which movements most embrace accelerationism.

Methodology

Channel Access
Accessing the virtual spaces which host hate speech and violent extremism on Telegram can initially be
challenging. Semenzin & Bainotti (2019) developed two methodological approaches to access channels when
studying Telegram: a cross-platform (external) approach and an in-platform (internal) approach.[83] The
cross-platform approach uses links from other sites and blogs to enter into Telegram spaces, while the in-
platform approach relies on creating a list of topic-related words based on previous literature.[84] Once the
list of access words is created, researchers simply use the search bar on Telegram to locate channels relating to
the topic and then can use a snowball sampling logic to locate more similar and connected channels.[85] The
authors employed the in-platform approach in this study, creating a list of 59 initial keywords. The keywords related to far-right and far-left ideologies and movements, as well as to general topics popular in US discourse that could evoke extreme views. These general topics included
terms such as “coronavirus,” relevant conspiracy theories such as “QAnon,” racial and religious terminology,
and alternative media sites previously studied within the field. In total, 34 of the words related to the far-right,
9 words related to the far-left, and 16 words related to general topics were applied. The full list of access words
is available in Appendix A. 
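For readers who wish to script a similar collection procedure, the sketch below illustrates one possible implementation of the in-platform keyword search followed by snowball expansion. This is a minimal illustration only, not the authors' actual tooling: it assumes the third-party Telethon library, the API credentials, session name, and keywords are hypothetical placeholders, and the snowball step is simplified to following forwarded messages.

```python
# Minimal sketch (assumption: Telethon; not the authors' actual tooling) of
# keyword search plus snowball expansion across public Telegram channels.
import asyncio
from telethon import TelegramClient
from telethon.tl.functions.contacts import SearchRequest

API_ID, API_HASH = 12345, "your-api-hash"            # hypothetical credentials
KEYWORDS = ["boogaloo", "QAnon", "accelerationism"]  # stand-ins for the 59 access words

async def collect_channels():
    found = {}
    async with TelegramClient("osint-session", API_ID, API_HASH) as client:
        # In-platform step: run each access word through Telegram's global
        # search and keep the public channels and groups it returns.
        for word in KEYWORDS:
            result = await client(SearchRequest(q=word, limit=20))
            for chat in result.chats:
                found[chat.id] = chat
        # Snowball step: extremist channels heavily repost one another, so
        # forwarded messages in known channels point to connected channels.
        for chat in list(found.values()):
            async for msg in client.iter_messages(chat, limit=200):
                fwd = msg.forward
                if fwd is not None and fwd.chat_id and fwd.chat_id not in found:
                    linked = await fwd.get_chat()
                    if linked is not None:
                        found[fwd.chat_id] = linked
    return found

if __name__ == "__main__":
    channels = asyncio.run(collect_channels())
    print(f"Collected {len(channels)} candidate channels for manual review")
```

In practice, each candidate channel surfaced this way would still require the manual ideological review described in the remainder of this section.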
Initial search results using the keywords returned 79 channels across a range of extreme ideologies. The authors
then extended the database of channels through a snowball technique. Since white supremacist Telegram
channels have been found to be highly networked with much of the content being shared from one channel to
another, they easily link researchers to more and more channels.[86] The snowball technique gained access to
another 46 channels for a total of 125 Telegram channels analyzed in the final data set. The scope of this project
was to look at channels relevant to current extremism trends in the United States. The authors determined
channels were related to domestic extremism and hate-based ideologies on the basis of a number of factors. 
Firstly, channels were included if their names were associated with US-based movements, even if they have
grown to have a global presence. For example, several channels contained “Atomwaffen,” “Proud Boys,” and
“QAnon” in their channel names. While movements such as Atomwaffen and QAnon have gained global
traction, they originated in the United States and the Telegram audience was largely US-based. 
Secondly, many channels that did not directly name extremist movements in their titles did so in their channel
bios or profile images. For example, a channel titled “The Great Awakening” wrote in their bio “Safe place for
all things Q and Trump related,” with several American flags. Their profile picture also depicted the “Where
We Go One, We Go All” phrase with the letter “Q,” to represent affiliation with the QAnon conspiracy theory
movement. 
Thirdly, some channels were connected to movements that did not originate in the United States, such as
the racist Skinhead movement. However, channels relating to international movements were included if they
hosted conversations or shared symbols relating to the movement’s factions in the United States. For example,
one racist Skinhead channel included several memes depicting the US confederate flag, among other American-
oriented images. 
Lastly, some channels did not state their affiliation to US movements and ideologies so clearly, but their affiliation could be determined through simple analysis of the channels' recent content. For example, a group called “Great Kingdom of Zombe” was found through snowball techniques but had no clear affiliation to any movements based on its profile description. However, the most recent conversations suggested that the channel was aimed at discussing the Proud Boys movement and was connected with several other Proud Boys-affiliated channels. Several channels were found in this manner: despite ambiguous titles and bios, their recent dialogue made clear that they were dedicated to sharing hate-based beliefs.

Descriptive Data Collection 


Using open-source intelligence readily available on the Telegram channels, the authors collected a range of
descriptive data from each channel. The channel name, channel function (news, image, or discussion), and
follower size were recorded at the time the channel was first located. One of the hypotheses of this study was
that Telegram channels would be growing in audience and engagement due to migration away from popular
platforms. The authors also recorded the follower size seven days after locating each channel. Although this is a short period, it is useful for gaining a preliminary understanding of which channels may be attractive to new
followers and understand which channels are producing consistent content, as opposed to channels that may
be more dormant.
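As an illustration of this two-timepoint measurement, a channel's public subscriber total can be read from its full metadata. The sketch below is a minimal example, again assuming Telethon; the channel username is a hypothetical placeholder.

```python
# Minimal sketch of the seven-day follower measurement, assuming Telethon.
from telethon import TelegramClient
from telethon.tl.functions.channels import GetFullChannelRequest

async def subscriber_count(client: TelegramClient, channel: str) -> int:
    # The full-channel metadata exposes the public subscriber total.
    full = await client(GetFullChannelRequest(channel))
    return full.full_chat.participants_count

# Usage (inside an async context), with a hypothetical channel username:
#   day0 = await subscriber_count(client, "example_channel")
#   ...record again seven days later...
#   day7 = await subscriber_count(client, "example_channel")
#   weekly_change = day7 - day0
```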
Secondly, the authors examined each channel for underlying ideologies based on channel names, bio
information, as well as recent content. For the purposes of this article, the authors considered ideologies to
be the simplistic belief system that each Telegram channel was predominantly embracing. The authors broke
down the far-right and far-left to better capture the presence of unique belief systems. In total, 13 categories
were created: Alt-Right; Anti-Fascist/Far-Left/Anarchist; Anti-Government; Black Nationalist; Cybercrime; Eco-
Fascist; General Far-Right/Hate; Glorifying Violent Extremism; Neo-Confederate; Neo-Nazi; Racist Skinhead;
White Nationalist; and White Supremacist.
Lastly, the authors examined each Telegram channel to determine if they were in connection with specific
hate-based or violent extremist movements that are currently operating in the United States. The authors
differentiated between ideology and movement affiliation, as the former gives a basic understanding of content
on channels, while the latter elucidates in more specificity which known movement may be operating on
alternative platforms. While each channel in the data set had ideologies that were driving the conversations,
not every channel was connected to any specific movement. Many channels were simply host to general
ideologically based hate without propagating the agenda of an established movement. While the QAnon
conspiracy following and the “Boogaloo bois” have not been officially designated violent extremist groups, their
ideologies have been driving the creation of violent extremist propaganda and raising security concerns in
recent months. Therefore, QAnon and Boogaloo were both included as respective categories. Other movements
found were Atomwaffen, Proud Boys, Antifa, Patriot Front, Skinhead Nation, National Alliance, League of the
South, National Vanguard, New Black Panther Party, the NSDAP, Pine Tree Party, Sparrows Crew, the Base, the
Right Stuff, and Vinlanders Club. 

Content Observations
In addition to the quantitative descriptive statistics collected, the authors also observed recurring topics
and themes in the Telegram channels. While the authors hypothesized that there would be disinformation
and conspiracy theories amidst the Telegram channels, such observations were intended to elucidate which
particular conspiracies are currently most influential. In regard to observational research methods, the authors
used a full-observer approach, meaning they did not interact with any users or discussions in the Telegram
channels in order to not manipulate the subjects and topics of conversation.[87] This approach negates the
possibility of reactivity, which could skew our perceptions of user beliefs and language used (i.e., how extreme
they are).[88] While there are certainly hundreds of topics of conversations across the channels, the results
were focused on the key themes that were most often recurring.

Results

Channel Type: Discussion, News, and Image


Each channel in the data set was categorized by its function either as a discussion channel, image channel, or
news channel. While most channels in the data set were defined as one of the three categories, 20 (16%) of the
channels did display characteristics of multiple functions. These channels were characterized as combination
channels, which would share both large quantities of news articles and original content, including images and
videos. Image channels made up a slightly larger proportion of the data set, with 24% (30 channels) being solely
image and video messages. The image channels were hubs for original content largely in the forms of memes.
Discussion channels made up 15.2% (19 channels) of the data set. Discussion channels were those that were
filled with original content by several subscribers, as opposed to admins, creating original dialogues about
politics, current events, and their ideologies.
Figure 1: Pie chart representing the breakdown of channels by functionality type.

Finally, the largest share (44%) of the channels were news based. However, the news being shared ranged in its validity. Some channels would repost articles from actual news sources, while others, especially in relation to the COVID-19 pandemic and QAnon, would post links to unreliable and unchecked sources. This elucidates a strong connection between alternative news media and alternative social media. The social media channels would echo disinformation from alternative news sources, revealing active participation in disinformation campaigns and conspiracy theories. Figure 1 shows the breakdown of channels by functionality.

Channel Size
Channel size varied widely throughout the data set. One channel could not be measured since it was a bot channel and did not have subscribers. The minimum number of subscribers was 2 and the maximum was
22,368. However, these two counts can be considered extremes, as the average number of subscribers was
1664.79 and the median number of subscribers was 473.5.
Only 4 (3.2%) of the channels had large followings of more than 10,000 subscribers. Each of these channels was news-based or news/image-based with no chat option available for subscribers. These channels shared news relating to their specific ideology or beliefs, oftentimes spreading conspiracy theories or misinformation from unreliable or unchecked sources. One of these channels was a QAnon-driven channel, which provided followers with updates on every piece of QAnon content posted to the official QAlerts website.
The QAnon channel further supports evidence that far-right movements continue to engage in large-scale
conspiracy theories and disinformation campaigns. Figure 2 provides a breakdown of all the channels in the
data set based on subscriber count.

Channel Size Change


Due to recent migration trends of the far-right leaving popular social media platforms, the authors hypothesized
that some channels with particularly relevant ideologies, such as QAnon, may be actively engaging new followers
even over short periods of time. In total, 81 of the 125 channels gained followers during a one-week period
(see Figure 3). Twenty-one channels saw decreases in subscriber counts while 22 remained the same size.
Changes in subscriber count, both positively and negatively, were not necessarily drastic, with the exception
of one QAnon channel. The average size change was 23.11 followers, with some channels only gaining one or two subscribers over the week. Similarly, channels that decreased in size were not decreasing by significant numbers either. One QAnon channel was an outlier, gaining 566 followers in just one week—a much larger change than the average gain. This pushed that channel's subscriber count to more than 15,000. For comparison, the next highest gain was 186 followers over the course of the week. A channel which
was categorized as a cybercrime channel for its abundance of financial scams lost 102 followers—the largest
size decrease in the data set. Another pattern worth mentioning was that all seven of the Proud Boys–affiliated
channels increased membership over the course of one week. While this time period is small and may not be
representative of large fluctuation patterns, it suggests that channels related to the most trending movements,
such as QAnon and Proud Boys, may garner interest quickly. This may especially hold true as media coverage
has increased drastically when reporting on Proud Boys and QAnon. With more coverage and more spotlight
on these lines of thinking, more online users may seek information relating to the movements. Future studies
should consider a more profound analysis into channel growth over longer periods of time, as well as seek to
analyze which ideologies remain prominent and which ones have the longest lifespans on Telegram channels.
Figure 2: Chart showing the percentage breakdown of the Telegram channel sizes, based on number of subscribers (subs).

[Figure 2 data, n = 125: 0-50 subs, 13.6%; 51-100 subs, 11.2%; 101-500 subs, 26.4%; 501-1,000 subs, 9.6%; 1,001-5,000 subs, 31.2%; 5,001-10,000 subs, 4%; 10,000+ subs, 3.2%; bot, 0.8%]

Figure 3: Chart showing how channel subscriber counts fluctuated over a seven-day period.

Ideology
The majority of the 125 channels in the dataset were host to far-right content as opposed to far-left. While
it is important to acknowledge that the far-right may be more prevalent on Telegram than the far-left, the
more important takeaway is understanding which ideologies are currently popular in the US discourse and on
alternative social media.
Alt-Right, Neo-Nazi, and White Nationalist ideologies accounted for more than half of the channels with a
combined 69 (55.2%) of the channels in the data set. White supremacist, neo-confederate, and racist Skinhead
channels were also present; however, at a much smaller percentage, amounting to 22 channels (17.6%) of the
data set. Another eight channels (6.4%) were considered to host general far-right/hate content.
It was common to see a combination of ideologies and symbology on the Telegram channels. Several channels
had many neo-Nazi symbols present, but also called for a separate nation for the White race, leaning toward
a White separatist ideology. Further, it was common to see channels reposting from other far-right channels
even if they did not necessarily share exactly the same ideologies. Popular propaganda pieces and original
content would often be posted across many of the channels. The same trend occurred with many news articles.
The same stories, images, and memes would continuously appear across many of the channels. Therefore, many
channels fed off one another, despite the far-right being made up of a vast range of movements and ideologies.
Future studies should further examine this networked movement of content across Telegram channels.
Far-left movements, specifically those with anti-fascist ideologies, are a major topic of discussion with the
current protests taking shape across the United States. These channels accounted for seven (5.6%) of the 125
channels in the data set. While there were certainly not as many easily accessible far-left channels, this finding
does not simply imply that the far-left does not represent a violent extremist threat. This finding is in line with
previous literature suggesting that the far-left may be less likely to utilize alternative social media platforms
for recruiting but is more likely to participate in hashtag activism on popular platforms or off-line activities.
The scarcity of far-left channels is nonetheless important in the sense that it supports claims that alternative platforms still largely remain hubs for far-right activity and that far-left activity takes place elsewhere. Figure 4 shows the specific breakdown of different ideologies that defined each of the Telegram channels.
The analysis also revealed evidence of a few ideologies that are often under-studied. One important finding was
the discovery of a channel that fell under the category of Black Nationalism. While the channel was not very
active, it does suggest that there is a diverse range of ideologies on Telegram. With heightened race tensions
in the United States, it is also important to consider how Black Nationalist movements may be escalating and
reacting to the rise of far-right movements.
Three of the channels in the data set were not given traditional ideologies but were rather labeled as “cybercrime”
channels. These channels were originally found through the terminology search because of their titles, which
included violent extremist terms. While they did not show significant evidence of violent extremist discussions,
they did all have a range of financial scams under the guise of a channel name that was connected to other
violent extremist channels. This finding also points to the dangers of alternative platforms that are not as public
facing, which can provide a haven for a range of criminal online activity.
Two channels were related to a rising eco-fascist movement known as the Pine Tree Party. The rise of the
Pine Tree Party and its influence on Telegram suggests a recent revival in eco-fascist thought. The first public
surfacing of the Pine Tree Party was through an Instagram post by Mike Ma on November 3, 2017.[89] Ma,
a potential leader of the Pine Tree Party, has stated in previous interviews that his ideal form of government
would be “no government, but obviously that would require a way smaller population”.[90] Eco-fascism rests
on the idea that in order to restore a higher natural order, the human population must be reduced dramatically,
and this would ideally mean removing immigrants to form a higher population of White people who can
then live in solitude and produce more eco-sustainable patterns of life.[91] Eco-fascism has historically been a
large driver of violence, inspiring several recent major violent extremists, including Patrick Crusius, Brenton
Tarrant, and Anders Breivik.[92]
Figure 4: Bar graph showing the breakdown of ideology type across the data set of Telegram channels.

Movement Affiliations
In total, 47 of the 125 channels in the data set were found to be associated with known hate-based movements
(see Figure 5).
The Proud Boys movement, which continued to make national news as protests erupted in Portland, Oregon, was one of the most prominent movements active on Telegram, with eight channels connected to the group. Atomwaffen, a widely followed neo-Nazi accelerationist movement that originated in Texas, was also represented with eight channels.[93] While Atomwaffen had a relatively large number of channels, not all of them were highly
active or had large followings. In fact, the Atomwaffen channel with the largest following of 898 subscribers has
not been active since November 29, 2019.
However, 10 days prior to ceasing activity, another Atomwaffen channel was created under the same name.
The bio of this channel reads, “New archive because the old one is going to be deleted”. It is not uncommon for
admins to create new pages and re-share content from the old channel to the new channel if they suspect the
original has been infiltrated by “feds” or was flagged by Telegram. This behavior elucidates the “whack-a-mole”
problem with violent extremist activity on social media. Just as quickly as one account can cease activity or be
shut down, another with the exact same content and propaganda can be created. This pattern was continuously
noted in this Telegram study, as many channels that were found to be inactive would link followers to their new
channel names before going dark. Simply removing a channel does not necessarily cut off access to an audience.
QAnon-related channels closely followed in frequency with seven channels dedicated to new Q “drops” and
conversations regarding the conspiracy. The QAnon channels with discussion capabilities were often engaged in disinformation campaigns, with news articles being posted from fake or unreliable sources. The Boogaloo
movement, Skinhead Nation, and Patriot Front all had operating channels that were actively creating original
content to attract new members. Other movements represented include National Alliance, Antifa, Pine Tree
Party, League of the South, Aryan Nations, New Black Panther Party, Stormfront, National Vanguard, National
Socialist Movement, National Justice Party, the Vinlanders Social Club, and the Base.
Figure 5: Figure showing the number of Telegram channels that were found to be affiliated with known hate groups, conspiracy followings, and violent extremist movements in the United States. Forty-seven of the 125 channels in the data set were found to be associated with known movements.


Disinformation Campaigns and Conspiracy Theories


In line with findings by Guhl & Davey (2020), explicit anti-minority ideas, hostility toward minorities, and
racist slurs were abundantly present in our data set. The study also supported Holbrook’s (2019) analysis, which found that the majority of content online may not be facilitative in nature but rather ideological, part of a larger discussion that justifies future violence without making definite plans or providing the means to commit acts of violence.[94] Specifically in relation to the far-right network on Telegram, the authors found
that there were very explicit racist ideas present, as well as influential disinformation campaigns that were
particularly engaging. QAnon was the most popular conspiracy theory to surface on the channels; however,
some of the channels simply shared the latest “Q drop” without a chat function. QAnon channels often aimed
to discredit traditional liberal politicians, claiming they were members of a deep state cabal of pedophiles.
Further, Q-adherents expressed the belief that former President Trump was the only savior combatting the deep state and was communicating with “Q” through tweets.
There was also a dangerous intersection between QAnon and COVID-19 conspiracies. Many users on Telegram spread anti-vaccination propaganda in regard to COVID-19, claiming the virus was not real and urging followers not to participate in social distancing or adhere to mask guidelines, on the grounds that such guidelines allegedly infringe on their human rights.
The largest source of disinformation surrounded the COVID-19 pandemic. The far-right has been the most active purveyor of misinformation and propaganda, especially online. Combined with the fact that the pandemic itself fuels more online activity, as thousands are laid off and more likely to find themselves sitting at home facing a computer screen, it becomes a great deal easier for online recruiters to attract more people and expose them to disinformation campaigns.[95] Miranda Christou (2020) argued that “the radical right does
not simply reject science; it invents its own scientific rhetoric to provide an ‘alternative interpretation’”, and that
is exactly what is happening on Telegram channels in response to the coronavirus.[96] Telegram is overwhelmed with conversations claiming that the virus is a deep state control tactic used by elites like Bill Gates and Jeff Bezos to wipe out the population and train individuals into obedient slaves, that masks are ineffective, or that nobody has actually died from the virus. Oftentimes links to fringe healthcare sites are shared on Telegram
channels. Others are using memes and videos to delegitimize the actual health guidelines necessary to control
the spread of the virus. These belief systems continue to threaten public health and build distrust of science,
political institutions, and health-care institutions.
With QAnon and COVID-19 dominating the majority of the conspiracy conversations, the “boogaloo” beliefs
did not come up as much as the authors of the study had expected. However, there were certainly still many users who believed a civil war was coming. One anti-government channel shared several audiobook-style
downloads explaining exactly how the breakup of America was going to happen. Many other channels shared
this sentiment and directly predicted that the 2020 presidential election was going to increase violence, and
potentially spiral into a civil conflict.

Accelerationism
Proud Boys, Pine Tree Party, and many of the White supremacist–related channels expressed that the acceleration of chaos, conflict, and violence is a goal of many of their off-line actions. Posts included propaganda with the
caption “an appeal to acceleration” and users explaining how they wanted to see former President Trump
win the 2020 presidential election “because of the apocalyptic meltdowns from shitlibs, which would result
in them becoming more openly radical, and their radicalism gives us more justification to resist and counter
them”. Several channels continue to discuss which presidential candidate is the “acceleration option.” Many
channels are spreading messages and sharing information such as “the more people realize that there is no
political solution, the more of them will start acting accordingly”. While these messages are not concrete plans
for attacks, they elucidate a potential desire for future conflict as more and more users are beginning to believe
that no politics can solve the tensions in the United States.


The most notable calls to action came in the Proud Boys channels after the first presidential debate, namely when
former President Donald Trump made reference to the group. Channels quickly responded by making t-shirts
and symbols quoting the former President and warning followers to “get ready” and “prepare” for a looming war;
some users suggested “Let’s go back to Portland”. Other comments urged individuals to
conduct actions as they see fit, either alone or with a small group of friends, in order to avoid increased
surveillance by federal agents. This call for lone-actor violence is one that ought to be monitored. As public
awareness of some violent extremist movements heightens, movements may attempt to move further
underground. Some Proud Boys chapters released statements saying that they were not racist or violent; however, the
channels’ actual content suggests otherwise. While some followers may have originally joined the group when
it was considered a chauvinist drinking club, it is evident that the movement has evolved and gained followers
with racist and violent ideologies.
A common theme of channels was to glorify past lone actors, calling them “Heroes” and “Saints” of the greater
White supremacist movement. A list circulated on many channels dubbed numerous violent domestic terrorists and
mass murderers, such as James Earl Ray, the assassin of Martin Luther King Jr., Theodore Kaczynski, Eric Rudolph,
Anders Breivik, Dylann Roof, and Tobias Rathjen, “saints” (see Appendix B). The authors of this study believe
this is the same list identified by Guhl & Davey (2020) in their report on White supremacist Telegram channels.
While this repeated finding is not surprising considering the interconnectedness of Telegram channels, it does
suggest that this content continues to circulate unabated, despite potentially breaching Telegram’s terms
of service.
Many of the contemporary violent extremists on the list utilized alternative platforms such as Telegram, 8chan,
and 4chan to espouse their hatred for the groups or individuals that they ended up targeting in their attacks.[97]
These perpetrators are seen as inspiration for new extremists, and there is a pattern of online activity inspiring one
individual to take action, who in turn inspires the next perpetrator. Examples include the Unabomber, whose
manifesto, published in The Washington Post, was later used as inspiration by Brenton Tarrant, who
in turn inspired Patrick Crusius.[98] The glorification of violence is playing a large role in the radicalization of
new violent extremists and must continue to be analyzed to better understand what type of counter-messages
can be utilized to stop this cycle.

Discussion
The findings of this study largely supported the authors’ hypotheses that Telegram is an actively growing
environment for US-based hate speech and extremism, with a range of ideologies present. While more far-right
content was present in this study, radicalization is occurring on both sides of the political spectrum.
Perhaps the most critical finding of this study is that the United States is facing a pressing disinformation
crisis, which is being furthered exponentially by the increased usage of alternative social media platforms such
as Telegram. There is an underlying connection between alt-right news media sources and far-right social media
users. The biased news being produced is shared on social media platforms such as Telegram, introducing
it into the daily lives of users. By questioning the authority of traditional media, the alternative platforms
not only build their own sense of legitimacy but also create distrust of mainstream media outlets that share truthful
information.[99] This has led to a widespread decline in media trust and has allowed alternative conservative
media, disinformation, and conspiracy theories to gain influence in national politics.
QAnon best exemplifies how disinformation is delegitimizing democratic institutions and gaining influence in
politics. The conspiracy has effectively seeped into US politics, with more than a dozen US
House and Senate candidates openly endorsing it while running for office.[100] There have
also been several criminal and/or violent incidents related to QAnon beliefs, including several kidnappings and
threats against prominent politicians.[101]
Further, the spread of disinformation is having detrimental effects on public health. The far-right has continued
to undermine and discredit mainstream medical science, education, and public policies in relation to the
COVID-19 pandemic. The rhetoric online, contending that the virus was created by politicians to control
citizens, has convinced many Americans not to follow social distancing and mask guidelines and has spread
anti-vaccination beliefs. Some far-right members have gone so far as to weaponize COVID-19. Reports from
the United Kingdom noted that far-right movements operating online had been encouraging followers to
“intentionally infect Jews and Muslims”.[102] In this sense, the virus is exploited to further a xenophobic, racist,
and violent ideological agenda. In March 2020, a Missouri man with racist and anti-government beliefs plotted
to bomb a hospital that was nearing full capacity with COVID-19 patients.[103] Another man
in Massachusetts with White supremacist views planned to bomb a Jewish assisted living community
that had reached capacity due to COVID-19.[104] Hate crimes against Asian-Americans have also increased
since the onset of the pandemic.[105]
The growing distrust of experts and political leaders, and the momentum that conspiracy theories are gaining in
the United States, are among the most pressing threats to American democracy and national security. Alternative social
media, when used for the purposes of extremist dialogues and movements, accelerate this distrust through
echo chambers of hate, disinformation, conspiracies, and accelerationism. The solution is not necessarily to
censor speech and political dialogue further, but to focus efforts on building trust in traditional media sources
that spread information based on science and empirical facts rather than on mere opinions and beliefs.
Further, the country as a whole needs to examine the factors contributing to its increased polarization, which
limits the political middle ground and the public space for tolerant bipartisan dialogue.

Limitations
While there was a large percentage of far-right channels in this study, it should be taken into account that there
are other keywords that could be used to find channels related to far-right activity. One of the limitations of this
study was the lack of knowledge surrounding similar keywords and catchphrases used by far-left movements.
Many far-right movements have been well studied and patterns such as phrases, hate symbols and numbers,
and even clothes and music choices have been well documented. While it is known that many far-left ideologies
consist of anti-fascist and anarchism-related ideas, their evolving phrases and symbols are less well known.
Future studies should continue to identify and analyze the speech patterns, symbology, and code words of far-
left movements. Many of the channels were found through the snowball technique, revealing the connectedness
among the far-right channels. The far-left channels found in this study did not reveal the same connectivity or
lead the authors to more channels with similar ideologies.
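For readers seeking to replicate the snowball technique programmatically, the following minimal Python sketch captures its breadth-first logic. The helper fetch_forward_sources is hypothetical; in practice it would be implemented with a Telegram client library by iterating a channel’s messages and reading each message’s forward header. The cap of 125 channels simply mirrors the size of this study’s sample.

    from collections import deque

    def fetch_forward_sources(channel):
        """Hypothetical helper: return usernames of channels whose posts
        were forwarded into `channel`; a real version would be built on a
        Telegram client library's message-iteration facilities."""
        return set()  # placeholder for illustration

    def snowball(seed_channels, max_channels=125):
        """Breadth-first discovery: every forward in a known channel points
        to a candidate source channel, which is queued for review."""
        discovered = set(seed_channels)
        queue = deque(seed_channels)
        while queue and len(discovered) < max_channels:
            current = queue.popleft()
            for source in fetch_forward_sources(current):
                if source not in discovered:
                    discovered.add(source)
                    queue.append(source)
        return discovered

As in this study, each candidate channel surfaced in this way would still require manual ideological vetting before being added to the sample.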
Further, the authors only tracked growth over a seven-day period; while engagement increased during that
window, longer longitudinal measures should be considered. By measuring channel size over greater periods of time,
researchers can gain further insight into the rates at which certain ideologies are growing. Longer time intervals
may also reveal insights into the lifespan of hate-based Telegram channels, as channels may be removed or
become inactive over time.
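As one illustration of such a longitudinal measure, the sketch below uses the Telethon library to append a dated subscriber count for each channel to a CSV file; run daily, it yields a time series from which growth rates and channel lifespans can be computed. The API credentials, session name, and file name are placeholders, and this is one possible approach rather than the procedure used in this study.

    import csv
    import time
    from datetime import date
    from telethon.sync import TelegramClient
    from telethon.tl.functions.channels import GetFullChannelRequest

    API_ID, API_HASH = 12345, "your-api-hash"  # placeholder credentials

    def record_subscriber_counts(channels, outfile="channel_growth.csv"):
        """Append today's subscriber count for each channel to a CSV so
        growth can later be computed over any interval."""
        with TelegramClient("osint-session", API_ID, API_HASH) as client:
            with open(outfile, "a", newline="") as f:
                writer = csv.writer(f)
                for name in channels:
                    entity = client.get_entity(name)
                    full = client(GetFullChannelRequest(channel=entity))
                    writer.writerow([date.today().isoformat(), name,
                                     full.full_chat.participants_count])
                    time.sleep(1)  # stay well within Telegram's rate limits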
While it is important for P/CVE stakeholders to understand how Telegram influences the spread of far-right
ideologies, far-left online behavior must be examined further, especially as tensions continue to rise between
anti-fascist groups and far-right groups in the United States. If far-left conversations are not widely taking place
in spaces such as Telegram, they are taking place elsewhere. Understanding differences in patterns of behavior
across different ideologies is necessary to understand how different movements will recruit, mobilize, and
create potential violence, and in turn allow P/CVE stakeholders to thwart those efforts.

Conclusion
This study situated Telegram within continuing debates over the constitutionality of free speech versus content
removal by social media platforms, the latter of which leads to deplatforming and gives rise to alternative
platforms. The formation of alternative media is a reaction to the increased polarization within the United States,
both off-line and on mainstream platforms that censor specific content.[106] The authors’ findings support
previous literature suggesting that grievances of prominent far-right actors over being banned from mainstream
platforms have an impact on the number of users, activity, and effectiveness of propaganda on alternative sites like
Telegram.[107] This has led to the further deplatforming of masses of far-right users from mainstream sites and
to an overall network evolution toward alternative spaces that may be less detectable or less conducive to counter-
narratives. Overall, the study reported in this article supports the idea that deplatforming has a limited effect
on decreasing the presence of extremist content online and its effects on radicalization; it simply shifts the
problem to a different space. Therefore, in order to curb these migration trends and their detrimental effects, greater
inclusivity and tolerance are needed within the mainstream dialogue online. While counter-narrative options for
Telegram and other alternative platforms should be explored, the magnitude of echo chambers and groupthink
patterns on Telegram may already be too great for counter-narratives to make a significant impact.
This is, in part, because by the time individuals turn to alternative platform channels that are clearly
extreme, their ideologies and views are generally fixed. These users have shown that they do not want to hear an
alternative narrative and have chosen to move to spaces that only propagate their viewpoints through echo chambers.
Without motivation to change their minds, the messages of any counter-narrative may simply be removed or
disregarded.
In terms of solutions to stop the spread of disinformation and terrorist content, the authors of the study
recommend that a hybrid cascading content-takedown strategy, as described by van der Vegt et al. (2019), be
implemented within Telegram.[108] Automated detection infrastructure that can target material related to
violent White supremacist content can help to mitigate widespread sharing.[109] However, there are major
concerns over the accuracy of fully automated approaches to content removal. While automated content-
removal strategies may be well suited to detecting content at scale, the importance of human decision-making
as an integral part of combating terrorist and extremist content online cannot be overstated. Humans are better
able to understand nuances in speech and the context of content, better informing takedown decisions that protect
against terrorism while at the same time protecting the right to free speech.[110] However, smaller platforms may
not have the resources to put together large teams of human content reviewers. This is where a
hybrid between human and automated strategies is going to be most beneficial.[111] Automation can detect
content as long as it is kept up to date on terminology trends, while human beings can make the final decision
based on language, context, and other behavioral cues surrounding a particular case.[112]
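A minimal sketch of this cascading logic is given below. The term tiers are invented for illustration and are not drawn from van der Vegt et al. or from any platform’s actual systems; a production pipeline would rest on maintained term databases and trained classifiers rather than hard-coded lists.

    import re

    # Illustrative tiers only; real deployments use curated databases.
    HIGH_CONFIDENCE = [r"\bday of the rope\b", r"\bkill\s+(all|every)\b"]
    BORDERLINE = [r"\bsaint\b", r"\bboogaloo\b", r"\baccelerat\w*\b"]

    def triage(post):
        """Cascading triage: automation removes only clear-cut violations,
        while borderline matches are escalated to human reviewers who can
        weigh language and context before any takedown decision."""
        text = post.lower()
        if any(re.search(p, text) for p in HIGH_CONFIDENCE):
            return "auto-remove"
        if any(re.search(p, text) for p in BORDERLINE):
            return "human-review"
        return "allow"

Under this division of labor, automation handles the unambiguous cases at scale, while a phrase such as “an appeal to acceleration” is routed to a human reviewer rather than removed outright.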
Further, when utilizing databases to detect and remove content, a broader set of terms should be included in
the database to capture more borderline content.[113] Databases should also be consistently updated as new
movements arise. For example, the QAnon conspiracy theory movement has brought out many unique phrases
that do not seem violent on the surface but that point to adherence to the conspiracy and to future off-line
actions. Van der Vegt et al. (2019) found that some Internet Referral Units (IRUs), which are select teams established by
law enforcement agencies and dedicated to content moderation, are mostly concerned with removing propaganda
that bears the ‘brand’ of a terrorist organization. IRUs do not decide whether a social media platform removes
content; they flag questionable content and relay their findings to the platform for the final say. While
it is important to remove terrorist propaganda that is connected to the specific brand of a terrorist organization,
the literature points to an evolution in which networks are decentralized and disconnected and may not relate
to specific terrorist organizations. In this sense, both automated and human-based decisions on content removal
must take into consideration that these networks are less organized than traditional transnational groups.
When it comes to far-right content, the networks may share a great deal of content with each other, but
they span a range of ideologies and organizational connections. Thus, content-removal strategies have
to be able to target content even when it is not easily identifiable in connection with a specific organization.
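The maintenance side of such a database can be kept equally simple in outline. The sketch below, again illustrative rather than drawn from any deployed system, records the associated movement and date alongside each new term so that borderline entries can be audited and retired as movements evolve.

    import json
    from datetime import date

    def add_terms(db_path, movement, terms, tier="borderline"):
        """Append newly observed phrases to a term database, recording
        provenance so reviewers can audit and retire stale entries."""
        with open(db_path) as f:
            database = json.load(f)
        for term in terms:
            database.setdefault(tier, []).append(
                {"term": term, "movement": movement,
                 "added": date.today().isoformat()})
        with open(db_path, "w") as f:
            json.dump(database, f, indent=2)

    # e.g., QAnon phrases that look innocuous but signal adherence:
    # add_terms("terms.json", "QAnon", ["WWG1WGA", "Where We Go One, We Go All"])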

There is also a clear need for greater global consensus and information sharing between large and small tech
companies to combat terrorist content. The push to strengthen Telegram’s enforcement of its terms of service
should come from large companies in the Global Internet Forum to Counter Terrorism (GIFCT), which is
tasked with assisting smaller companies. This forum was founded by major digital platforms—Facebook,
Microsoft, Twitter, and YouTube—in 2017 to foster collaboration among companies in preventing terrorists and
violent extremists from exploiting the digital ecosystem, as well as to share relevant information with smaller
platforms.[114] As disinformation online becomes more of a threat, it is necessary to consider how important
these global collaborations are for maintaining a standard across all types of digital platforms. When terrorist
organizations and other extreme movements identify a platform as a convenient space to spread content,
smaller platforms can pose an even larger risk than major platforms.
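GIFCT’s best-known sharing mechanism is a common database of hashes of known terrorist content against which member platforms can check uploads. The sketch below shows that lookup pattern in its simplest form; it uses a cryptographic digest for brevity, whereas deployed systems typically favor perceptual hashes that survive re-encoding, and the function names are illustrative.

    import hashlib

    def file_digest(path):
        """Hash a media file in chunks; deployed systems generally use
        perceptual hashes, but the lookup pattern is the same."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def check_upload(path, shared_hashes):
        """Return True when an upload matches a hash contributed by any
        participating platform, so it can be blocked or reviewed."""
        return file_digest(path) in shared_hashes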
As emphasized by Conway et al. (2019), consistent content-takedown strategies and constant enforcement
of Twitter’s terms of service have produced significant effects in terms of disrupting the ISIS community
on Twitter, to the point that the English-language ISIS community there is nearly nonexistent. Accounts
are often taken down within minutes or days, highlighting the importance of sustained content removal over time.
[115] However, the same study noted that as Twitter curbed its ISIS community, Telegram became
the new home. This again highlights the need for stricter guidelines on Telegram specifically, as it is
becoming the preferred digital environment across extremist networks and ideologies. Similar findings were
made by Amarasingam et al. (2021) in a study that specifically sought to disrupt the ISIS ecology on Telegram. The study
found that while the disruption attempts in 2019 did not completely remove the entire ecology, they had a
profound effect on its reach and lifespan.[116] In the 30 days following the coordinated takedown Action
Day, there was a 65.5% drop in organic ISIS posts and a 94.7% drop in forwarded posts, and new channels created in 2019
had a significantly shorter lifespan.[117] While Telegram has notably stepped up its efforts with regard to ISIS-
specific content removal and account suspension, it is evident that it ought to take the same steps against extreme
far-right movements.
Solution attempts also need to target the cyclical nature of content removal, migration to alternative platforms,
and early forms of radicalization. While violent videos, images, and targeted violent extremist speech should be
removed from online platforms, moderators also need to ensure that spaces are made available for intergroup
contact and perspective-taking.[118] When individuals are able to remove political blinders and consider
the perspectives of others more openly, many, perhaps most, are able to accept more moderate views.
[119] By targeting political polarization in the United States, P/CVE and political experts can decrease
the likelihood that individuals fall victim to disinformation, conspiracy theories, and alternative platforms in the first
place. Countermeasures are certainly necessary, but it is even more critical to take preventative measures when
combating the migration to alternative social media and its consequences.

Acknowledgments
The authors would like to thank Ardian Shajkovci, ACTRI Director, and Allison McDowell-Smith, ACTRI
Deputy Director, for their assistance with content and methodology.

About the Authors:


Samantha Walther is a research fellow at the American Counterterrorism Targeting and Resilience Institute
(ACTRI), where she researches the role of women in far-right extremist groups and movements, with a comparative
focus on the role of women in both far-right and violent jihadist groups. She also explores the characteristics
and the trends of the far-right presence online through the lens of group polarization theory. Samantha served as
the program coordinator for the Institute for Women’s Leadership at Nichols College, with previous experience
researching policy and criminal justice issues at domestic and international levels. She received a Bachelor of Arts
in government from Hamilton College in 2018 and a Master of Science in counterterrorism from Nichols College
in 2020. Her previous research topics include a field study in Sweden on criminal justice reform and immigration,
civil liberties violations of Muslim-Americans since 9/11, as well as work on the Nuremberg Trials and human
rights—a study that was awarded the 2017 Raphael Lemkin Essay Prize on the Practice of Genocide.
Andrew McCoy is a research fellow at the American Counterterrorism Targeting and Resilience Institute (ACTRI),
where he researches far-right and militant jihadi activity online. He also assists with data collection for the
upcoming ACTRI database. Andrew is working toward the completion of a degree in justice, law, and criminology,
with a focus on counterterrorism, at American University, Washington, DC, and is currently pursuing his master’s
studies. He has interned at the Worcester County District Attorney’s Office, working on investigative analysis.

Notes
[1] Kristoffer Holt (2018), “Alternative Media and the Notion of Anti-Systemness: Towards an Analytical Framework”, Media and
Communication, 6:4, pp. 49–57. DOI: 10.17645/mac.v6i4.1467; Chris Atton (2006) “Far-right media on the internet: culture,
discourse, and power,” New Media & Society 8:4, pp. 573–587; DOI: 10.1177/1461444806065653; Natalie Fenton & Veronica
Barassi (2011) “Alternative Media and Social Networking Sites: The Politics of Individuation and Political Participation”, The
Communication Review 14:3, pp. 179–196. DOI: 10.1080/10714421.2011.597245. 
[2] Olga Guedes Bailey, Bart Cammaerts & Nico Carpentier (2008), Understanding Alternative Media, Berkshire: McGraw-Hill. 
[3] Kristoffer Holt, op. cit., p. 52.
[4] Emily A. Vogels, Andrew Perrin, and Monica Anderson (2020) “Most Americans Think Social Media Sites Censor Political
Viewpoints,” Pew Research Center, URL: https://www.pewresearch.org/internet/2020/08/19/most-americans-think-social-media-
sites-censor-political-viewpoints/.
[5] Emily A. Vogels, Andrew Perrin, and Monica Anderson (2020), op. cit.
[6] Richard Rogers (2020), “Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media,”
European Journal of Communication 35:3, pp. 213–229; URL: https://doi.org/10.1177/0267323120922066.
[7] Richard Rogers, op. cit., p. 217. 
[8] Richard Rogers, op. cit., p. 214. 
[9] Cristiano Lima (2020) “GOP officials flock to Parler social network. So do their trolls and impostors”, POLITICO; URL: https://
www.politico.com/news/2020/07/02/republicans-parler-trolls-347737.  
[10] @DevinNunes (Devin Nunes). “Eerily quiet here in the Twitter Sewer. Thought for a second I went to Myspace by
accident. Many people I follow have been vaporized by left wing tech tyrants. Hopefully they will reappear on Parler and
Rumble.…”, Twitter. 11 Nov. 2020, 8:37 p.m., URL: https://twitter.com/DevinNunes/status/1326700669579882497.
[11] Richard Rogers, op. cit., p. 214. 
[12] Olga Guedes Bailey, Bart Cammaerts & Nico Carpentier, op. cit., p. 9. 
[13] Frances J. Berrigan (1979) “Community Communications: The Role of Community Media in Development,” Reports and
Papers on Mass Communication, 90 as cited in Olga Guedes Bailey, Bart Cammaerts & Nico Carpentier (2008), op. cit., p. 13. 
[14] Renaud de la Brosse, Kristoffer Holt & Andre Haller (2019), “The “Other” Alternatives: Political Right-Wing Alternative
Media,” Journal of Alternative & Community Media 4:1, pp. 1–6; DOI: 10.1386/joacm_00039_2. 
[15] Kristoffer Holt, op. cit., p. 51. 
[16] Bennett Clifford and Helen Powell (2019), “Encrypted Extremism: Inside the English-Speaking Islamic State Ecosystem on
Telegram”, The George Washington University Program on Extremism; URL: https://extremism.gwu.edu/sites/g/files/zaxdzs2191/f/
EncryptedExtremism.pdf; Ahmad Shehabat, Teodor Mitew & Yahia Alzoubi (2017), “Encrypted Jihad: Investigating the Role of
Telegram App in Lone Wolf Attacks in the West”, Journal of Strategic Security, 10:3, pp. 27–53.
[17] Maura Conway (2017), “Determining the Role of the Internet in Violent Extremism and Terrorism: Six Suggestions for
Progressing Research,” Studies in Conflict & Terrorism 40:1, pp. 77–98; DOI. 10.1080/1057610X.2016.1157408; Ryan Scrivens, Paul
Gill, Maura Conway (2020) “The Role of the Internet in Facilitating Violent Extremism and Terrorism: Suggestions for Progressing
Research”, The Palgrave Handbook of International Cybercrime and Cyberdeviance. Cham: Palgrave Macmillan; URL: https://doi.
org/10.1007/978-3-319-78440-3_61.
[18] Maura Conway, op. cit., p. 78.
[19] Daniel Koehler (2014), “The Radical Online: Individual Radicalization Processes and the Role of the Internet”, Journal for
Deradicalization 1, pp. 116–134.
[20] Stijn Sieckelinck, Elga Sikkens, Marion van San, Sita Kotnis, and Micha de Winter (2019), “Transitional Journeys Into and Out
of Extremism. A Biographical Approach,” Studies in Conflict & Terrorism 42, pp. 662–682; DOI: 10.1080/1057610X.2017.1407075.
[21] Tiana Gaudette, Ryan Scrivens, Vivek Venkatesh (2020), “The Role of the Internet in Facilitating Violent Extremism: Insights
from Former Right-Wing Extremists,” Terrorism and Political Violence, Vol. 32, No. 5; DOI: 10.1080/09546553.2020.1784147.
[22] Maura Conway, op. cit., p. 84.
[23] Ryan Scrivens, Paul Gill, Maura Conway, op. cit., p. 1417.
[24] Maura Conway, op. cit., p. 83.
[25] “Telegram FAQ” (n.d.) Telegram, URL: https://telegram.org/faq#:~:text=All%20Telegram%20messages%20are%20
always,Telegram%20Cloud%20(more%20here).  
[26] Ibid.
[27] Bennett Clifford and Helen Powell (2019), “Encrypted Extremism: Inside the English-Speaking Islamic State Ecosystem on
Telegram”, The George Washington University Program on Extremism, URL: https://extremism.gwu.edu/sites/g/files/zaxdzs2191/f/
EncryptedExtremism.pdf.
[28] Bennett Clifford and Helen Powell, op. cit., p. 10. 
[29] Valerio Mazzoni (2019), ‘Far Right extremism on Telegram: A brief overview,” European Eye on Radicalization; URL: https://
eeradicalization.com/far-right-extremism-on-telegram-a-brief-overview/.
[30] Ibid.
[31] Ibid.
[32] Ibid.
[33] Mia Bloom, Hicham Tiflati & John Horgan (2017), “Navigating ISIS’s Preferred Platform: Telegram,” Terrorism and Political
Violence Vol. 31, No. 6, pp. 1242–1254; DOI: 10.1080/09546553.2017.1339695.
[34] Mia Bloom, Hicham Tiflati & John Horgan, op. cit., p. 4.
[35] Silvia Semenzin & Lucia Bainotti (2020), “The use of Telegram for the non-consensual dissemination of intimate images:
gendered affordances and the construction of masculinities”; URL: https://doi.org/10.31235/osf.io/v4f63. 
[36] Bennett Clifford and Helen Powell, op. cit., p. 8; Ahmad Shehabat, Teodor Mitew & Yahia Alzoubi, op. cit., p. 35; Mia Bloom,
Hicham Tiflati & John Horgan, op. cit., p. 1.
[37] Ahmad Shehabat, Teodor Mitew & Yahia Alzoubi, op. cit., p. 51. 
[38] Alexander Meleagrou-Hitchens and Seamus Hughes, “The Threat to the United States from the Islamic State’s Virtual
Entrepreneurs”, CTC Sentinel 10:3 (March 2017); available at URL: https://ctc.usma.edu/posts/the-threat-to-the-unitedstates-from-
the-islamic-states-virtual-entrepreneurs.
[39] “Alt-Right Encrypted: How Far-Right Extremists’ Migration to Telegram Has Reinforced their Terrorist Threat”,
(n.d.) SITE Intelligence Group, URL: https://ent.siteintelgroup.com/Press-Release/alt-right-encrypted-how-the-far-right-s-
migration-to-telegram-is-growing-its-terrorist-threat.html; as cited by Rita Katz (2019), “Telegram has finally cracked down
on Islamist terrorism. Will it do the same for the far-right?” The Washington Post, URL: https://www.washingtonpost.com/
opinions/2019/12/05/telegram-has-finally-cracked-down-islamist-terrorism-will-it-do-same-far-right/.
[40] Idem.
[41] Idem.
[42] Jakob Guhl & Jacob Davey (2020), “A Safe Space to Hate: White Supremacist Mobilisation on Telegram”, ISD Global; URL:
https://www.isdglobal.org/wp-content/uploads/2020/06/A-Safe-Space-to-Hate2.pdf; Aleksandra Urman & Stefan Katz (2020),
“What they do in the shadows: examining the far-right networks on Telegram,” Information, Communication & Society; DOI:
10.1080/1369118X.2020.1803946.
[43] Jakob Guhl & Jacob Davey, op. cit., pp. 1–2.
[44] Mia Bloom, Hicham Tiflati & John Horgan, op. cit., p. 9.
[45] Ibid.
[46] Ibid.
[47] Ibid.
[48] Donald Holbrook (2019), “The Terrorism Information Environment: Analyzing Terrorists’ Selection of Ideological and
Facilitative Media,” Terrorism and Political Violence; DOI: 10.1080/09546553.2019.1583216.
[49] Donald Holbrook, op. cit., p. 5.


[50] Jakob Guhl & Jacob Davey, op. cit., p. 1.


[51] Jakob Guhl & Jacob Davey, op. cit., p. 11.
[52] Olga Guedes Bailey, Bart Cammaerts & Nico Carpentier, op. cit., p. 6. 
[53] Ana-Maria Bliuc, John Betts, Matteo Vergani, Muhammad Iqbal & Kevin Dunn (2019), “Collective identity changes in
far-right online communities: The role of offline intergroup conflict”, New Media & Society 21:8, pp. 1770–178; DOI:
10.1177/1461444819831779. 
[54] Markus Kaakinen, Atte Oksanen & Pekka Rasanen, “Did the risk of exposure to online hate increase after the November
2015 Paris attacks? A group relations approach”, Computers in Human Behavior 78, pp. 90–97; URL: https://doi.org/10.1016/j.
chb.2017.09.022.
[55] Markus Kaakinen, Atte Oksanen & Pekka Rasanen, op. cit., p. 90. 
[56] Ana-Maria Bliuc, John Betts, Matteo Vergani, Muhammad Iqbal & Kevin Dunn, op. cit., p. 1773. 
[57] Seth G. Jones, Catrina Doxsee & Nicholas Harrington (2020), “The Escalating Terrorism Problem in the United States”, CSIS
Report; URL: https://csis-website-prod.s3.amazonaws.com/s3fs-public/publication/200612_Jones_DomesticTerrorism_v6.pdf.
[58] Idem, p. 6.
[59] Katharine Gelber (2019), “Terrorist-Extremist Speech and Hate Speech: Understanding the Similarities and Differences,”
Ethical Theory and Moral Practice 22, pp. 607–622; URL: https://doi.org/10.1007/s10677-019-10013-x.
[60] “United Nations Strategy and Plan of Action on Hate Speech” (2019), United Nations. URL: https://www.un.org/en/
genocideprevention/documents/UN%20Strategy%20and%20Plan%20of%20Action%20on%20Hate%20Speech%2018%20June%20
SYNOPSIS.pdf.  
[61] Jan-Willem van Prooijen & Andre P.M. Krouwel (2019), “Psychological Features of Extreme Political Ideologies”, Current
Directions in Psychological Science 28:2, pp. 159–163; DOI: 10.1177/0963721418817755.
[62] Deen Freelon, Alice Marwick, and Daniel Kreiss (2020), “False equivalencies: Online activism from left to right,” Science,
369:6508, pp. 1197–1203. 
[63] Emily A. Vogels, Andrew Perrin, and Monica Anderson (2020), “Most Americans Think Social Media Sites Censor Political
Viewpoints,” Pew Research Center; URL: https://www.pewresearch.org/internet/2020/08/19/most-americans-think-social-media-
sites-censor-political-viewpoints/.
[64] Emily A. Vogels, Andrew Perrin, and Monica Anderson (2020), op. cit.; Deen Freelon, Alice Marwick, and Daniel Kreiss
(2020), op. cit.
[65] Deen Freelon, Alice Marwick, and Daniel Kreiss (2020), op. cit. 
[66] Krzysztof Wasilewski (2019) “US alt-right media and the creation of the counter-collective memory,” Journal of Alternative and
Community Media, 4, pp. 77–91. 
[67] Krzysztof Wasilewski, op. cit., p. 78. 
[68] Kristoffer Holt, op. cit., p. 55. 
[69] Aleksandra Urman & Stefan Katz, op. cit., p. 1.
[70] Yochai Benkler, Robert Faris, and Hal Roberts (2018) “Network Propaganda: Manipulation, Disinformation, and
Radicalization in American Politics”; Oxford Scholarship Online, URL: https://oxford.universitypressscholarship.com/view/10.1093/
oso/9780190923624.001.0001/oso-9780190923624-chapter-11.
[71] Deen Freelon, Alice Marwick, and Daniel Kreiss (2020), op. cit. 
[72] Pete Burnap, Matthew L. Williams, Luke Sloan, Omer Rana, William Housley, Adam Edwards, Vincent Knight, Rob Proctor
& Alex Voss (2014), “Tweeting the Terror: modelling the social media reaction to the Woolwich terrorist attack,” Social Network
Analysis and Mining 4:206; DOI 10.1007/s13278-014-0206-4. 
[73] Alice Marwick and Rebecca Lewis (n.d.), ‘Media Manipulation and Disinformation Online,” Data & Society Research Institute,
URL: http://www.chinhnghia.com/DataAndSociety_MediaManipulationAndDisinformationOnline.pdf.
[74] van Prooijen, J.-W. & Douglas, K. M. (2017, June 29), “Conspiracy theories as part of history: The role of societal crisis
situations”. Memory Studies 10 (3), pp. 323–333; URL: https://doi.org/10.1177/1750698017701615.
[75] Karen M. Douglas, Joseph E. Uscinski, Robbie M. Sutton, Aleksandra Cichocka, Turkey Nefes, Chee Siang Ang & Farzin
Deravi (2019), “Understanding Conspiracy Theories”, Advances in Political Psychology 40:1, pp. 3–35; DOI: 10.1111/pops.12568;
Ayan Mohammed, Amanda Garry, Rukaya Mohamed & Antoine Andary (2020) “Extremist Manifestation: Translation of
Conspiracy Theories”, American Counterterrorism Targeting and Resilience Institute; URL: https://americanctri.org/wp-content/
uploads/2020/11/Extremist-Manifestation-and-Translation-of-Conspiracy-Theories-Perspective.pdf.
[76] Daniel L. Byman (2020), “Riots, white supremacy, and accelerationism,” Brookings, URL: https://www.brookings.edu/blog/
order-from-chaos/2020/06/02/riots-white-supremacy-and-accelerationism/; “White Supremacists Embrace ‘Accelerationism,’”
(2019), Anti-Defamation League, URL: https://www.adl.org/blog/white-supremacists-embrace-accelerationism.
[77] Daniel L. Byman, op. cit.; “White Supremacists Embrace ‘Accelerationism,’” op. cit. 
[78] Daniel L. Byman, op. cit.; Andrew McCoy, Amanda Garry, and Rukaya Mohamed (2020), “The Threat and Impact of Edicts,
Fatwas, and Manifestos on Violent Extremism”, American Counterterrorism Targeting and Resilience Institute; URL: https://
americanctri.org/wp-content/uploads/2020/12/ACTRI-Report-Edicts-Fatwas-and-Manifestos-McCoy-and-Garry.pdf.
[79] Daniel L. Byman, op. cit.
[80] One Hundred Sixteenth Congress of the United States House of Representatives Committee on Oversight and Reform (2020);
URL: https://oversight.house.gov/sites/democrats.oversight.house.gov/files/2020-06-15.CBM%20JR%20SFL%20to%20Wolf-
DHS%20re%20White%20Supremacists%20at%20Protests.pdf.
[81] Samantha Walther (2020), “The Nature of Far-Right Extremism Online Post-2015 in the United States”, American
Counterterrorism Targeting & Resilience Institute; URL: https://americanctri.org/wp-content/uploads/2020/08/ACTRI-Report-The-
Nature-of-Far-Right-Extremism-Online-by-Samantha-Walther.pdf.
[82] Daveed Gartenstein-Ross & Samuel Hodgson (2020), “Analysis: The Growing Threat Posed by Accelerationism and
Accelerationist Groups Worldwide”, Foreign Policy Research Institute, URL: https://www.fpri.org/article/2020/04/the-growing-
threat-posed-by-accelerationism-and-accelerationist-groups-worldwide/.
[83] Silvia Semenzin & Lucia Bainotti (2019), “Dark Telegram,” Digital Methods Initiative, URL: https://wiki.digitalmethods.net/
Dmi/SummerSchool2019DarkTelegram.
[84] Silvia Semenzin & Lucia Bainotti (2019), op. cit. 
[85] Idem. 
[86] Jakob Guhl & Jacob Davey, op. cit., p. 3.
[87] David Plowright (2011), Using Mixed Methods: Frameworks for an Integrated Methodology. London: SAGE Publications Ltd.
[88] David Plowright, op. cit., pp. 70–71. 
[89] Ardian Shajkovci, Allison McDowell-Smith, and Mohamed Ahmed (2020), “Eco-Fascist ‘Pine Tree Party’ Growing as a Violent
Extremism Threat”, Homeland Security Today; URL: https://www.hstoday.us/subject-matter-areas/counterterrorism/eco-fascist-
pine-tree-party-growing-as-a-violent-extremism-threat/.
[90] Idem. 
[91] Michael Zimmerman (1995), “The Threat of Ecofascism,” Social Theory and Practice 21:2, pp. 207–238; URL: http://www.jstor.
org/stable/23557116.
[92] Ardian Shajkovci, Allison McDowell-Smith, and Mohamed Ahmed (2020), op. cit. 
[93] “Atomwaffen” (n.d.) Southern Poverty Law Center; URL: https://www.splcenter.org/fighting-hate/extremist-files/group/
atomwaffen-division.
[94] Donald Holbrook, op. cit., p. 4.
[95] Emily A. Vogels, Andrew Perrin, Lee Rainie & Monica Anderson (2020), “53% of Americans Say the Internet Has Been
Essential During the COVID-19 Outbreak”, Pew Research Center; URL: https://www.pewresearch.org/internet/2020/04/30/53-of-
americans-say-the-internet-has-been-essential-during-the-covid-19-outbreak/.
[96] Miranda Christou (2020), “Is the radical right spreading the coronavirus?” Open Democracy; URL: https://www.
opendemocracy.net/en/countering-radical-right/radical-right-spreading-coronavirus/.
[97] Jake Kanter (2019), “The ‘cesspool of hate’ message board 8chan was taken offline after being linked to 3 mass shootings
this year,” Business Insider; URL: https://www.businessinsider.com/el-paso-shooting-8chan-down-after-cloudflare-terminated-
support-2019-8; Gianluca Mezzofiore & Donie O’Sullivan (2019), “El Paso mass shooting is at least the third atrocity linked to
8chan this year”, CNN, URL: https://www.cnn.com/2019/08/04/business/el-paso-shooting-8chan-biz
[98] Ardian Shajkovci, Allison McDowell-Smith, and Mohamed Ahmed (2020), op. cit. 
[99] Tine Ustad Figenschou & Karoline Andrea Ihlebaek (2019) “Challenging Journalistic Authority: Media criticisms in far-right
alternative media”, Journalism Studies 20:9, pp. 1221–1237; URL: https://doi.org/10.1080/1461670X.2018.1500868.


[100] “5 facts about the QAnon conspiracy theories” (2020), Pew Research Center; URL: https://www.pewresearch.org/fact-
tank/2020/11/16/5-facts-about-the-qanon-conspiracy-theories/.
[101] Lois Beckett (2020), “QAnon: a timeline of violence linked to the conspiracy theory”, The Guardian; URL: https://www.
theguardian.com/us-news/2020/oct/15/qanon-violence-crimes-timeline.
[102] Aris Folly (2020) ‘FBI warns white supremacists encouraging members to spread coronavirus to law enforcement, Jews:
report”, The Hill; URL: https://thehill.com/blogs/blog-briefing-room/news/488919-fbi-white-supremacists-encouraging-members-
to-spread.
[103] “Coronavirus: Man planning to bomb Missouri hospital killed, FBI says” (2020), BBC News; URL: https://www.bbc.com/
news/world-us-canada-52045958.  
[104] ‘East Longmeadow Man Charged with Attempted Arson at Longmeadow Assisted Living Residential Facility’ (2020),
Department of Justice U.S. Attorney’s Office District of Massachusetts; URL: https://www.justice.gov/usao-ma/pr/east-longmeadow-
man-charged-attempted-arson-longmeadow-assisted-living-residential.  
[105] Angela R. Gover, Shannon B. Harper, and Lynn Langton (2020), ‘Anti-Asian Hate Crime During the COVID-19 Pandemic:
Exploring the Reproduction of Inequality’, American Journal of Criminal Justice, pp. 1–21; URL: https://www.ncbi.nlm.nih.gov/
pmc/articles/PMC7364747/.
[106] Renaud de la Brosse, Kristoffer Holt & Andre Haller, op. cit., p. 3.
[107] Aleksandra Urman & Stefan Katz, op. cit., p. 1.
[108] Isabelle van der Vegt, Paul Gill, Stuart Macdonald & Bennett Kleinberg (2019), “Shedding Light on Terrorist and Extremist
Content Removal,” RUSI; URL: https://rusi.org/publication/other-publications/shedding-light-terrorist-and-extremist-content-
removal.
[109] Jakob Guhl & Jacob Davey, op. cit., p. 11.
[110] Isabelle van der Vegt, Paul Gill, Stuart Macdonald & Bennett Kleinberg, op. cit., 9.
[111] Idem, p. 7.
[112] Idem, p. 6.
[113] Idem, p. 10.
[114] Maura Conway, Moign Khawaja, Suraj Lakhani, Jeremy Reffin, Andrew Robertson & David Weir (2019), “Disrupting Daesh:
Measuring Takedown of Online Terrorist Material and Its Impacts”, Studies in Conflict & Terrorism 42:1–2, pp. 141–160; DOI:
10.1080/1057610X.2018.1513984.
[115] Maura Conway, Moign Khawaja, Suraj Lakhani, Jeremy Reffin, Andrew Robertson & David Weir, op. cit.
[116] Amarnath Amarasingam, Shiraz Maher & Charlie Winter (2021), “How Telegram Disruption Impacts Jihadist Platform
Migration”, Centre for Research and Evidence on Security Threats; URL: https://crestresearch.ac.uk/resources/how-telegram-
disruption-impacts-jihadist-platform-migration/.
[117] Amarnath Amarasingam, Shiraz Maher & Charlie Winter, op. cit., p. 18.
[118] Lee De-Wit, Sander van der Linden & Cameron Brick (2019), “What Are the Solutions to Political Polarization?” Greater
Good Magazine Berkeley University; URL: https://greatergood.berkeley.edu/article/item/what_are_the_solutions_to_political_
polarization; Amanda Garry, Ardian Shajkovci, Allison McDowell-Smith, David RC Leary & Mohamed Ahmed (2020),
“Perspective on Counternarratives: Successes, Missing Links, and Opportunities”, American Counterterrorism Targeting and
Resilience Institute; URL: https://americanctri.org/wp-content/uploads/2020/10/ACTRI-Perpsective-on-Counternarratives-by-
Amanda-Garry.pdf.
[119] Thomas Strandberg, Jay A. Olson, Lars Hall, Andy Woods & Petter Johansson (2020), “Depolarizing American voters:
Democrats and Republicans are equally susceptible to false attitude feedback,” PLOS ONE 15:2; URL: https://doi.org/10.1371/
journal.pone.0226799.


Appendix A: Access Words Used to Locate Telegram Channels

Far-Right: All Lives Matter; Amerika; Aryan; Atomwaffen; Based; Black-Pilled; Boogaloo; Brenton Tarrant;
Brotherhood; Great Awakening; Hitler; League of the South; Libtard; Make America White; National Alliance;
National Socialism**; National Vanguard; Nazi; New Patriot; Patriot Front; Pine Tree***; Proud Boys;
Red-Pilled; Rhodesia; Roof; Skinhead/Skinhead Nation; Stormfront; Ted Kaczynski/The Unabomber; The Base;
Vinlanders; White Lives Matter; White Supremacy

Far-Left: Alt-Left; Antifa; Anti-fascist; Black Lives Matter; Black Panther*; Left; Leftist; Lib; Marxism

General Topics: 4chan; America; Black; Catholic; Christianity; Civil War; Corona; Coronavirus; COVID-19;
Diversity; Fascist; Hate; Immigration; Q; QAnon; Shitpost; WWG1WGA****
*Used in reference to the extreme Black Nationalist New Black Panther Party and not related to the original and non-extreme Black
Panther Party

**Used in reference to the German Nationalsozialistische Deutsche Arbeiterpartei (NSDAP), otherwise known as the Nazi Party

***Used in reference to the novel eco-fascist Pine Tree Party

****Refers to the QAnon-related phrase “Where We Go One, We Go All”
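As an illustration of how these access words could be applied programmatically, the sketch below uses Telethon’s global search to surface candidate channels for manual review. The credentials and the word subset are placeholders, and any results would still require the ideological vetting described in the methodology.

    from telethon.sync import TelegramClient
    from telethon.tl.functions.contacts import SearchRequest

    API_ID, API_HASH = 12345, "your-api-hash"  # placeholder credentials
    ACCESS_WORDS = ["boogaloo", "Great Awakening", "WWG1WGA"]  # subset of the table

    with TelegramClient("osint-session", API_ID, API_HASH) as client:
        for word in ACCESS_WORDS:
            found = client(SearchRequest(q=word, limit=50))
            for chat in found.chats:
                # keep broadcast channels; skip groups and users
                if getattr(chat, "broadcast", False):
                    print(word, chat.title, chat.username)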


Appendix B: List of “Saints” Used to Accelerate and Inspire Violence

Source: Screenshot from Telegram Channel ‘The Bowlcast’

