Chapter | The Social Structure of Extremist Websites
In this study, we select the official websites of four known extremist groups and map the networks of hyperlinked websites forming a virtual community around them. The networks are constructed using a custom-built webcrawler (TENE: Terrorism and Extremism Network Extractor) that searches the HTML of a website for all hyperlinks directing to other websites (Bouchard et al., 2014). Following all of these hyperlinks out of the initial website of interest produces the network of websites forming a community that is more or less cohesive, more or less extensive, and more or less devoted to the same cause (Bouchard and Westlake, 2016; Westlake and Bouchard, 2016). The extent to which the official website of a group contains many hyperlinks to external websites may be an indicator of a more active community and, by extension, of a more active social movement.
2020 | Bouchard, M., Davies, G., Frank, R., Wu, E. and Joffres, K.
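The hyperlink-following approach described in the abstract above can be illustrated with a short sketch: fetch a seed page, collect the outbound hyperlinks from its HTML, and follow them to build a directed network of linked sites. This is not the TENE implementation itself; the seed URL, crawl depth, and library choices (requests, BeautifulSoup, networkx) are illustrative assumptions.

```python
# Minimal sketch of hyperlink-network extraction in the spirit described
# above (not the actual TENE implementation). Assumes the requests,
# beautifulsoup4 and networkx packages; seed URL and depth are placeholders.
from urllib.parse import urljoin, urlparse

import networkx as nx
import requests
from bs4 import BeautifulSoup


def outbound_links(url):
    """Return the set of external domains hyperlinked from a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    source_domain = urlparse(url).netloc
    domains = set()
    for tag in soup.find_all("a", href=True):
        target = urlparse(urljoin(url, tag["href"])).netloc
        if target and target != source_domain:
            domains.add(target)
    return domains


def crawl(seed_url, max_depth=1):
    """Build a directed graph of sites reachable from the seed website."""
    graph = nx.DiGraph()
    frontier = [(seed_url, 0)]
    visited = set()
    while frontier:
        url, depth = frontier.pop()
        domain = urlparse(url).netloc
        if domain in visited or depth > max_depth:
            continue
        visited.add(domain)
        try:
            targets = outbound_links(url)
        except requests.RequestException:
            continue
        for target in targets:
            graph.add_edge(domain, target)
            frontier.append(("http://" + target, depth + 1))
    return graph


if __name__ == "__main__":
    g = crawl("http://example.org", max_depth=1)  # placeholder seed URL
    print(g.number_of_nodes(), "sites,", g.number_of_edges(), "hyperlink ties")
```

On such a graph, the out-degree of the seed site and the density of the surrounding network correspond roughly to the abstract's notions of how extensive and how cohesive the virtual community is.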
Journal Article | Preliminary Analytical Considerations in Designing a Terrorism and Extremism Online Network Extractor
It is now widely understood that extremists use the Internet in attempts to accomplish many of their objectives. In this chapter we present a web-crawler called the Terrorism and Extremism Network Extractor (TENE), designed to gather information about extremist activities on the Internet. In particular, this chapter focuses on how TENE may help differentiate terrorist websites from anti-terrorist websites by analyzing the context around the use of predetermined keywords found within the text of the webpage. We illustrate our strategy through a content analysis of four types of websites. One is a popular white supremacist website, another is a jihadist website, the third is a terrorism-related news website, and the last is an official counterterrorist website. To explore differences between these websites, the presence of, and context around, 33 keywords was examined across all four websites. It was found that certain words appear more often on one type of website than the others, which may serve as a good method for differentiating between terrorist websites and ones that simply refer to terrorist activities. For example, words such as “terrorist,” “security,” “mission,” “intelligence,” and “report” all appeared with much greater frequency on the counterterrorist website than on the white supremacist or jihadist websites. In addition, the white supremacist and jihadist websites used words such as “destroy,” “kill,” and “attack” in a specific context: not to describe their activities or their members, but to portray themselves as victims. Future developments of TENE are discussed.
2014 | Bouchard, M., Joffres, K. and Frank, R.
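The keyword strategy described in the abstract above, counting predetermined keywords and examining the words around them, can be sketched as follows. This is not TENE's own code; the keyword list is only a subset of the 33 terms mentioned, and the tokenization and context-window size are illustrative assumptions.

```python
# Minimal sketch of keyword-frequency and context extraction of the kind
# described above; the keywords are a subset of the 33 terms mentioned,
# and the window size and tokenization are illustrative assumptions.
import re
from collections import Counter

KEYWORDS = {"terrorist", "security", "mission", "intelligence",
            "report", "destroy", "kill", "attack"}


def tokenize(text):
    """Lowercase the page text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())


def keyword_profile(text, window=5):
    """Count keyword occurrences and collect the words around each hit."""
    tokens = tokenize(text)
    counts = Counter()
    contexts = {kw: [] for kw in KEYWORDS}
    for i, token in enumerate(tokens):
        if token in KEYWORDS:
            counts[token] += 1
            contexts[token].append(tokens[max(0, i - window):i + window + 1])
    return counts, contexts


if __name__ == "__main__":
    sample = "The report describes an attack and the security mission."
    counts, contexts = keyword_profile(sample)
    print(counts.most_common())
    print(contexts["attack"])
```

Comparing such frequency profiles and context windows across pages is one simple way to operationalize the distinction the abstract draws between sites that use these words to describe their own activities and sites that merely report on terrorism.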
Journal Article | Discourse patterns used by extremist Salafists on Facebook: identifying potential triggers to cognitive biases in radicalized content
Understanding how extremist Salafists communicate, and not only what they communicate, is key to gaining insights into the ways they construct their social order and use psychological forces to radicalize potential sympathizers on social media. With a view to contributing to the existing body of research, which mainly focuses on terrorist organizations, we analyzed accounts that advocate violent jihad without supporting (at least publicly) any terrorist group and hence might be able to reach a large and not yet radicalized audience. We constructed a critical multimodal and multidisciplinary framework of discourse patterns that may work as potential triggers to a selection of key cognitive biases, and we applied it to a corpus of Facebook posts published by seven extremist Salafists. Results reveal how these posts are based either on an intense crisis construct (through negative outgroup nomination, intensification and emotion) or on simplistic solutions composed of taken-for-granted statements. Devoid of any grey zone, these posts do not seek to convince the reader; polarization is framed as a presuppositional established reality. These observations reveal that extremist Salafist communication is constructed in a way that may trigger specific cognitive biases, which are discussed in the paper.
2021 | Bouko, C., Naderer, B., Rieger, D., Van Ostaeyen, P. and Voué, P.
Journal Article | Facebook’s policies against extremism: Ten years of struggle for more transparency
For years, social media platforms, including Facebook, have been criticized for lacking transparency in their community standards, especially in terms of extremist content. Yet moderation is not an easy task, especially when extreme-right actors use content strategies that shift the Overton window (i.e., the range of ideas acceptable in public discourse) rightward. In a self-proclaimed search for more transparency, Facebook created its Transparency Center in May 2021. It has also regularly updated its community standards, and the Facebook Oversight Board has reviewed these standards based on concrete cases, published since January 2021. In this paper, we highlight how some longstanding issues regarding Facebook’s lack of transparency remain unaddressed in Facebook’s 2021 community standards, mainly in terms of the visual ‘representation’ of and endorsement from dangerous organizations and individuals. Furthermore, we also reveal how the Board’s lack of access to Facebook’s in-house rules exemplifies how the longstanding discrepancy between the public and the confidential levels of Facebook policies remains a current issue that might turn the Board’s work into a mere PR effort. In seeming to take as many steps toward shielding some information as it has toward exposing other information to the sunlight, Facebook’s efforts might turn out to be transparency theater.
2021 | Bouko, C., Van Ostaeyen, P. and Voué, P.
Journal Article | Exploring “Stormfront”: A Virtual Community of the Radical Right
In considering how terrorist movements use the Internet, it is becoming increasingly apparent that we must move beyond predominantly descriptive overviews of the contents of websites to examine in more detail the notion of virtual communities of support and the functions of these for their members. Virtual communities in support of terrorist movements are real social spaces where people interact on a regular basis to disseminate their views, share their knowledge, and encourage each other to become increasingly supportive of movements that use terrorism to achieve their goals. Taken from a larger body of comparative qualitative research investigating the content and function of discourses created in virtual communities in support of terrorism, this article presents a thematic analysis of “Stormfront,” a virtual community of the radical right.
2009 | Bowman-Grieve, L.