Journal Article |
Beyond Black and White: the Intersection of Ideologies in Online Extremist Communities
Current literature on online criminal and deviant groups recognises the role of online forums in the transfer of knowledge and socialisation of members, but current research lacks insight into the evolution and convergence of these groups. One area of concern is how different aspects of these ideologies, most notably misogyny, antisemitism and racism, are shared and developed between communities making up the manosphere and those dedicated to far-right themes. Current research has found overlaps in membership across these two online groups, with growing evidence linking members to online harassment and offline violent incidents (Farrell et al.; Regehr). To develop appropriate interventions to prevent such violent events, this research attempts to elucidate the different elements of the ideologies expressed in the online communities known collectively as the “manosphere”, by analysing the Cambridge Cybercrime Centre’s ExtremeBB dataset. This dataset includes approximately 46 million posts made by more than 315,000 registered active members on 12 different online extremist forums promoting misogyny and far-right extremism. To understand the interaction between far-right extremism and misogyny, we perform a qualitative analysis of a selection of posts already categorised by topic. Preliminary analyses show support for the following aspects: (a) similarities in radicalisation mechanisms, and (b) overlaps in the discourse on race and gender. These similarities provide potential gateways for previously isolated members to venture beyond their current association, suggesting the further adoption of extreme ideologies. Such a process, known as radicalisation, is highly correlated with extremism and terrorism (Borum; McCauley & Moskalenko). Findings from this research will allow for more precise interventions.
|
2023 |
Chua, Y.T. and Wilson, L. |
|
Journal Article |
Crowdsourcing geographic information for terrorism-related disaster awareness and mitigation: perspectives and challenges
This systematic review explores the utilization of crowdsourcing for geoinformation in enhancing awareness and mitigating terrorism-related disasters. Out of 519 studies identified in the database search, 108 were deemed eligible for analysis. We focused on articles employing various forms of crowdsourcing platforms, such as Twitter (now known as X), Facebook, and Telegram, across three distinct phases of terrorism-related disasters: monitoring and detection, onset, and post-incident analysis. Notably, we placed particular emphasis on the integration of Machine Learning (ML) algorithms in studying crowdsourced terrorism geoinformation to assess the current state of research and propose future directions. The findings revealed that Twitter emerged as the predominant crowdsourcing platform for terrorism-related information. Despite the prevalence of natural language processing for data mining, the majority of studies did not incorporate ML algorithms in their analyses. This preference for qualitative research methods can be attributed to the multifaceted nature of terrorism, spanning security, governance, politics, religion, and law. Our advocacy is for increased studies from the domains of geography, earth observation, and big data. Simultaneously, we encourage advancements in existing ML algorithms to enhance the accurate real-time detection of planned and onset terrorism disasters.
|
2025 |
Chukwu, M., Huang, X., Wang, S., Yang, D. and Ye, X. |
|
Policy |
What to Do about the Emerging Threat of Censorship Creep on the Internet
Popular tech companies—Google, Facebook, Twitter, and others—have strongly protected free speech online, a policy widely associated with the legal norms of the United States. American tech companies, however, operate globally, and their platforms are subject to regulation by the European Union, whose member states offer less protection to expression than does the United States. European regulators are pressuring tech companies to control and suppress extreme speech. The regulators’ clear warning is that, if the companies do not comply “voluntarily,” they will face harsher laws and potential liability. This regulatory effort runs the risk of censorship creep, whereby a wide array of protected speech, including political criticism and newsworthy content, may end up being removed from online platforms on a global scale.
|
2017 |
Citron, D.K. |
|
Letter |
Open letter on behalf of civil society groups regarding the proposal for a Regulation on Terrorist Content Online
The undersigned human rights and digital rights organizations call on the participants of the trialogue meeting on the Proposal for a Regulation of the European Parliament and of the Council on preventing/addressing the dissemination of terrorist content online to comply with the Charter of Fundamental Rights and to discuss further amendments that fully respect freedom of expression, freedom of information and the personal data protection of internet users.
|
2020 |
Civil Liberties Union for Europe |
|
Journal Article |
Not that lonely! assessing the “socialization” role of online environment in the radicalization process of lone wolves
This study examines lone wolf attacks, a distinct form of terrorism, through contemporary dynamics. Broadly, lone wolves are defined as individuals who experience their radicalization processes independently and are often associated with self-radicalization. However, it would be inaccurate to assume that lone wolves are entirely isolated from social processes. Although they do not act on directives when planning and carrying out their attacks, it is essential to recognize their interactions with others. In particular, the relationships they establish during their radicalization process, alongside personal motivations, are significant. The study explores the idea that the traditional group dynamics crucial to radicalization are, for lone wolves, facilitated through online environments. The argument that online platforms serve as a socialization milieu for radicalization forms the core contribution and purpose of this research. Accordingly, the study analyzes how online activities influence the radicalization of lone wolves and evaluates strategies to counteract this phenomenon. Lastly, it engages in a discussion of the relationship between lone wolf terrorism and traditional organizational terrorism.
|
2025 |
Çıtak, E. |
|
Journal Article |
Behind Blue Skies: A Multimodal Automated Content Analysis of Islamic Extremist Propaganda on Instagram
Social media platforms, such as Instagram, are regularly misused for spreading covert (Islamic) extremist propaganda. Affect and emotion are central tools used in extremist propaganda, but there is little research into the combined employment of different social media elements, such as hashtags, visuals, and texts, in the context of propaganda. This study contributes to closing this gap. Using the German group Generation Islam as a case study, we examined the group’s Instagram activity (N = 1,187 posts) over the course of 2 years. To reflect the platform users’ logic, we (a) examined affect in hashtag networks in which users can come across propagandistic content, (b) employed deep learning to examine the emotional valence transmitted in the visuals, and (c) used automated linguistic analysis to describe collective action cues contained within the texts. The results are novel, as they provide nuanced insights into extremist propaganda’s employment of affect and emotions across Instagram’s affordances.
|
2023 |
Clever, L., Schatto-Eckrodt, T., Clever, N.C. and Frischlich, L. |
|