Journal Article
Cyber-routines, Political Attitudes, and Exposure to Violence-Advocating Online Extremism
The Internet’s relatively unfettered transmission of information risks exposing individuals to extremist content. Using online survey data (N = 768) of American youth and young adults, we examine factors that bring individuals into contact with online material advocating violence. Combining aspects of social structure-social learning theory with insights from routine activity theory, we find that exposure to violence-advocating materials is positively correlated with online behaviors, including the use of social media platforms and the virtual spaces individuals frequent. Target antagonism is also correlated with exposure to violence-advocating materials, but guardianship and online and offline associations are not. Finally, feelings of dissatisfaction with major social institutions and economic disengagement are associated with exposure to violent materials online.
2019 | Hawdon, J., Bernatzky, C. and Costello, M.
Journal Article
Prototype and Analytics for Discovery and Exploitation of Threat Networks on Social Media
Identifying and profiling threat actors are high-priority tasks for a number of governmental organizations. These threat actors may operate actively, using the Internet to promote propaganda, recruit new members, or exert command and control over their networks. Alternatively, threat actors may operate passively, demonstrating operational security awareness online while using their Internet presence to gather the information they need to pose an offline physical threat. This paper presents a flexible new prototype system that allows analysts to automatically detect, monitor and characterize threat actors and their networks using publicly available information. The proposed prototype system fills a need in the intelligence community for a capability to automate the manual construction and analysis of online threat networks. Leveraging graph sampling approaches, we perform targeted data collection of extremist social media accounts and their networks. We design and incorporate new algorithms for role classification and radicalization detection, drawing on insights from the social science literature on extremism. Additionally, we develop and implement analytics to facilitate monitoring of these dynamic social networks over time. The prototype also incorporates several novel machine learning algorithms for threat actor discovery and characterization, such as classification of user posts into discourse categories, summarization of user posts and gender prediction.
2019 | Simek, O., Shah, D. and Heier, A.
Journal Article
Challenges and Frontiers in Abusive Content Detection
Online abusive content detection is an inherently difficult task. It has received considerable attention from academia, particularly within the computational linguistics community, and performance appears to have improved as the field has matured. However, considerable challenges and unaddressed frontiers remain, spanning technical, social and ethical dimensions. These issues constrain the performance, efficiency and generalizability of abusive content detection systems. In this article we delineate and clarify the main challenges and frontiers in the field, critically evaluate their implications and discuss potential solutions. We also highlight ways in which social scientific insights can advance research. We discuss the lack of support given to researchers working with abusive content and provide guidelines for ethical research.
2019 | Vidgen, B., Harris, A., Nguyen, D., Tromble, R., Hale, S. and Margetts, H.
Report
Hate Speech and Radicalisation Online: The OCCI Research Report
The research series Hate Speech and Radicalisation on the Internet provides interdisciplinary insights into current developments in extremist activity on the internet. With the aid of expert contributions from across Germany, the psychological, political, anthropological and technological aspects of online hate speech and radicalisation are considered, and recommendations are made for political leaders, social media platforms, NGOs and activists.
2019 | Baldauf, J., Ebner, J. and Guhl, J. (Eds.)
Journal Article
Misogynistic Men Online: How the Red Pill Helped Elect Trump
Donald Trump’s 2016 electoral victory was a shock for feminist scholars, yet it was no surprise to his legion of supporters in alt-right digital spaces. In this essay, we analyze one of the online forums that helped propel Trump to electoral victory. Drawing on social movement concepts and an analysis of 1,762 posts, we show how leaders of the forum the “Red Pill” were able to move a community of adherents from understanding men’s rights as a personal philosophy to political action. This transition was no small endeavor. The Red Pill forum was explicitly apolitical until the summer before the 2016 election. During the election, forum leaders linked the forum’s neoliberal, misogynistic collective identity of alpha masculinity to Trump’s public persona and framed his political ascendance as an opportunity to effectively push back against feminism and get a “real” man into the White House. We argue that while previous research shows the importance of alt-right virtual spaces in creating and maintaining racist collective identities, we know very little about how men conceptualize gender in ways that inform their personal and political action—and this is to our detriment. We conclude the essay by arguing that feminists need to understand how men cultivate extreme personal and political identities in online forums so that we can better understand how new technologies are used to move individuals from the armchair to the streets.
2019 | Dignam, A.P. and Rohlinger, D.A.
Journal Article
“Deplorable” Satire: Alt-Right Memes, White Genocide Tweets, and Redpilling Normies
In the past decade, people associated with what is known as the alt-right have employed a strategy similar to that of progressive, antiracist satirists to advance a decidedly white supremacist, anti-Semitic, misogynist, and deadly serious agenda. As this article documents, the alt-right weaponizes irony to attract and radicalize potential supporters, challenge progressive ideologies and institutions, redpill normies, and create a toxic counterpublic. Discussing examples of satiric irony generated by the extreme right alongside those produced by the (often mainstream) left, this article pairs two satirical memes, two activists’ use of irony, two ambiguously satirical tweets, and two recent controversies pertaining to racism and satire so as to illustrate how people with very different political commitments employ a similar style with potent effects. Of particular significance are reverse racism discourses, including “white genocide,” and the increasingly complicated relationship between intentions, extremism, and satire.
2019 | Greene, V.S.