Journal Article | 2019
“You Know What to Do”: Proactive Detection of YouTube Videos Targeted by Coordinated Hate Attacks
Mariconti, E., Suarez-Tangil, G., Blackburn, J., De Cristofaro, E., Kourtellis, N., Leontiadis, I., Serrano, J.L. and Stringhini, G.
Abstract: Video sharing platforms like YouTube are increasingly targeted by aggression and hate attacks. Prior work has shown that these attacks often take place as a result of “raids,” i.e., organized efforts by ad-hoc mobs coordinating from third-party communities. Despite the growing relevance of this phenomenon, online services often lack effective countermeasures to mitigate it. Unlike well-studied problems such as spam and phishing, coordinated aggressive behavior both targets and is perpetrated by humans, making defense mechanisms that look for automated activity unsuitable. Therefore, the de facto solution is to rely reactively on user reports and human moderation. In this paper, we propose an automated solution to identify YouTube videos that are likely to be targeted by coordinated harassers from fringe communities like 4chan. First, we characterize and model YouTube videos along several axes (metadata, audio transcripts, thumbnails) based on a ground-truth dataset of videos that were targeted by raids. Then, we use an ensemble of classifiers to determine the likelihood that a video will be raided, achieving strong results (AUC up to 94%). Overall, our work provides an important first step towards deploying proactive systems to detect and mitigate coordinated hate attacks on platforms like YouTube.
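The abstract above describes combining per-modality video features (metadata, audio transcripts, thumbnails) through an ensemble of classifiers. As a rough illustration only, the sketch below shows one common way such an ensemble could be wired up: a soft-voting average over separate per-modality classifiers. The feature matrices, model choices, and labels here are placeholder assumptions, not the paper's actual pipeline or data.

```python
# Hypothetical sketch of a soft-voting ensemble over per-modality features.
# All data below is synthetic placeholder material for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Stand-in feature blocks, one per modality
# (e.g. metadata statistics, transcript embeddings, thumbnail embeddings).
n_videos = 1000
X_meta = rng.normal(size=(n_videos, 10))    # video metadata features (assumed)
X_text = rng.normal(size=(n_videos, 50))    # audio-transcript features (assumed)
X_thumb = rng.normal(size=(n_videos, 32))   # thumbnail features (assumed)
y = rng.integers(0, 2, size=n_videos)       # 1 = video was raided (placeholder labels)

idx_train, idx_test = train_test_split(np.arange(n_videos), test_size=0.3, random_state=0)

# One classifier per modality; their predicted probabilities are averaged.
models = [
    (X_meta, LogisticRegression(max_iter=1000)),
    (X_text, RandomForestClassifier(n_estimators=200, random_state=0)),
    (X_thumb, RandomForestClassifier(n_estimators=200, random_state=0)),
]

probs = []
for X, clf in models:
    clf.fit(X[idx_train], y[idx_train])
    probs.append(clf.predict_proba(X[idx_test])[:, 1])

ensemble_score = np.mean(probs, axis=0)     # soft-voting combination
print("AUC:", roc_auc_score(y[idx_test], ensemble_score))
```

Averaging per-modality probabilities keeps each feature family's classifier independent, which is one standard way to build such an ensemble; the published system may combine the evidence differently.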
Book | 2019
Islamic State’s Online Activity and Responses
Conway, M. and Macdonald, S.
Abstract: ‘Islamic State’s Online Activity and Responses’ provides a unique examination of Islamic State’s online activity at the peak of its “golden age” between 2014 and 2017 and evaluates some of the principal responses to this phenomenon. Featuring contributions from experts across a range of disciplines, the volume examines a variety of aspects of IS’s online activity, including their strategic objectives, the content and nature of their magazines and videos, and their online targeting of females and depiction of children. It also details and analyses responses to IS’s online activity – from content moderation and account suspensions to informal counter-messaging and disrupting terrorist financing – and explores the possible impact of technological developments, such as decentralised and peer-to-peer networks, going forward. Platforms discussed include dedicated jihadi forums, major social media sites such as Facebook, Twitter, and YouTube, and newer services, including Twister.
‘Islamic State’s Online Activity and Responses’ is essential reading for researchers, students, policymakers, and all those interested in the contemporary challenges posed by online terrorist propaganda and radicalisation. The chapters were originally published as a special issue of Studies in Conflict & Terrorism.
Journal Article | 2020
Weaponizing white thymos: flows of rage in the online audiences of the alt-right
Ganesh, B.
Abstract: The alt-right is a growing radical right-wing network that is particularly effective at mobilizing emotion through digital communications. Introducing ‘white thymos’ as a framework to theorize the role of rage, anger, and indignation in alt-right communications, this study argues that emotive communication connects alt-right users and mobilizes white thymos to the benefit of populist radical right politics. By combining linguistic, computational, and interpretive techniques on data collected from Twitter, this study demonstrates that the alt-right weaponizes white thymos in three ways: visual documentation of white victimization, processes of legitimization of racialized pride, and reinforcement of the rectitude of rage and indignation. The weaponization of white thymos is then shown to be central to the culture of the alt-right and its connectivity with populist radical right politics.
Report | 2019
Digital Jihad: Online Communication and Violent Extremism
Marone, F. (Ed.)
Abstract: The internet offers tremendous opportunities for violent extremists across the ideological spectrum and at a global level. In addition to propaganda, digital technologies have transformed the dynamics of radical mobilisation, recruitment and participation. Even though the jihadist threat has seemingly declined in the West, the danger remains that the internet will serve as an environment where radical messages can survive and even prosper. Against this background, this ISPI report investigates the current landscape of jihadist online communication, including original empirical analysis. Specific attention is also placed on potential measures and initiatives to address the threat of online violent extremism. The volume aims to present important points for reflection on the phenomenon in the West (including Italy) and beyond.
Journal Article | 2020
Understanding the Incel Community on YouTube
Papadamou, K., Zannettou, S., Blackburn, J., De Cristofaro, E., Stringhini, G. and Sirivianos, M.
Abstract: YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform also hosts inappropriate, toxic, and/or hateful content. One community that has come into the spotlight for sharing and publishing hateful content is the so-called Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men’s issues that has often been linked to misogynistic views. In this paper, we set out to analyze the Incel community on YouTube. We collect videos shared on Incel-related communities within Reddit and perform a data-driven characterization of the content posted on YouTube along several axes. Among other things, we find that the Incel community on YouTube is growing rapidly, that its members post a substantial number of negative comments, and that they discuss a broad range of topics, ranging from ideology, e.g., around the Men Going Their Own Way movement, to discussions filled with racism and/or misogyny. Finally, we quantify the probability that a user will encounter an Incel-related video by virtue of YouTube’s recommendation algorithm. Starting from a non-Incel-related video, this probability is 1 in 5 within five hops, which is alarmingly high given the toxicity of said content.
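The hop-probability finding above (roughly a 1-in-5 chance of reaching Incel-related content within five recommendation hops) is the kind of figure one can estimate with random walks over a video recommendation graph. The sketch below uses a toy graph and an assumed label set purely for illustration; it is not the study's dataset or exact methodology.

```python
# Hypothetical sketch: estimating the chance that a walker following
# recommendations reaches a labelled ("Incel-related") video within k hops.
# The graph and label set are toy placeholders, not the study's data.
import random

# recommendation_graph: video ID -> list of recommended video IDs (assumed structure)
recommendation_graph = {
    "v0": ["v1", "v2"], "v1": ["v3", "v4"], "v2": ["v4", "v5"],
    "v3": ["v0"], "v4": ["v5", "v6"], "v5": ["v6"], "v6": ["v0"],
}
incel_related = {"v6"}  # labelled videos (assumed)

def hit_within_k_hops(start: str, k: int) -> bool:
    """Walk up to k recommendation hops from `start`, picking uniformly at random."""
    current = start
    for _ in range(k):
        neighbours = recommendation_graph.get(current, [])
        if not neighbours:
            return False
        current = random.choice(neighbours)
        if current in incel_related:
            return True
    return False

def estimate_hit_probability(start: str, k: int = 5, walks: int = 10_000) -> float:
    """Fraction of random walks from `start` that reach labelled content within k hops."""
    hits = sum(hit_within_k_hops(start, k) for _ in range(walks))
    return hits / walks

print(estimate_hit_probability("v0"))
```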
Journal Article | 2017
Extreme Speech Online: An Anthropological Critique of Hate Speech Debates
Pohjonen, M. and Udupa, S.
Abstract: Exploring the cases of India and Ethiopia, this article develops the concept of “extreme speech” to critically analyze the cultures of vitriolic exchange on Internet-enabled media. While online abuse is largely understood as “hate speech,” we make two interventions to problematize the presuppositions of this widely invoked concept. First, extreme speech emphasizes the need to contextualize online debate with an attention to user practices and particular histories of speech cultures. Second, related to context, is the ambiguity of online vitriol, which defies a simple antonymous conception of hate speech versus acceptable speech. The article advances this analysis using the approach of “comparative practice,” which, we suggest, complicates the discourse of Internet “risk” increasingly invoked to legitimate online speech restrictions.