VOX-Pol Blog | Challenges of Using Twitter as a Data Source: An Overview of Current Resources | 2016 | Ahmed, W.

Journal Article | Challenges of Deplatforming Extremist Online Movements: A Machine-Learning Approach | 2023 | Lupu, Y., Sear, R., Restrepo, N.J., Velásquez, N., Leahy, R., Goldberg, B. and Johnson, N.F.
Abstract: Online extremist movements are increasingly using social media communities to share content, spread their ideologies, recruit members, and mobilize offline activities. In recent years, mainstream platforms, including Twitter and Facebook, have adopted policies to remove or deplatform some of these movements. Yet online extremists are well-known for their abilities to adapt, self-censor, and migrate across online platforms. How successful have these extremist movement deplatformings been? To answer this question, we begin by training a classifier to identify content generated by four prominent extremist movements: white supremacists, patriot/militia groups, QAnon, and Boogaloos. After doing so, we use this classifier to analyze approximately 12 million posts generated by about 1500 online hate communities across 8 social media platforms, including both mainstream and alternative platforms. We find that the deplatformings of Boogaloos and QAnon by mainstream platforms were initially highly successful, but that both movements were able to find ways to re-introduce their content on these platforms. These findings highlight the challenges of movement-based deplatforming, and they point toward important implications for content moderation.

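The abstract above outlines a movement-classification pipeline: train a supervised text classifier on posts from known movements, then apply it at scale across platforms and over time. Purely as an illustration of that general approach, and not the authors' actual model, a minimal sketch using scikit-learn might look as follows; the training posts, labels, and model parameters below are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal text classifier of the general kind the
# abstract describes (labelling posts by movement). It is NOT the authors'
# model; the data, labels, and parameters are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training posts, one movement label per post.
train_texts = [
    "hypothetical post associated with movement A ...",
    "another hypothetical post associated with movement A ...",
    "hypothetical post associated with movement B ...",
    "another hypothetical post associated with movement B ...",
]
train_labels = ["qanon", "qanon", "boogaloo", "boogaloo"]

# TF-IDF features plus logistic regression: a simple, standard baseline pipeline.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_texts, train_labels)

# Once fitted, such a classifier could be run over a large unlabelled corpus
# (e.g. posts gathered before and after a deplatforming event) to estimate how
# much movement-related content remains on each platform over time.
new_posts = ["unlabelled post collected from some platform ..."]
print(clf.predict(new_posts))
```

A TF-IDF plus logistic-regression baseline is used here only because it is simple and self-contained; the study itself may rely on a very different model.
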
Journal Article | Challenges and Frontiers in Abusive Content Detection | 2019 | Vidgen, B., Harris, A., Nguyen, D., Tromble, R., Hale, S. and Margetts, H.
Abstract: Online abusive content detection is an inherently difficult task. It has received considerable attention from academia, particularly within the computational linguistics community, and performance appears to have improved as the field has matured. However, considerable challenges and unaddressed frontiers remain, spanning technical, social and ethical dimensions. These issues constrain the performance, efficiency and generalizability of abusive content detection systems. In this article we delineate and clarify the main challenges and frontiers in the field, critically evaluate their implications and discuss potential solutions. We also highlight ways in which social scientific insights can advance research. We discuss the lack of support given to researchers working with abusive content and provide guidelines for ethical research.

Journal Article | Censoring Extremism: Influence of Online Restriction on Official Media Products of ISIS | 2021 | McMinimy, K., Winkler, C.K., Lokmanoglu, A.D. and Almahmoud, M.
Abstract: Recognizing that militant, non-state groups utilize social media and online platforms to reach members, sympathizers, and potential recruits, state agencies and social media corporations now increasingly regulate access to accounts affiliated with such groups. Scholars examining deplatforming efforts have, to date, focused on the extent of audience loss after account restrictions and the identification of strategies for regrouping online followers on the same or different platforms over time. Left unexplored is if and how militant non-state groups adapt their official messaging strategies in response to platform restrictions despite continuing online access to them. To begin to fill that gap, this study compares 550 images displayed in ISIS’s official newsletter, al-Naba, in the six months before and after Europol’s November 2019 take-down of terrorist-affiliated accounts, groups, channels, and bots on Telegram. It conducts a content analysis of images related to militaries and their outcomes, non-military activities and their outcomes, and presentational forms. The findings demonstrate that ISIS visually emphasizes its standard priming approach but shifts its agenda-setting strategy. While retaining some of its standard visual framing practices, the group also alters frames, particularly those related to images showing opposing militaries and military outcomes.

Report | Caught In The Net: The Impact Of “Extremist” Speech Regulations On Human Rights Content | 2019 | Jaloud, A. R. A., Al Khatib, K., Deutch, J., Kayyali, D. and York, J. C.
Abstract: Social media companies have long struggled with what to do about extremist content on their platforms. While most companies include provisions about “extremist” content in their community standards, until recently such content was often vaguely defined, giving policymakers and content moderators wide latitude in determining what to remove and what to allow. Unfortunately, companies have responded with overbroad and vague policies and practices that have led to mistakes at scale, decimating human rights content.

Journal Article | Catch 22: Institutional ethics and researcher welfare within online extremism and terrorism research | 2025 | Whittaker, J., Pearson, E., Mattheis, A.A., Baaken, T., Zeiger, S., Atamuradova, F. and Conway, M.
Abstract: Drawing from interviews with 39 online extremism and terrorism researchers, this article provides an empirical analysis of these researchers’ experiences with institutional ethics processes. It discusses the harms that these researchers face in the course of their work, including trolling, doxing, and mental and emotional trauma arising from exposure to terrorist content, all of which highlight the need for an emphasis on researcher welfare. We find, however, that researcher welfare is a neglected aspect of ethics review processes: most interviewees were not required to gain ethics approval for their research, resulting in very little attention to researcher welfare issues. Interviewees were frustrated with ethics processes, indicating that committees often lacked the requisite knowledge to make informed ethical decisions. Interviewees also raised the concern that greater emphasis on researcher welfare could create blockages to their ‘risky’ research, producing a ‘Catch 22’: interviewees would like more emphasis on their (and colleagues’) welfare and the provision of concomitant supports, but feel that increased oversight would make gaining ethics approval for their research more difficult, or even impossible. We offer suggestions for breaking the impasse, including more interaction between ethics committees and researchers, the development of tailored guidelines, and more case studies reflecting on ethics processes.