Journal Article
Politicization and Right-Wing Normalization on YouTube: A Topic-Based Analysis of the “Alternative Influence Network”
Scholarship has highlighted the rise of political influencer networks on YouTube, raising concerns about the platform’s propensity to spread and even incentivize politically extreme content. While many studies have focused on YouTube’s algorithmic infrastructure, limited research exists on the actual content in these networks. Building on Lewis’s (2018) classification of an “alternative influencer” network, we apply structural topic modeling across all text-based autocaptions from her study’s sample to identify common topics featured on these channels. This allows us to gauge which topics appear together and to trace politicization over time. Through network analysis, we determine channel similarities and evaluate whether deplatformed channels influenced topic shifts. We find that political topics increasingly dominate the focus of all analyzed channels. The convergence of culture and politics centers mostly on identity-driven issues. Furthermore, more extreme channels do not form distinct clusters but blend into the larger content-based network. Our findings illustrate how political topics may function as connective ties across an initially more diverse network of YouTube influencer channels.
2023 | Knüpfer, C.B., Schwemmer, C. and Heft, A.
Chapter
The Future of Counterspeech: Effective Framing, Targeting, and Evaluation
Approaches for strategically countering or providing alternatives to hate speech and extremism online have evolved substantively in the last ten years. Technological advancement and a generation of young activists who have been socialized as digital natives have facilitated a maelstrom of both hate-based extremist content and attempts to counter this material in different guises and through diverse channels. The rate and pace of change within the tech sector, and social media growth in particular, have meant that although counterspeech is now more prevalent than ever before, it requires greater guidance and more robust public–private partnerships to effectively prevent and counter extremism online. The chapter provides a cross-platform and international overview of some of the best practices within efforts to prevent and counter violent extremism online and discusses the future of counterspeech with recommendations for expanded innovation and partnership models.
2023 | Saltman, E. and Zamir, M.
Journal Article
Auditing Elon Musk’s Impact on Hate Speech and Bots
On October 27, 2022, Elon Musk purchased Twitter, becoming its new CEO and firing many top executives in the process. Musk listed fewer restrictions on content moderation and removal of spam bots among his goals for the platform. Given the findings of prior research on moderation and hate speech in online communities, the promise of less strict content moderation poses the concern that hate will rise on Twitter. We examine the levels of hate speech and prevalence of bots before and after Musk’s acquisition of the platform. We find that hate speech rose dramatically after Musk purchased Twitter and that the prevalence of most types of bots increased, while the prevalence of astroturf bots decreased.
2023 | Hickey, D., Schmitz, M., Fessler, D., Smaldino, P.E., Muric, G. and Burghardt, K.
Journal Article
Moderating borderline content while respecting fundamental values
As efforts to identify and remove online terrorist and violent extremist content have intensified, concern has also grown about so-called lawful but awful content. Various options have been touted for reducing the visibility of this borderline content, including removing it from search and recommendation algorithms, downranking it and redirecting those who search for it. This article contributes to this discussion by considering the moderation of such content in terms of three sets of values. First, definitional clarity. This is necessary to provide users with fair warning of what content is liable to moderation and to place limits on the discretion of content moderators. Yet, at present, definitions of borderline content are vague and imprecise. Second, necessity and proportionality. While downranking and removal from search and recommender algorithms should be distinguished from deplatforming, tech companies’ efforts to deamplify borderline content give rise to many of the same concerns as content removal and account shutdowns. Third, transparency. While a number of platforms now publish their content moderation policies and transparency data reports, these largely focus on violative, not borderline content. Moreover, there remain questions around access to data for independent researchers and transparency at the level of the individual user.
2023 | Macdonald, S. and Vaughan, K.
Journal Article
Mapping a Dark Space: Challenges in Sampling and Classifying Non-Institutionalized Actors on Telegram
Crafted as an open communication platform characterized by high anonymity and minimal moderation, Telegram has garnered increasing popularity among activists operating within repressive political contexts, as well as among political extremists and conspiracy theorists. While Telegram offers valuable data access for researching non-institutionalized activism, scholars studying such activism on Telegram face unique theoretical and methodological challenges in systematically defining, selecting, sampling, and classifying relevant actors and content. This literature review addresses these issues by considering a wide range of recent research. In particular, it discusses the methodological challenges of sampling and classifying heterogeneous groups of (often non-institutionalized) actors. Drawing on social movement research, we first identify challenges specific to the characteristics of non-institutionalized actors and how they become interlaced with Telegram’s platform infrastructure and requirements. We then discuss strategies from previous Telegram research for the identification and sampling of a study population through multistage sampling procedures and the classification of actors. Finally, we outline challenges and potential strategies for future research and discuss ethical considerations.
2023 | Jost, P., Heft, A., Buehling, K., Zehring, M., Schulze, H., Bitzmann, H. and Domahidi, E.
Journal Article
Media and terrorism in Africa: Al-Shabaab’s evolution from militant group to media mogul
It is no surprise to come across information or videos on social or mainstream media that were posted by a terrorist organisation like Al-Shabaab. In this regard, researchers have attempted to answer the question of what terrorist organisations aim to achieve by gaining a strong foothold in cyberspace. This article explores the evolution of Al-Shabaab in terms of its media usage and presence—from a local insurgency using magazines and radio stations, to what can be described as a media mogul in Africa. The author explores how and why this group chooses to pursue a strong cyber presence, and what, if anything, Africa and the international community can do about it.
2023 | Grobbelaar, A.