Report | Recommending Toxicity: The role of algorithmic recommender functions on YouTube Shorts and TikTok in promoting male supremacist influencers
This study tracked, recorded and coded the content recommended to 10 experimental or ‘sockpuppet’ accounts on 10 blank smartphones, 5 on YouTube Shorts and 5 on TikTok. On each platform, we set up 5 types of accounts: one 16-year-old boy and one 18-year-old boy who sought out content typically associated with gender-normative young men (e.g. gym content, sports, video games), one 16-year-old boy and one 18-year-old boy who actively sought out content associated with the manosphere (e.g. Andrew Tate, anti-feminist content), and one blank control account that did not deliberately seek out or engage with any particular content. The purpose of this research was to simulate and explore the digital reality of boys and young men using TikTok and YouTube Shorts, the group most likely to be targeted by the manosphere.
2024 | Baker, C., Ging, D. and Andreasen, M.B.
Journal Article | Recommender systems and the amplification of extremist content
Policymakers have recently expressed concerns over recommendation algorithms and their role in forming “filter bubbles”. This is a particularly pressing concern in the context of extremist content online: these algorithms may promote extremist content at the expense of more moderate voices. In this article, we make two contributions to this debate. Firstly, we provide a novel empirical analysis of three platforms’ recommendation systems when interacting with far-right content. We find that one platform, YouTube, does amplify extreme and fringe content, while two, Reddit and Gab, do not. Secondly, we contextualise these findings within the regulatory debate. There are currently few policy instruments for dealing with algorithmic amplification, and those that do exist largely focus on transparency. We argue that policymakers have yet to fully understand the problems inherent in “de-amplifying” legal, borderline content, and that a co-regulatory approach may offer a route towards tackling many of these challenges.
2021 | Whittaker, J., Looney, S., Reed, A. and Votta, F.
VOX-Pol Blog | Reclaiming Our Narratives: A Needs-Based Approach to Countering Extremist Disinformation Online
2025 | Kruglova, A. and White, B.
Journal Article | Reclaim the Beach: How Offline Events Shape Online Interactions and Networks Amongst Those Who Support and Oppose Right-Wing Protest
In this paper we examine how offline protests attended by members of the Australian far-right shape online interactions. Tweets about the 2019 St Kilda beach rally were collected, and users were manually classified as supporters (n = 104) or opponents (n = 872) of the rally. Network analysis demonstrated that interactions between the two groups increased at the time of the rally. Natural language processing showed that both groups became angrier and used more “othering” language during the rally. However, there were stark differences between the two groups’ moral worldviews, highlighting the very different moral positions that underpin engagement with, and opposition to, the far-right agenda.
2022 | Thomas, E.F., Leggett, N., Kernot, D., Mitchell, L., Magsarjav, S. and Weber, N.
Report | Rechtsterrorismus im digitalen Zeitalter [Right-Wing Terrorism in the Digital Age]
Right-wing terrorism has arrived in the digital age. From Christchurch to El Paso, new forms of right-wing violence have taken hold whose perpetrators are rooted more in digital subcultures than in right-wing extremist organisations. The radicalising tendencies of obscure online communities are therefore coming into sharper research focus and are challenging our understanding of right-wing terror. How, then, is right-wing terrorism changing in the digital age? In this contribution, we explore this question by pointing to the relationship between digital hate cultures and right-wing terrorist violence. We argue that any analysis of these violent acts cannot do without an understanding of digital hate cultures, which normalise misanthropy through ironic communication formats. Out of these cultures emerges a right-wing terrorist subculture that takes up the ambivalent products of digital cultures and combines them with the violence-glorifying content of neo-Nazism in order to achieve one thing: to spur people to violence.
2020 | Albrecht, S. and Fielitz, M.
Report | Reception and Perception of Radical Messages
This report represents a first contribution by the Samir Kassir Foundation (SKF) to the ongoing and growing debate on the role of communication in the radicalisation process and on mechanisms to prevent or counter violent extremism (CVE). The primary focus of this research was communication by and about the Islamic State; it did not include communication by and about militant Islamist organisations from other ideological and sectarian backgrounds. It is based on qualitative opinion and media consumption research conducted in February and March 2016 with Lebanese audiences in Tripoli, North Lebanon and West Bekaa, and among Syrian refugees, with the support of the Ministry of Foreign Affairs of the Netherlands under contract No. 28141. The project was implemented by a steering committee led by academic and policy consultant Drew Mikhael and comprising SKF Executive Director Ayman Mhanna, SKF Programs Coordinator Nassim AbiGhanem, academic and senior researcher Nidal Ayoub, and social media communication specialist Marie-Thérèse Corbani. The contents of this report are the sole responsibility of the Samir Kassir Foundation and can in no way be taken to reflect the views of the Ministry of Foreign Affairs of the Netherlands.
2016 | Mikhael, D., Mhanna, A., Ayoub, N., AbiGhanem, N. and Corbani, M.