Report | Hate Crime: Abuse, Hate and Extremism Online
We announced this inquiry into hate crime and its violent consequences in early July 2016. Our decision to undertake the inquiry followed the murder of Jo Cox MP in June in the lead-up to the EU referendum. There was also evidence of an increase in the number of attacks on people from ethnic minorities and of non-British nationality, including on their community centres and places of worship, immediately following the referendum. In addition, our inquiry into antisemitism was already under way, which was raising serious questions about how to address wider issues around the actions of those holding extremist or fixated views. It therefore seemed particularly timely and necessary to launch this inquiry. We have received a large volume of written evidence. We have taken oral evidence on a wide range of issues including Islamophobia, misogyny, far-right extremism, the role of social media in hate crime and the particular issues faced by Members of Parliament in relation to hate crime and its violent manifestations. Our witnesses have included academics, community organisations, social media companies, police forces and their representative organisations, the principal Deputy Speaker of the House of Commons, and Ministers. We are grateful to everyone who has contributed to the inquiry.
2017 | House of Commons, Home Affairs Committee

Journal Article | Evaluating the scale, growth, and origins of right-wing echo chambers on YouTube
Although it is understudied relative to other social media platforms, YouTube is arguably the largest and most engaging online media consumption platform in the world. Recently, YouTube’s outsize influence has sparked concerns that its recommendation algorithm systematically directs users to radical right-wing content. Here we investigate these concerns with large-scale longitudinal data on individuals’ browsing behavior spanning January 2016 through December 2019. Consistent with previous work, we find that political news content accounts for a relatively small fraction (11%) of consumption on YouTube, and is dominated by mainstream and largely centrist sources. However, we also find evidence for a small but growing “echo chamber” of far-right content consumption. Users in this community show higher engagement and greater “stickiness” than users who consume any other category of content. Moreover, YouTube accounts for an increasing fraction of these users’ overall online news consumption. Finally, while the size, intensity, and growth of this echo chamber present real concerns, we find no evidence that they are caused by YouTube recommendations. Rather, consumption of radical content on YouTube appears to reflect broader patterns of news consumption across the web. Our results emphasize the importance of measuring consumption directly rather than inferring it from recommendations.
2020 | Hosseinmardi, H., Ghasemian, A., Clauset, A., Rothschild, D.M., Mobius, M. and Watts, D.J.

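The methodological point of the abstract above is to measure consumption directly from browsing logs rather than infer it from recommendations. Below is a minimal sketch of what that kind of measurement can look like; the log schema, the category labels, and the retention-based "stickiness" proxy are illustrative assumptions, not taken from the paper.

```python
# Sketch: measuring content consumption directly from (hypothetical) browsing logs.
import pandas as pd

# Hypothetical log: one row per video view.
log = pd.DataFrame({
    "user_id":  [1, 1, 1, 2, 2, 3],
    "month":    ["2019-01", "2019-01", "2019-02", "2019-01", "2019-02", "2019-02"],
    "category": ["far_right", "centrist", "far_right", "centrist", "centrist", "far_right"],
    "seconds":  [600, 300, 900, 400, 500, 200],
})

# Share of each user's monthly watch time spent on each content category.
watch = log.groupby(["user_id", "month", "category"])["seconds"].sum()
share = watch / watch.groupby(level=["user_id", "month"]).transform("sum")
print(share)

# Crude "stickiness" proxy: fraction of a category's consumers in one month
# who consume that category again in the following month.
months = sorted(log["month"].unique())
for a, b in zip(months, months[1:]):
    for cat in log["category"].unique():
        before = set(log[(log["month"] == a) & (log["category"] == cat)]["user_id"])
        after = set(log[(log["month"] == b) & (log["category"] == cat)]["user_id"])
        if before:
            print(cat, f"{a}->{b} retention: {len(before & after) / len(before):.2f}")
```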
Journal Article | Understanding Online Platform Usage of Extremist Groups via Graph Analytics
Graph analytics has become instrumental in uncovering insights across various domains, particularly in social networks, where it serves as a crucial tool for analyzing relationships between users on different online platforms. In this research, we apply methods of social network analysis to examine communication patterns among participants in an online forum recognized for far-right extremism. We identify the influential actors and map their relationships across 76 monthly networks, and we illustrate how the networks evolved over that period and how that evolution connects with significant events. The findings aim to clarify the nature of these interactions and networks, and to help practitioners take the precautions needed to mitigate far-right activity on various online platforms.
2025 | Hossain, T., Akbas, E., Lemieux, A.E. and Massignan, V.

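The abstract above describes identifying influential actors across monthly interaction networks. A minimal sketch of one common way to do this follows; the reply-log layout and the choice of weighted PageRank as the influence measure are assumptions for illustration, not the authors' stated method.

```python
# Sketch: monthly reply networks from forum posts, ranked by centrality.
import networkx as nx

# Hypothetical forum replies: (author, replied_to_author, month).
posts = [
    ("alice", "bob",   "2020-01"),
    ("bob",   "carol", "2020-01"),
    ("carol", "alice", "2020-01"),
    ("dave",  "alice", "2020-02"),
    ("bob",   "alice", "2020-02"),
]

# One directed, weighted network per month: an edge u -> v means u replied to v.
networks = {}
for author, target, month in posts:
    g = networks.setdefault(month, nx.DiGraph())
    if g.has_edge(author, target):
        g[author][target]["weight"] += 1
    else:
        g.add_edge(author, target, weight=1)

# High PageRank here marks users whom many others reply to, one plausible
# proxy for influence within the community.
for month, g in sorted(networks.items()):
    ranks = nx.pagerank(g, weight="weight")
    top = sorted(ranks, key=ranks.get, reverse=True)[:3]
    print(month, [(u, round(ranks[u], 2)) for u in top])
```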
VOX-Pol Blog | Arrested War: After Diffused War
2015 | Hoskins, A. and O’Loughlin, B.

Report | Does Platform Migration Compromise Content Moderation? Evidence from r/The_Donald and r/Incels
When toxic online communities on mainstream platforms face moderation measures, such as bans, they may migrate to other platforms with laxer policies or set up their own dedicated website. Previous work suggests that, within mainstream platforms, community-level moderation is effective in mitigating the harm caused by the moderated communities. It is, however, unclear whether these results also hold when considering the broader Web ecosystem. Do toxic communities continue to grow in terms of user base and activity on their new platforms? Do their members become more toxic and ideologically radicalized? In this paper, we report the results of a large-scale observational study of how problematic online communities progress following community-level moderation measures. We analyze data from r/The_Donald and r/Incels, two communities that were banned from Reddit and subsequently migrated to their own standalone websites. Our results suggest that, in both cases, moderation measures significantly decreased posting activity on the new platform, reducing the number of posts, active users, and newcomers. In spite of that, users in one of the studied communities (r/The_Donald) showed increases in signals associated with toxicity and radicalization, which justifies concerns that the reduction in activity may come at the expense of a more toxic and radical community. Overall, our results paint a nuanced portrait of the consequences of community-level moderation and can inform their design and deployment.
2020 | Horta Ribeiro, M., Jhaver, S., Zannettou, S., Blackburn, J., De Cristofaro, E., Stringhini, G. and West, R.

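The study's headline quantities are posts, active users, and newcomers before and after migration. A minimal sketch of that kind of before/after comparison follows; the post table, cutoff date, and column names are made up for illustration and are not the paper's data or pipeline.

```python
# Sketch: activity comparison around a (hypothetical) community-ban date.
import pandas as pd

posts = pd.DataFrame({
    "user": ["u1", "u2", "u1", "u3", "u1", "u4"],
    "date": pd.to_datetime([
        "2019-11-01", "2019-11-02", "2019-11-20",
        "2019-12-05", "2019-12-06", "2019-12-07",
    ]),
})
ban_date = pd.Timestamp("2019-11-30")  # hypothetical ban/migration date

posts["period"] = (posts["date"] >= ban_date).map({False: "before", True: "after"})

# Posts and distinct active users per period.
summary = posts.groupby("period").agg(n_posts=("user", "size"),
                                      active_users=("user", "nunique"))

# Newcomers: users whose first observed post falls after the ban.
first_seen = posts.groupby("user")["date"].min()
summary.loc["after", "newcomers"] = int((first_seen >= ban_date).sum())
print(summary)
```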
Journal Article | Deplatforming did not decrease Parler users’ activity on fringe social media
Online platforms have banned (“deplatformed”) influencers, communities, and even entire websites to reduce content deemed harmful. Deplatformed users often migrate to alternative platforms, which raises concerns about the effectiveness of deplatforming. Here, we study the deplatforming of Parler, a fringe social media platform, between January 11 and February 25, 2021, in the aftermath of the US Capitol riot. Using two large panels that capture longitudinal user-level activity across mainstream and fringe social media content (N = 112,705, adjusted to be representative of US desktop and mobile users), we find that other fringe social media, such as Gab and Rumble, prospered after Parler’s deplatforming. Further, the overall activity on fringe social media increased while Parler was offline. Using a difference-in-differences analysis (N = 996), we then identify the causal effect of deplatforming on active Parler users, finding that deplatforming increased the probability of daily activity across other fringe social media in early 2021 by 10.9 percentage points (pp) (95% CI [5.9 pp, 15.9 pp]) on desktop devices, and by 15.9 pp (95% CI [10.2 pp, 21.7 pp]) on mobile devices, without decreasing activity on fringe social media in general (including Parler). Our results indicate that the isolated deplatforming of a major fringe platform was ineffective at reducing overall user activity on fringe social media.
2023 | Horta Ribeiro, M., Hosseinmardi, H., West, R. and Watts, D.J.
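The causal claim above rests on a two-group, two-period difference-in-differences design: the change in activity among treated users (active Parler users) minus the change among controls. A minimal sketch of how such an estimate is formed follows; the numbers are made up, and the paper's actual analysis uses large user-level panels rather than a single 2x2 table.

```python
# Sketch: a two-period difference-in-differences estimate on made-up cell means.
import pandas as pd

# Hypothetical group-period means of daily-activity probability on other
# fringe platforms: treated = active Parler users, post = after the ban.
panel = pd.DataFrame({
    "treated":  [1, 1, 0, 0],
    "post":     [0, 1, 0, 1],
    "p_active": [0.20, 0.34, 0.15, 0.18],  # share of user-days with activity
})

means = panel.set_index(["treated", "post"])["p_active"]
# DiD = (treated change) - (control change).
did = (means[1, 1] - means[1, 0]) - (means[0, 1] - means[0, 0])
print(f"DiD estimate: {did * 100:.1f} percentage points")  # 11.0 pp with these inputs
```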