Journal Article |
Distinct patterns of incidental exposure to and active selection of radicalizing information indicate varying levels of support for violent extremism
Exposure to radicalizing information has been associated with support for violent extremism. It is, however, unclear whether specific information use behavior, namely, a distinct pattern of incidental exposure (IE) to and active selection (AS) of radicalizing content, indicates stronger violent extremist attitudes and radical action intentions. Drawing on a representative general population sample (N = 1509) and applying latent class analysis, we addressed this gap in the literature. Results highlighted six types of information use behavior. The largest group of participants reported a near-zero probability of both IE to and AS of radicalizing material. Two groups of participants were characterized by high or moderate probabilities of incidental exposure as well as a low probability of active selection of radicalizing content. The remaining groups displayed either low, moderate, or high probabilities of both IE and AS. Importantly, we found between-group differences in violent extremist attitudes and radical behavioral intentions. Individuals reporting near-zero or high probabilities of both IE to and AS of radicalizing information expressed the weakest and strongest violent extremist attitudes and willingness to use violence, respectively. Groups defined by even moderate probabilities of AS endorsed violent extremism more strongly than those for which the probability of incidental exposure was moderate or high but AS of radicalizing content was unlikely.
2024 |
Schumann, S., Clemmow, C., Rottweiler, B. and Gill, P. |
Journal Article |
Fascist cross-pollination of Australian conspiracist Telegram channels
The COVID-19 pandemic has brought about trauma and uncertainty for vast swathes of the world population, including in Australia. One effect of this has been the growth of COVID-19 conspiracy theories, and general conspiracism. This article explores efforts by fascists and neo-Nazis to exploit the rise in conspiratorial thinking for recruitment and dissemination of their ideas. Five Australian conspiracist Telegram channels are studied for signs of fascist cross-pollination, and it is found that users with fascist sympathies attempt to influence the channels’ discourse through appeals to purported ideological and situational commonalities.
2021 |
Gill, G. |
Journal Article |
Online influence, offline violence: language use on YouTube surrounding the ‘Unite the Right’ rally
The media frequently describes the 2017 Charlottesville ‘Unite the Right’ rally as a turning point for the alt-right and white supremacist movements. Social movement theory suggests that the media attention and public discourse concerning the rally may have engendered changes in social identity performance and visibility of the alt-right, but this has yet to be empirically tested. The presence of the movement on YouTube is of particular interest, as this platform has been referred to as a breeding ground for the alt-right. The current study investigates whether there are differences in language use between 7142 alt-right and progressive YouTube channels, in addition to measuring possible changes as a result of the rally. To do so, we create structural topic models and measure bigram proportions in video transcripts, spanning approximately 2 months before and after the rally. We observe differences in topics between the two groups, with the ‘alternative influencers’, for example, discussing topics related to race and free speech to a larger extent than progressive channels. We also observe structural breakpoints in the use of bigrams at the time of the rally, suggesting there are changes in language use within the two groups as a result of the rally. While most changes relate to mentions of the rally itself, the alternative group also shows an increase in promotion of their YouTube channels. In light of social movement theory, we argue that language use on YouTube shows that the Charlottesville rally indeed triggered changes in social identity performance and visibility of the alt-right.
2020 |
van der Vegt, I., Mozes, M., Gill, P. and Kleinberg, B. |
Journal Article |
The temporal evolution of a far-right forum
The increased threat of right-wing extremist violence necessitates a better understanding of online extremism. Radical message boards, small-scale social media platforms, and other internet fringes have been reported to fuel hatred. The current paper examines data from the right-wing forum Stormfront between 2001 and 2015. We specifically aim to understand the development of user activity and the use of extremist language. Various time-series models depict posting frequency and the prevalence and intensity of extremist language. Individual user analyses examine whether some super users dominate the forum. The results suggest that structural break models capture the forum evolution better than stationary or linear change models. We observed an increase in forum engagement followed by a decrease towards the end of the time range. However, the proportion of extremist language on the forum increased in a step-wise manner until the early summer of 2011, followed by a decrease. This temporal development suggests that forum rhetoric did not necessarily become more extreme over time. Individual user analysis revealed that super forum users accounted for the vast majority of posts and of extremist language. These users differed from normal users in their evolution of forum engagement.
2020 |
Kleinberg, B., van der Vegt, I. and Gill, P. |
Journal Article |
Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy
Social media platforms have profoundly transformed cultural production, in part by restructuring the terms by which culture is distributed and paid for. In this article, we examine the YouTube Partner Program and the controversies around the “demonetization” of videos, to understand these arrangements and what happens when they shift beneath creators’ feet. We use the testimony of YouTubers, provided in their own videos, to understand how creators square the contradiction between YouTube’s increasingly cautious rules regarding “advertiser-friendly” content, its shifting financial and algorithmic incentive structure, and its stated values as an open platform of expression. We examine YouTube’s tiered governance strategy, in which different users are offered different sets of rules, different material resources, and different procedural protections when content is demonetized. And we examine how, especially when the details of that tiered governance are ambiguous or poorly conveyed, creators develop their own theories for why their content has been demonetized—which can provide some creators a tactical opportunity to advance politically motivated accusations of bias against the platform.
2020 |
Caplan, R. and Gillespie, T. |
Report |
Shedding Light On Terrorist And Extremist Content Removal
Social media and tech companies face the challenge of identifying and removing terrorist and extremist content from their platforms. This paper presents the findings of a series of interviews with Global Internet Forum to Counter Terrorism (GIFCT) partner companies and law enforcement Internet Referral Units (IRUs). It offers a unique view on current practices and challenges regarding content removal, focusing particularly on human-based and automated approaches and the integration of the two.
2019 |
van der Vegt, I., Gill, P., Macdonald, S. and Kleinberg, B. |