By Sydney Litterer, Ryan Scrivens, Thomas W. Wojciechowski, and Richard Frank
This article summarizes a recent study published in Behavioral Sciences of Terrorism and Political Aggression.
Research exploring online posting patterns and behaviors in spaces known for facilitating violent extremism, especially online spaces steeped in racially and ethnically motivated (REM) extremist ideologies, has grown substantially over the past decade. Academic and law enforcement interest alike has been fueled by ongoing reports of violent REM extremists and terrorists being active in forums and on social media platforms before their attacks. One of the most notable examples is that of the neo-Nazi terrorist Anders Breivik, who murdered eight people in a van bombing and 69 others in a mass shooting in Norway in 2011. Breivik was a user of Stormfront, the largest and most well-known white supremacist forum, and hours before his attacks he emailed a manifesto to two influential Stormfront members.
Existing research on posting behaviors within online REM extremist communities has overwhelmingly found that a small number of highly active users contribute disproportionately to community discourse. Yet evidence is mixed on whether such users, and users in general, typically increase or decrease their engagement with REM extremist online communities over time, particularly relative to other users. Although some work has explored temporal patterns in both language and posting behavior within these communities, it remains unclear how the two are related. This matters because examining users' behavior relative to others posting at the same time allows researchers, practitioners, and policymakers to gauge whether users' behavior changes as they spend more time interacting with a forum, independently of the forum-wide behavioral trends during the period in which they post. Further, understanding how language is associated with posting behaviors could help identify pathways toward radicalization and the acceptance of extremist views, or reveal whether negative sentiment and hostility intensify the longer users engage with an online community.
Accordingly, this study uses a group-based multi-trajectory analysis of data from the influential white supremacist forum Stormfront to identify interrelated temporal patterns in posting frequency, use of vulgarity, and use of threats. The study examined how such online behaviors evolve in comparison with typical user posting behaviors in an online platform (i.e., relative measures). Several conclusions can be drawn from this study.
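The study itself fits group-based multi-trajectory models, which are typically estimated with specialized statistical software. As a rough illustration of the underlying idea, the sketch below clusters simulated 12-month posting-count trajectories into a small high-activity group and a large low-activity group using a basic k-means over whole trajectories. All numbers, function names, and the two-group split here are illustrative assumptions, not values or methods from the study.

```python
import random

random.seed(42)

MONTHS = 12

# Simulated monthly post counts for 100 users: ~10% sustained high-frequency
# posters and ~90% infrequent posters. These data are invented for
# illustration and do not come from the Stormfront sample.
def simulate_users(n=100, super_share=0.1):
    users = []
    for i in range(n):
        if i < n * super_share:
            traj = [random.randint(40, 80) for _ in range(MONTHS)]
        else:
            traj = [random.randint(0, 5) for _ in range(MONTHS)]
        users.append(traj)
    return users

def dist(a, b):
    # Squared Euclidean distance between two 12-month trajectories.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean_traj(trajs):
    # Month-by-month average trajectory of a group.
    return [sum(vals) / len(trajs) for vals in zip(*trajs)]

# A minimal two-group k-means over whole trajectories, standing in for the
# group-based multi-trajectory models used in the study (which also model
# vulgarity and threat trajectories jointly with posting frequency).
def cluster_trajectories(users, iters=20):
    centers = [users[0], users[-1]]  # crude init: one seed from each extreme
    for _ in range(iters):
        groups = [[], []]
        for u in users:
            j = 0 if dist(u, centers[0]) <= dist(u, centers[1]) else 1
            groups[j].append(u)
        centers = [mean_traj(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return groups, centers

users = simulate_users()
groups, centers = cluster_trajectories(users)
sizes = sorted(len(g) for g in groups)
print("group sizes:", sizes)  # one small high-activity group, one large low-activity group
```

With the widely separated simulated groups, the clustering recovers the small high-frequency group and the large infrequent group, mirroring the qualitative two-group structure the study reports.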
First, two distinct trajectory groups were identified on Stormfront: a small proportion of forum users who posted with high frequency over time and a much larger proportion who posted infrequently. This finding supports prior research noting that high-frequency posters, or ‘super-posters,’ make up a very small proportion of all posters. Research also suggests that high-frequency posters, whether in REM online communities or in online communities generally, tend to dominate much of the discussion. Further study is needed to explore the extent to which high-frequency posters influence the attitudes, beliefs, motivations, and/or behaviors of others within the online subculture. Social network analyses coupled with content analysis of users’ posts would illustrate the extent to which posting frequency and content are associated with potential influence. It could also be informative to explore whether factors such as the date users joined Stormfront or their length of interaction with the forum predict which posting frequency and language use trajectories they follow.
Second, the posting frequency and likelihood of using offensive language (i.e., vulgarity and threats) for both posting trajectory groups did not change over time compared to average posting behavior for users in the sample. Although this study only considered the first twelve months after users joined the forum, this time span captured the entire lifespan of forum interaction for 75% of users. The relative stability in both posting frequency and language use suggests that, as users continued to engage with Stormfront during their first year on the forum, they did not shift their behavior to better align with the norm for behavior within the sample (see Figure 1).

However, because average posting frequency and use of offensive language within the sample may not reflect norms within the entire forum, more work is needed to understand whether users are learning subcultural norms for posting behavior through their interaction with the forum. Indeed, gaining a better understanding of how and to what extent learning is occurring within online communities would strengthen our understanding of the role of online communities in facilitating radicalization to extremist violence. It is important to note that, because we used relative measures, these results do not conflict with those of studies showing changes in absolute measures of posting frequency and language use over time.
Third, over users’ first twelve months on the forum, the offensive language trajectories (i.e., vulgarity and threats) for the small but highly active posting group were noticeably similar to those of the average user in the sample. On the other hand, the offensive language trajectories for the much larger group of infrequent posters were lower than those of the average user in the sample. Such differences between the highly active and less active groups were more pronounced for use of vulgarity than threats. Because threats of violence could be more likely to draw law enforcement attention to the forum, it is possible that the use of threatening language was more tightly controlled in the forum through either formal means, such as moderation, or informal means, such as censure by other forum users. However, more work is needed to understand the role moderation and self-censorship play in the type of language used in REM extremist online forums. It is likely that the similarity of the highly active group’s offensive language trajectories to average language use was in part the result of the high-frequency group dominating much of the discourse in the sample. As previously noted, more work exploring the influence of highly active users on the behavior of other users would be useful in determining whether these users’ extensive engagement with the forum is allowing them to set norms for the use of offensive language in the forum.
While this study offers a first step in assessing how users’ online posting behaviors evolve on Stormfront relative to web-forum norms, several limitations may inform future research. First, although trajectory modeling provides a powerful method for identifying general patterns of development within otherwise intractable longitudinal data, these patterns should be interpreted with caution because trajectory groups do not represent concrete entities that users follow across time in lockstep. Rather, they are broad approximations of developmental patterns and should not be reified as anything more concrete. Second, the number of users sampled was small relative to the total number of users in the forum, so the analysis may not capture the full range of variation in language and posting behavior; additional posting patterns may exist in the forum that this analysis did not identify. Third, it is likely that certain forum characteristics (e.g., the topics of conversation, the users and groups who post in the space, and so on) account for some of the results of the current study, as noted in other work exploring online posting behavior and language use by extremists. Researchers should examine posting behaviors within and across sub-forums of the broader forum as well as across various platform types, such as a comparison of violent REM group forums with generic non-violent forums, mainstream social media sites, fringe platforms, and digital applications, and over various time periods during which extremist discourse may have changed. Such comparisons would provide practitioners and policymakers with much-needed insight into whether specific online behaviors span online spaces that facilitate extremism more generally or whether certain platforms have unique functions for facilitating extremism and associated posting compositions.
Lastly, future work should assess and integrate various theoretical frameworks that may be useful in examining the online posting behaviors of users in online spaces that facilitate violent extremism, as doing so would provide key stakeholders with much needed insight into the drivers of online posting behaviors.
Sydney Litterer is a Doctoral Candidate (ABD) in the School of Criminal Justice at Michigan State University (MSU).
Ryan Scrivens is an Assistant Professor in the School of Criminal Justice at MSU. He is also an Associate Director at the International CyberCrime Research Centre (ICCRC) and a Research Fellow at the VOX-Pol Network of Excellence.
Thomas W. Wojciechowski is an Assistant Professor in the School of Criminal Justice at MSU.
Richard Frank is a Professor in the School of Criminology at Simon Fraser University and the Director of the ICCRC.