Between Suspicion and Selection: How Virtual Extremist Communities Filter Newcomers

By Christopher V. David and Marten Risius

Extremist groups operate in an environment where trust is existential and suspicion is constant—every new member could be an ally or an informant. The Provisional IRA, for example, famously operated a dedicated squad tasked with tracking down and liquidating informers within its ranks. In a spectacular twist, the leader of this squad later turned out to be a mole himself: the “golden egg” of British intelligence, who had devastated the organization’s operational security for years. Such examples illustrate the severe consequences of ineffective filtering.

To prevent such infiltration, extremist groups rely on strong vetting and filtering mechanisms to select loyal members. In offline contexts, a central strategy is what Karsten Hundeide (2003) termed “deep commitment”: requiring recruits to perform acts that prove their loyalty. In his ethnographic study of extremist youth groups, Hundeide observed newcomers being required to commit crimes that prevented them from disengaging easily at a later point. In a particularly gruesome example, the so-called Islamic State vetted its child soldiers by pushing them to commit violence against their own families.

In an online environment where anonymity reigns, such physical tests of loyalty are difficult. Consequently, many virtual spaces are vulnerable to infiltration, and openly accessible extremist web forums are regularly monitored by outsiders. For instance, antagonistic communities on platforms such as Reddit infiltrate incel forums to harvest content and ridicule their members. Faced with this constant threat, extremist communities are forced to devise protective mechanisms adapted to the digital world. In our recently published comparative case study based on ethnographic material, we examine how virtual extremist communities filter members. We find striking differences between two extremist communities: a misogynistic incel forum and a white supremacist forum.

Hostility as a deliberate selection mechanism for incels

The incel community filters members through a hostile process we call “trial by fire”. On the forum, newcomers post directly to the general feed, where they are put through what one established member called a “wringer”. Newcomers are assigned a visibly low-status label and met with near-constant hostility. Their novice status is indicated by a greyed-out username, which remains in place until they have made 500 posts.

This hostility is not random trolling but a deliberate selection and onboarding mechanism. We observed multiple established members reflecting on the strategic purpose of their hostility. One said: “I think if [the forum] didn’t put people through the wringer (…) we would have even more infiltration. (…) I think it is a defence mechanism because outsiders/other groups/redditors etc. all hate us (…).” Another user agreed, stating that he says “things to weasel them {infiltrators} out”. The degree of hostility is moderated by specific criteria:

  • Formal Rule Compliance: While some rules are generic (such as “do not spam”), others are specific to incel subculture (e.g., “do not brag about receiving female attention”). In any case, rules are strictly enforced. For example, a newcomer who violated the ban on LGBTQ+ content in his first post was targeted for harassment and reported to moderators. 
  • Implicit Assumptions: Newcomers are expected to pick up on unwritten community norms. For example, while hateful tirades against women are a standard posting practice, targeting law enforcement is met with suspicion. A recruit who directly attacked police was met with an emoji symbolizing a federal agent (a “glowie”), marking him as a suspected infiltrator. 
  • Hierarchy: Newcomers must “know their place”. While the forum is a “refuge for the wicked” for some established members, newcomers are denied any empathy. One recruit shared his fear of being expelled from university for making threatening remarks about women, and established members responded with severe verbal abuse and calls for self-harm. 

  • Ideological Engagement: We observed that hostility decreased if a newcomer demonstrated deep knowledge of, or interest in, the incels’ fatalistic ideology. For example, one newcomer’s post compiling “scientific proof” for the ideology’s underlying assumptions was moved to the must-read section of the forum.

Mission-driven selection in the white supremacist forum

On the white supremacist forum, the filtering process is vastly different and far more structured. We refer to it as “mission-driven” due to its educational character. Newcomers are expected to introduce themselves in a dedicated subforum and detail their motivation for joining the “movement”. Self-study is also expected: established members urge newcomers to “read, read, read” dedicated threads explaining central tenets of white supremacist ideology. 

While a general “siege mentality” pervades the forum, with some members suspecting that a significant portion of users may be infiltrators or ideological opponents, recruits who follow this protocol are generally welcomed warmly. In these introductory threads, established members judge recruits primarily on two key criteria: 

Ideological Alignment: A recruit’s stated motivation for joining is crucial. A newcomer from Australia who detailed how “mass migration” was destroying his country was welcomed for reflecting key community beliefs. Another user gained instant approval by using coded language (embedding a capitalized “SS” in a sentence), signalling allegiance to National Socialist thought. 

Racial Suitability: This is the primary filter criterion. The forum limits membership to “100-percent white people of European descent”, and there is constant suspicion of infiltration attempts by “inferior races”. When a self-declared ethnic Albanian introduced himself, he was immediately asked whether he was “racially white”. He attempted to prove his belonging by sharing a picture of himself, but his whiteness was deemed insufficient and he was barred from participation. 

Anonymity and the lack of deep commitment

We believe that our results show two vastly different ways in which virtual extremist communities attempt to replace the “deep commitment” mechanisms of offline groups. The anonymous online environment leads both incels and white supremacists to be constantly wary of infiltration. However, the communities’ distinct goals influence their filtering strategies: 

  • The incel forum applies a “trial by fire” approach: it drives away potential allies and infiltrators alike through fierce hostility. This is primarily because established members prioritize creating a “refuge for the wicked” over pursuing clear political goals; they can therefore afford to filter out some genuine incels. Similar use of hostility as a filtering tactic has been reported by Lee and Knott for a fascist forum, suggesting that hostility as a selection mechanism may be a broader pattern. 
  • The white supremacist forum is mission-driven: The community hopes to awaken “racial consciousness” in as many white people as possible. Therefore, once a recruit passes initial racial and ideological screening, hostility is seen as counterproductive. 

Our findings highlight how extremist communities adapt their vetting and filtering processes through different strategies shaped by their broader community goals. For future research, it would be particularly promising to compare such filtering processes in virtual communities with more strongly contrasting ideologies (since incels and white supremacists do share certain overlaps), or in closed virtual spaces, in which moderators are primarily responsible for granting access to the group. 


Christopher V. David is a researcher and PhD candidate at the Neu-Ulm University of Applied Sciences, Germany. His work examines online extremism through a socio-technical lens, with a particular focus on how extremist communities communicate, learn, and adapt across digital environments. He holds degrees in Psychology (M.Sc.) and Security Studies (M.A.). https://bsky.app/profile/dsrc.bsky.social

Marten Risius is Professor for Digital Society and Online Engagement at Neu-Ulm University of Applied Sciences, Germany and Adjunct Senior Fellow at the School of Psychology at the University of Queensland, Brisbane, Australia. His position receives generous funding support from the Bavarian State Ministry of Science and the Arts through the Distinguished Professorship Program as part of the Bavarian High-Tech Agenda. https://bsky.app/profile/risius.bsky.social