Researching the Dark Playground: Young People’s Exposure to Extremist Content Online

By Tim Legrand, Nathan Manning, and Melissa-Ellen Dowling

Young people worldwide are increasingly exposed to violent ideas and ideologies in web-based communities. According to national security agencies, children as young as 12 are adopting extremist beliefs, folding issues of child safeguarding into national security concerns. While we know that exposure to extremist ideologies in online settings does not translate straightforwardly into political violence in the ‘real world’, in each of the Five Eyes states – Australia, the UK, Canada, New Zealand, and the US – the involvement of children in violent extremism is an increasing focus for policing and security agencies. Indeed, the Australian Security Intelligence Organisation has reported that in 2024 ‘almost all’ of its investigations into ‘potential terrorism’ concerned minors, and that ‘they allegedly moved towards violence more quickly than we have seen before’. The UK’s domestic security agency, MI5, similarly reports that 13% of people investigated for terrorism are under 18, ‘a threefold increase in the last three years’. It is no surprise, then, that in October 2024 the Five Eyes law enforcement and security agencies released a joint statement urging collective action to ‘identify and counter radicalisation of minors to violent extremism’, stressing their shared view that minors can pose just as much of a terrorist threat as adults.

An additional concern for Five Eyes governments is the apparent emergence of extreme ideologies that are chimeric, heterogeneous and inchoate, yet are nonetheless finding a foothold in online spaces. These ideologies, in which misogyny and racism tend to dominate, blend a constellation of religious, nationalist, political and traditional belief systems. Such inchoate ideologies are on display in the violent ‘manifestos’ of individuals who have committed acts of mass violence. Because these ideologies lack the consistency and cogency of conventional political ideologies, how they motivate individuals to violence is not yet fully understood. In the UK, this challenge has prompted a parliamentary inquiry into ‘new forms of extremism’.

Beyond political violence – the primary concern for national security agencies – there is also a range of impacts that ought to worry us about young people’s exposure to extremist content online. The social and psychological consequences for children and communities make this a broader societal problem: the individual psychological harm caused by young people’s exposure to, and engagement with, extremist content; the degradation of social cohesion within communities; the possible securitisation of young people; the risks to teachers and pupils in schools; and the injurious impacts on families – parents, siblings, and caregivers. The task of protecting young people from extremist digital content is driven by these multiple imperatives.

Despite the recognised severity of the challenge of youth exposure to online extremism in Australia, we still have little systematic knowledge about:

  • What extremist content young people are encountering online;
  • The demography of the young people encountering it;
  • How much of it they are seeing, and how frequently;
  • How they encounter it (are they “active seekers” of extremist content?);
  • Where they encounter it;
  • How they engage with it; and
  • How they respond to and feel about such content.

There are no quick or easy answers to be found here. From the perspective of academic researchers, it has never been harder to uncover the patterns and characteristics of how people spend their time online, and especially their exposure to harms. Social media platforms are now ‘dark’ digital spaces, where young people can spend their leisure time – playing games, socialising, making and maintaining friendships – in anonymity, under layers of encryption that obscure users and their engagement with online communities. Notwithstanding the manifold benefits of privacy protections, we must confront – and hopefully overcome – the reality that observing specified cohorts’ digital lives and their exposure to harms in these ‘dark playgrounds’ is almost impossible, certainly to the usual standards of robustness expected of scholarly research.

To gain insight into these questions in the Australian context, we are investigating digital pathways to violent extremism among young Australians (aged 12–17). Our research, funded by Australia’s Office of National Intelligence, aims to uncover the extent to which young Australians are exposed to violent extremist ideologies online, and to identify key pathways to online radicalisation for this cohort. In doing so, we aspire to isolate intervention points to curtail the reach and influence of online ideological extremist groups and content. We seek to understand what extremist content resonates with young people and why, so that we know more about how to mitigate this growing problem. In addition, we hope to learn from the many young people who encounter such content but ultimately reject it, and who use the internet and social media in more prosocial ways.

To find out what extremist content is available to youth online, we are mapping ‘gateway extremist’ content on the social media platforms most used by Australian minors – at the time of writing, Instagram, TikTok, and Discord. Given the moderation practices of these platforms, we do not expect to find violent extremist content; instead, we are building a dataset of ‘gateway’ hate speech and extremist ideas that may segue into violent extremist ideologies.

Our next step is to uncover the pathways that lead youth to extremist material online by reverse-engineering processes of exposure to violent or risky online content. We are doing this by collecting and assessing data on the internet browsing patterns of 12–17-year-olds in Australia, paying particular attention to the links between gateway content on mainstream platforms and violent extremist material on less-moderated websites and platforms. Next, we are evaluating the extent to which extremist ideologies resonate with youth, through a nation-wide survey of young Australians examining levels of sympathy or support for extremist values, attitudes, behaviours, and beliefs.

We are hopeful that our research will generate knowledge on youth exposure to online violent ideologies that can help to build early intervention strategies and, ultimately, contribute toward curtailing the escalating problem of youth extremism. We look forward to reporting back to the blog with our findings.

Tim Legrand is Professor of International Security at the University of Adelaide and has been awarded an Australian Research Council Future Fellowship (2025–29). His research concerns global blacklisting and sanctions, digital security, political violence and political exclusion. His books include Bad Public Policy: Malignity, Volatility, and the Inherent Vices of Policy-Making (2025, with Michael Howlett & Ching Leong, Cambridge University Press); The Architecture of Policy Transfer (2021, Palgrave Macmillan); and Banning Them, Securing Us? The Politics of Proscription (2020, with Lee Jarvis, Manchester University Press). He is currently co-editor of the Australian Journal of International Affairs. Profile: https://researchers.adelaide.edu.au/profile/tim.legrand

Nathan Manning is Senior Lecturer and Head of the Department of Sociology, Criminology and Gender Studies at the University of Adelaide. As a political sociologist he focuses on young citizens and the role of emotions in politics and citizenship. His current projects focus on young people’s exposure to extremist content online and the different ways in which university education may undermine support for far-right politics. His work has been funded by the ONI, the ESRC and the British Academy. He is founding Co-Editor of the journal Emotions and Society. Profile: https://researchers.adelaide.edu.au/profile/nathan.manning

Dr Melissa-Ellen Dowling is Senior Lecturer in Politics and International Relations at the Jeff Bleich Centre for Democracy and Disruptive Technologies, Flinders University. Her research concerns political extremism and threats to democracy, with an emphasis on exploring the role of online communications in the proliferation of illiberal ideologies. Melissa’s most recent publications appear in journals such as Information, Communication & Society, New Media & Society, and Policy and Society. She is the author of the book Writing Russia (2020) and editor of the book Digital (Dis)Information Operations (2025). Profile: https://www.flinders.edu.au/people/melissaellen.dowling

