The Potential of Short Form Videos as P/CVE Messages

By Joe Whittaker, Farangiz Atamuradova, Kamil Yilmaz, Simon Copeland, Lilah El Sayed, Jon Deedman

Short form video has, put simply, become one of the most popular social media formats on the Internet. By “short form” we mean videos of around 30-90 seconds; each platform that utilises the format has its own specifications for minimum and maximum length, but the underlying spirit is to keep videos “short and sweet.” The biggest success story of this format is TikTok, which, since its launch in 2016, rose meteorically to a billion monthly users by 2021. One marker of its success is that other platforms quickly imitated it with their own short form capabilities, such as YouTube “Shorts” and “Reels” on Instagram and Facebook. The format has been particularly popular amongst younger users: Pew Research recently found that TikTok, YouTube, Snapchat, and Instagram were the most widely used platforms amongst teenagers in the US. In this blog, we outline some of the ways in which this format is currently being exploited by extremists, before turning to two reports we have recently published which explore how the format can be utilised by content creators seeking to counter extremism.

Given the widespread popularity of this format, it follows that bad actors have sought to exploit short form video platforms; extremists are often noted for their early adoption of new technologies. A report for the Institute for Strategic Dialogue by Ciarán O’Connor found that TikTok promoted white supremacist content and that it was possible to find crisis footage (such as the livestream of the 2019 Christchurch terror attack). Looking at UK-based far-right content on the same platform, Ozduzen and colleagues found that videos contained narratives which bolstered ideas of nativism and white supremacy amongst users. In particular, TikTok’s recommendation algorithm has been singled out as ripe for exploitation. Weimann & Masri highlight that it pushes young people to unintentionally view antisemitic content and, once they have done so, begins to show them more, a problem exacerbated by the platform’s lax moderation policies. Similarly, empirical investigations by Shin and Shin and by Grandinetti & Bruinsma both highlight the recommendation system promoting extreme content to users. A frequent concern about the exploitation of short form videos – particularly on TikTok – is the young user base, who, it is argued, may be more vulnerable to exploitation by extremists.

Given that extremists are exploiting this format, it follows that there should be some kind of proactive Preventing and Countering Violent Extremism (P/CVE) response, particularly given the practical limitations and human rights implications of relying solely on content moderation. Moreover, there are already actors in this space seeking to create prosocial messages, such as the “JewToks”, who respond to antisemitic narratives in two ways: firstly, by creating positive narratives to educate their audience about Jewish history and culture; and secondly, by playfully mocking racist stereotypes. Similarly, Lee and Lee analyse a corpus of videos with the hashtag #StopAsianHate to understand the ways in which female content creators share their stories and form solidarity when challenging anti-Asian racism.

The logic of our reports is as follows: i) It is important not to rely solely on content removal as a means of countering extremism; ii) there are already content creators working in this space; and iii) it is possible that many content creators do not have a strong understanding of the lessons learned within the P/CVE space over the past two decades. Of course, we understand that mass media persuasion is fraught with difficulties and many mistakes have been made in the past. However, we believe that there is enough worthwhile knowledge to pass on to content creators.

The first report, A Guide for Creators Making Content to Counter Extremism, is aimed specifically at content creators who are making – or thinking about making – prosocial P/CVE content. It is intended to be accessible, written in non-technical language that a young person (potentially still in secondary education) can read and understand. It contains twelve simple messages, including: the importance of staying local; the use of storytelling; and active engagement with audiences. The report is deliberately non-prescriptive; we recognise that content creators know their audiences better than we do and should therefore allow their authenticity to shine through. This is particularly pertinent given the importance of a message having a credible messenger, rather than a mass-produced feel. The report is therefore a set of guiding principles presenting several considerations that creators may find useful to keep in mind when crafting their message.

The second report, Strategic Communications for Countering Extremism in the Digital Age, provides the underlying strategic logic for the first. In it, we provide the scholarly background for our guidelines, drawing on research from persuasive communications; the field of P/CVE; and specific considerations for the short form video format. It contains a synthesis of existing guidelines for the creation of P/CVE strategic communications, as well as considerations for how stakeholders may monitor, measure, and evaluate campaigns. This is particularly important given that almost every social media platform provides some kind of incentive programme for content creators who are seeking to disseminate prosocial messages – such as TikTok’s “Creators Forward” programme or YouTube’s “Creators for Change.” As such, it is vital that platforms think about robust ways to evaluate the efficacy of such programmes.

As social media platforms continue to grow and innovate, so will extremists’ exploitation of them, and therefore so will the need to find ways to counter them. Our hope is that our contributions to the discussions around short form video can provide guidance to future generations of P/CVE content creators and facilitate dialogue between them, researchers, and tech companies about the most productive ways to create prosocial content on their platforms.

Joe Whittaker is a Senior Lecturer in Criminology, Sociology, and Social Policy at Swansea University

Farangiz Atamuradova is a Program Officer at Hedayah

Kamil Yilmaz is a Lecturer in Criminology, Sociology, and Social Policy at Swansea University

Simon Copeland is a Research Fellow at the Royal United Services Institute (RUSI)

Lilah El Sayed is a Program Associate at Hedayah

Jon Deedman is a postgraduate student at Richmond, the American International University in London


Image Credit: PEXELS