The UK’s Online Safety Act and ‘Terrorist Content’

By Katy Vaughan

Ofcom, the UK’s independent regulator of online safety, is taking a phased approach to the implementation of the Online Safety Act, which passed into law in October 2023. In December 2024 Ofcom published its statement on the first phase of implementation, which focuses on ‘illegal harms’, including the ‘Illegal Content Codes of Practice’ (issued in February 2025). This post provides a brief introduction to the main duties placed on online service providers (OSPs) relating to ‘illegal content’ under the Act, before focusing specifically on the meaning of terrorist content.

Risk Assessment Duties and Safety Duties

Under the Act, duties apply to both user-to-user services and search services, and whilst these are addressed separately, the duties are broadly similar. There are two main sets of duties: (1) risk assessment duties, and (2) safety duties. OSPs were required to complete their first illegal content risk assessments by 16 March 2025, and from 17 March 2025 they must comply with the safety duties designed to mitigate exposure to illegal content.

The safety duties require OSPs to take proportionate steps to: prevent users from encountering illegal content (user-to-user services) or minimise the risk of users encountering illegal content (search services); mitigate the risk of the service being used for the commission or facilitation of a priority offence; and, mitigate the risk of harm to users which results from the presence of illegal content. User-to-user services must also use proportionate systems to:

  • Minimise the length of time priority illegal content is present on services, and
  • Swiftly take down such content once the OSP is alerted to its presence on the service.

Illegal Content

The duties apply to what the Act defines as ‘illegal content’, that is, content that amounts to a relevant offence, comprising both ‘priority’ and ‘non-priority’ offences. Terrorist content is a category of ‘priority illegal content’ and is therefore subject to the illegal content duties. Given that the powers conferred by the legislation involve the removal of online content and the potential use of proactive measures to address such content, with implications for individuals’ freedom of expression, it is important to examine how terrorist content is defined.

What is Terrorist Content?

Section 59 of the Online Safety Act defines ‘terrorism content’ by reference to existing terrorism offences in UK law that criminalise preparatory, facilitative and supportive conduct. These offences, set out in Schedule 5 of the Act, include, for example: inviting support for a proscribed organisation, encouragement of terrorism, training for terrorism, and terrorist fundraising. This approach to defining terrorist content presents a number of difficulties.

First, these offences cover a broad range of conduct and individuals potentially far removed from actual acts of terrorism. This must be considered alongside the UK’s statutory definition of terrorism, which has been widely criticised as overly broad and which contains no express exemptions for advocacy, protest, or dissent. Broad definitions vest significant discretion in decision makers such as regulatory bodies, human moderators, and automated tools. This can lead to an overly cautious or inappropriate application of the legislation, with unintended consequences for the protection of human rights.

Second, terrorism offences were not designed for use in regulating online content. Counterterrorism legislation is primarily directed at conduct, not content, and content in itself cannot ‘amount to’ an offence. As the Independent Reviewer of Terrorism Legislation stated during the drafting stages of the Act: ‘the commission of offences requires conduct by a person or people.’ Criminal offences generally require consideration of an individual’s state of mind (mental elements), such as their intention, and may be subject to a defence. In an attempt to recognise this, section 192 of the Act provides that, in determining whether content is illegal content, OSPs should have ‘reasonable grounds to infer’ that all the elements necessary for the commission of the offence are satisfied (including mental elements), and that there are no reasonable grounds to infer that a relevant defence applies. Ofcom has also provided guidance for OSPs specific to each category of illegal content, including terrorism. Even with this additional guidance, the result is that OSPs are required to make judgements about conduct, state of mind, and defences, which can be complex and difficult to operationalise, particularly at scale and with the use of automated tools.

Finally, establishing that the elements of the offences are satisfied often requires additional context, such as ‘extrinsic factual material’. In R v Amjad the defendant was convicted of the section 58 offence of collecting or possessing information likely to be useful to a terrorist. The information, or content, was a list of fitness exercises. The context determinative in that case was that the fitness regime was headed ‘Mujahid minimum training’ and that the document had been derived from other sources available online associated with a terrorist cause. The importance of the context in which terrorist content appears online is recognised by Ofcom and emphasised in the Illegal Content Judgment Guidance. The guidance makes clear that it is not an offence to portray or report on terrorism, or to make jokes about terrorism, ‘even where these are offensive or in poor taste’. It suggests that OSPs consider contextual factors such as: the original author of the content, the surrounding circumstances in which the content is posted (for example, in the aftermath of a particular event), and the apparent purpose of the person making the statement. All of these are important in the interests of accurately identifying terrorist content online. Yet the need for context is a significant challenge in defining terrorist content, and algorithmic tools in particular are not well suited to making such judgements at scale; they also lack cultural sensitivity.

Concluding Thoughts and Next Steps

The Online Safety Act’s approach to defining terrorist content is problematic for a number of reasons, some of which have been outlined here. Ofcom has sought to assist OSPs in interpreting and applying this aspect of the legislation through the Illegal Content Judgment Guidance. Whilst the emphasis in the guidance on the purpose, meaning, and context of content is to be welcomed, it remains to be seen how straightforward this will be to operationalise online, and at scale, for both human reviewers and automated tools.

Dr Katy Vaughan is a Senior Lecturer in Law at Swansea University, a member of the Cyber Threats Research Centre (CYTREC) and VOX-Pol, and currently one of the Co-Chairs of the Christchurch Call Advisory Network.

