This article summarises one of the recent outputs of a sub-group of GIFCT’s Legal Frameworks Working Group 2022.
By Katy Vaughan
Most tech companies now have policies aimed at countering terrorist and violent extremist content (TVEC) on their platforms and services. It is also a condition of GIFCT membership that companies must have policies that “explicitly prohibit terrorist and/or violent extremist activity.” Tech companies are therefore expected to explain to users what content constitutes TVEC and will consequently be subject to content moderation.
Yet, where policies exist, many tech platforms, companies, and other actors lack a coherent and consistent approach to defining TVEC that aligns with relevant sources of law and human rights standards. This may impede the proper application of content moderation policies to suspected TVEC on companies’ platforms and services. Among the founding members of GIFCT, only one platform (Meta) has a definition of terrorism, with the others relying on designation lists and definitions of violent extremism.
Reliance on international and domestic terrorist designation lists to define TVEC presents the danger that such policies will reflect “broader discrimination and bias in the counterterrorism field” – specifically by focusing disproportionately on self-declared Islamist terrorist organisations rather than right-wing extremist groups. This in turn can have a disproportionate effect on Muslim and Arab communities. There is also the difficulty of identifying right-wing terrorist groups or organisations, as they can be “organizationally and ideologically fragmented”, which calls into question the effectiveness of designation.
Utilising designated terrorist organisations as the basis for identifying content may provide certainty for companies; however, there are inherent difficulties with the processes used to designate groups as terrorist groups. The lack of due process and transparency is well documented, particularly at the international level, including in the listing procedures carried out by the United Nations. Consequently, the GIFCT Legal Frameworks Working Group 2022 recommends that tech companies should define terrorism and not rely solely on list-based approaches.
One of the objectives of the Legal Frameworks Working Group was to identify the level of interoperability that currently exists between a broad sample of definitions of terrorism and violent extremism (TVE). Definitions were collected from international and regional intergovernmental instruments, domestic statutory definitions, the publicly available policies of tech companies, and the model definition of terrorism put forward by the Special Rapporteur on counterterrorism and human rights.
The final output shows the possible implications for companies of the current level of incoherence in defining TVEC. Where incoherence significantly impacts human rights, the output proposes that coherent norms be developed. Alongside this, GIFCT has recently launched its definitional tool to assist companies in understanding and applying definitions of TVE.
Advantages of interoperability
Whilst the idea of GIFCT itself creating a shared definition of terrorism is understandably met with scepticism, the recent GIFCT Human Rights Impact Assessment recognised the value of creating a common understanding of TVEC. It is argued here that it is in the interests of GIFCT member companies to promote a move towards greater interoperability between approaches to defining TVEC.
The advantages in greater interoperability include:
- Pushing back against existing overly broad definitions of TVE.
- Acting as a safety net against government definitions that present risks to the promotion and protection of human rights.
- Helping companies in the face of increased state regulation.
- Increasing community trust by being clear and precise with users of platforms and services about what content crosses the line, showing that tech companies’ policies reflect legal consensus.
Definition of violent extremism
This study found that at the international level there has been little attempt to define violent extremism in legal instruments. As a result, vague and broad definitions have emerged at the national level. Where available, this study considered some national definitions and platform definitions of violent extremism. Coherence was evident across these definitions in that violent extremism involves acts of violence. Beyond this, however, no further detail or clarification is provided as to the level or potential range of harms. This has the potential to be too broad, as it does not limit the threshold to serious violence. Moreover, definitions differed as to whether they included a motive requirement (for example, political, religious, or ideological purposes), and whether they required a target beyond the immediate victims (such as the general population).
This raises important questions about companies developing policies aimed at countering violent extremist content in addition to terrorist content. Further research would be welcome as to whether content currently removed as violent extremist content would also be removed as terrorist content, or under companies’ existing policies on hateful or violent content.
The study also found that some sources of law for violent extremism exist but are not referred to as violent extremism legislation. However, violent extremism seems to have emerged as an important category for platforms in response to the difficulties and complexities of navigating terrorism law. Consequently, mapping the sources of law for violent extremism (beyond terrorism) would be an important further phase of analysis to ensure that the categories and definitions used by tech companies align with the law. The development of policies that have the potential to negatively impact people’s lives, based on a term with little legislative basis, can be even more dangerous for human rights than reliance on the term terrorism.
Definition of terrorism
Examining the definitions of terrorism revealed both clear coherence and areas of divergence as to the core components, or standard features, of a definition of terrorism. An act of terrorism involves an act of violence, carried out intentionally, with the purpose of impacting a specified target that includes members of the general population.
However, the study identified many inconsistencies within these core requirements. Layers of incoherence are evident in the level and range of harms resulting from the act of violence, the level of intent required, and the range of targets and the impact of the act of violence on those targets. There is also divergence across the definitions as to the existence of a motive requirement and which motivations it should include; the existence of express exemptions, such as for protest, advocacy, industrial action, and dissent; and the existence of an armed conflict/international humanitarian law (IHL) exemption.
This is problematic, as actions and content can be classified as terrorist in one jurisdiction or on one platform but not another. Previous research has identified the example where “a group carried out attacks against infrastructure without the intent to harm civilians (by releasing warnings)”. This would fall within some of the definitions examined in this study but not all. Some companies could judge this as not meeting their criteria for TVEC, while others may opt for removal. Increased pressure to remove TVEC may result in some companies erring on the side of caution in these circumstances, which can lead to over-censorship. Increasing coherence between definitions would be beneficial, but in seeking a common understanding it is important to avoid arriving at a lowest-common-denominator definition that is not compliant with human rights standards.
With this in mind, the output recommends minimum standards for the core requirements of a definition of terrorism. Acts of violence should constitute pre-existing criminal offences, either enacted for the purpose of compliance with an existing treaty against terrorism or identified as a serious crime in national law. The range and level of harm caused by the act should be restricted to acts that cause death or endanger life, cause serious bodily injury, or involve hostage taking or kidnapping. The purpose of the act should be to impact a target comprising a wider audience beyond the immediate victims, including the general population, the government, or an international organisation. Proof of intention is necessary. Definitions should take a cumulative approach to intention, requiring both a general intention as to the primary act (of violence) and a specific intention to accomplish the purpose of impacting the target. The specific intention should be qualified as an intention to intimidate, coerce, or compel.
Definitions of terrorism present the danger of applying to conduct that does not constitute a terrorist act. Consequently, definitions should include express exemptions, such as for protest, industrial action, advocacy, and dissent, and should exclude activities carried out during armed conflict as determined under international humanitarian law. To avoid the application of TVEC moderation policies to individuals living under oppressive regimes, it would also be beneficial to limit definitions to violence against non-state actors. In addition, it is proposed that definitions should not include a religious motive, as its inclusion can directly fuel the misconception that Islamic religiosity causes terrorism or that the Qur’an generally radicalises Muslims.
Conclusion and recommendations
The output concludes that tech companies should clearly define terrorism and not rely solely on list-based approaches, and it identifies the advantages of moving towards greater interoperability with respect to definitions of terrorism.
It recommends:
- The minimum standards identified should be included within the core requirements/standard features of a definition of terrorism.
- Definitions should expressly exclude protest, industrial action, advocacy, and dissent; and activities carried out during armed conflict as determined under international humanitarian law.
- Religious motives should not be included in definitions of terrorism.
- Further research is needed into the value of companies defining violent extremism. This would include transparency about the outcomes and procedures for content removed as violent extremist but not terrorist.
Katy Vaughan is a Lecturer in Law and member of the Cyber Threats Research Centre at Swansea University. On Twitter @KatyVaughan7.
Image credit: Pexels.