By Amy-Louise Watkin
This article summarises a recent paper published in Studies in Conflict & Terrorism that forms part of a special issue on the practicalities and complexities of (regulating) online terrorist content moderation. The special issue contains papers that were presented at Swansea University’s Terrorism and Social Media Conference 2022.
Tech platforms have already been through several regulatory phases concerning the countering of terrorist and extremist content on their services: from a complete lack of regulation, to self-regulation, to the present day, in which regulation has been implemented by governments around the world in an increasingly complicated and fragmented manner. While much of this existing regulation has been widely criticised (see the full article for more on these criticisms), it still seems to be inspiring similar regulation in other countries. One of the main criticisms of regulation to date is its effect on smaller platforms, with concerns that it creates unfair burdens on these companies.
Defining Smaller Platforms
Before delving into a summary of the article, an important question is whether all the relevant parties involved in the regulation of tech platforms are defining “small platforms” in a way that is both functional and consistent. Germany’s NetzDG law was one of the first acts in this field to differentiate compliance based on the size of the platform: it applies only to platforms with more than 2 million registered users in the Federal Republic of Germany. The UK’s Online Safety Bill is a more recent example. It splits platforms into Category 1 and Category 2 services and indicates that platform size is a determining factor in making this decision, stating that “this tiered approach will protect freedom of expression and mitigate the risk of disproportionate burdens on small businesses.” However, how platform size is determined within the bill is not clear.
Therefore, within this regulatory sphere, there does not appear to be a universally agreed definition of what counts as a “small platform.” Despite this, regulatory demands continue to differentiate what is required from the industry based on “platform size.” Where “size of userbase” is given as the criterion for what makes a “small platform,” the reason for the particular user threshold chosen is rarely clear and often appears arbitrary. This adds further complexity and confusion to the regulatory landscape.
This article is inspired by the work in my PhD thesis, during the writing of which two questions played on my mind. The first was why the same critiqued regulatory approaches are being implemented again and again by different governments. The second was what regulatory approach would best minimise potential unfair burdens on smaller platforms. I researched two regulatory approaches that I had not yet seen discussed in the context of countering online terrorist and extremist content: social regulation and responsive regulation. This article discusses the findings and recommendations of my research into the latter, which proposes an approach that moves away from categorising platforms based on the size of their userbase and focuses instead on categorising platforms based on the compliance issues faced in the industry.
Responsive Regulation
Ayres and Braithwaite have written thoroughly about responsive regulation and argue that it should be responsive to the specific structure of an industry, because different structures are conducive to different degrees and forms of regulation. They have also written about the importance of being attuned to regulated actors’ differing motivations to comply, because the most successful regulation speaks to the diverse objectives of the regulated firms. Finally, responsive regulation should be neither solely deterrent nor solely cooperative.
The implementation of a responsive regulatory approach in other industries led many academics to categorise the different types of companies within the industry of their focus in relation to compliance issues. For example, Kagan and Scholz created several categories of companies to explain non-compliance in their field. These included “amoral calculators”, companies that use a risk-benefit analysis to make compliance decisions; “political citizens”, companies that decide not to comply because they do not agree with the rules; and the “organisationally incompetent”, companies that fail to comply because they lack sufficient management and systems. An advantage of categorising companies is that it can be used to improve and tailor regulation to the needs of the companies in the different categories, thereby increasing overall compliance across the industry.
Awareness, Capability and Willingness
This research created a number of categories of tech platforms based on the issues they face in trying to comply with regulation that seeks to counter terrorist and extremist content on their services. It must be acknowledged that this is not an exhaustive list of potential categories; rather, it makes a start on thinking about the compliance issues faced by tech platforms in this field from a responsive regulatory perspective.
The first category comprises tech platforms that struggle with regulatory compliance because they lack the necessary awareness and expertise. This could relate to a number of different things: for example, awareness of which regulation they are meant to comply with, awareness of how terrorists exploit platforms like theirs, or the expertise required to undertake the actions the regulation demands of them.
It must be noted that neither this category nor any of the other proposed categories is static. Terrorist and extremist groups are highly adaptable, and a platform may therefore find that where it once held the necessary expertise to fulfil compliance, it suddenly requires assistance to keep up with terrorist groups’ ever-evolving strategies.
The second category comprises platforms that lack the capacity and resources needed to comply. The resources required, which include but are not limited to financial, technical, and human resources, may differ between platforms depending on a number of factors, including the number of users a platform has, the volume of content it hosts, and the extent to which and in what ways the platform is exploited by terrorist groups.
The third category comprises platforms that lack the willingness to engage with regulation and will therefore not demonstrate reasonable efforts towards compliance. This may be because the platform has decided that the regulatory demands are at odds with its mission and values, and that complying could result in losing its brand identity and userbase. This compliance issue differs from the previous two: for those, platforms and regulators could work together to overcome the issue, whereas with this category it is unlikely that the platform will voluntarily agree to work with a regulator to resolve it.
There is a fourth and final category: platforms that do not face any of these compliance issues. These platforms have the awareness and expertise, capacity and resources, and willingness to comply with regulation to counter terrorist and extremist content on their services.
Educative Approach
Within the responsive regulatory literature, there is discussion of the need to move away from a solely punitive regulatory approach. In situations where the regulated actor is willing to comply but faces other compliance issues (like those in the awareness and expertise, and capacity and resources, categories), responsive regulation argues that an educative approach is better suited. This approach is based on advice-giving and training, with the enforcer playing the role of consultant, seeking to educate and provide assistance. Responsive regulation recommends exhausting these educative options before resorting to punitive enforcement action. This approach is thought to help companies make sense of what is required of them to comply, and to avoid disadvantaging smaller platforms.
Four Regulatory Tracks
The full article proposes four regulatory tracks. The first three are each tailored to overcoming one of the identified compliance issues: lacking awareness and expertise, lacking capacity and resources, or lacking willingness. The fourth track proposes a regulatory approach for those companies that do not face any of the three compliance issues. The article also proposes the use of an enforcement pyramid to enforce this mix of educative and punitive approaches: educative for those platforms that are willing to engage and receive help in overcoming their compliance issues, and punitive for those platforms that fail to display such willingness.
Dr Amy-Louise Watkin is a lecturer in Criminal Justice at the University of the West of Scotland. Amy-Louise’s research interests are online terrorist and extremist propaganda and the regulation of online spaces.
Image Credit: Freepik