By Matteo Vergani, Alfonso Arranz, Ryan Scrivens, and Liliana Orellana
Prior research has found that the COVID-19 pandemic triggered a wave of online hate speech against targets such as Asian and Jewish minorities, as well as an increase in conspiracy theories circulating online. We contribute to this growing body of research with new evidence from a mixed-methods study of one of the largest and most active Italian conspiracy Telegram channels, La Cruna dell’Ago. We combined digital ethnography with automated language analysis methods to study all messages written in the La Cruna dell’Ago Telegram channel from its inception on February 7, 2020, until December 31, 2020. We assessed whether the language changed across periods delimited by key dates in Italy’s response to the virus: (1) March 9, 2020 (the beginning of the first lockdown); (2) May 19, 2020 (the reopening for the summer months); and (3) November 6, 2020 (the beginning of the second lockdown). The study period covers the first year of the pandemic in Italy, the first Western country to experience a significant number of infections in early 2020, immediately after the virus was detected in Wuhan.
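The three cut dates above divide the study window into four periods. As a minimal illustration (function and label names are ours, not the authors'), assigning a message to a period by its timestamp can be sketched as:

```python
from datetime import date

# Cut dates from the study design (Italy's 2020 pandemic response)
CUTS = [
    (date(2020, 3, 9), "pre-lockdown"),       # before the first lockdown
    (date(2020, 5, 19), "first lockdown"),    # first lockdown to summer reopening
    (date(2020, 11, 6), "summer reopening"),  # reopening to second lockdown
]

def period_of(msg_date: date) -> str:
    """Assign a message date to one of the four study periods."""
    for cut, label in CUTS:
        if msg_date < cut:
            return label
    return "second lockdown"

# e.g. period_of(date(2020, 2, 10)) -> "pre-lockdown"
```

Bucketing messages this way is what allows topic prevalence and hate speech levels to be compared before and after each policy change.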
Firstly, we detected significant changes in the topics discussed during the first year of the pandemic. Figure 1 shows the main results of our topic modelling analysis, which we conducted using top2vec. Two findings are noteworthy: 1) discussions about ‘illnesses’, which revolved around the health issues associated with COVID-19, clearly peaked just before the beginning of the first lockdown; and 2) discussions about ‘US politics’ peaked just after the beginning of the second lockdown. Our analysis identified six further key topics (among them ‘banks and finance’, ‘religion’, ‘social control’, and ‘elites and conspiracies’), which together encompassed about 88% of the discussion. The rising volume of discussion about the US election suggests that it was a significant issue on the agenda of channel users. Future research should investigate potential ‘contagion effects’ from US-based to foreign conspiracy communities, as well as the impact of US political themes on the political attitudes of foreign communities (for example, whether Trump’s ‘stolen election’ campaign contributed to delegitimizing democratic processes among Italian conspiracy theorists).
Secondly, our study detected a general increase in hate speech in the Italian-themed Telegram channel over the course of 2020. Our model classified users’ messages into four classes: acceptable, inappropriate (containing swearing that is not abusive in nature), offensive (containing degrading and insulting language targeting a group), and violent (containing references to violence against a group). Figure 2 shows that the levels of inappropriate and offensive speech increased over the course of the year. This finding is consistent with previous research, which has shown that the COVID-19 pandemic triggered a general increase in online hate speech.
Thirdly, we found that the users attacked different hate targets over the course of 2020. Figure 3 shows a striking increase in the frequency of discussions about domestic hate targets such as nurses, doctors, and journalists just before the second lockdown. Importantly, discussions about foreign-focused adversaries such as China and Wuhan were most frequent in the early stages of the pandemic, specifically in February 2020. For comparison, we also measured the frequency of discussions about hate targets associated with other conspiracy theories popular in the channel, including QAnon (e.g., paedophiles, Obama, Clinton) and the New World Order (NWO) (e.g., Jews, banks, finance). They are reported in the figure, and their relevance remained broadly stable across the study period.
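Tracking how often each category of hate target appears over time can be illustrated with a deliberately simplified keyword-count sketch. The target lists and the naive substring-matching rule below are ours, chosen only to mirror the targets named above; they are far cruder than the study's actual measurement pipeline:

```python
from collections import Counter

# Hypothetical target keyword lists, loosely based on those named in the text
TARGETS = {
    "domestic": ["nurses", "doctors", "journalists"],
    "foreign": ["china", "wuhan"],
}

def target_counts(messages):
    """Count messages mentioning each target category (naive substring match)."""
    counts = Counter()
    for text in messages:
        low = text.lower()
        for category, words in TARGETS.items():
            if any(w in low for w in words):
                counts[category] += 1
    return counts
```

Applying such counts within each study period is what produces frequency curves like those in Figure 3, where the domestic and foreign series peak at different points in the year.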
While previous research has mainly focused on hate speech directed towards a particular group, such as Asian or Jewish minorities, our study found that hate speech targeted various groups with different intensity at different points in the first year of the COVID-19 pandemic. Based on our digital ethnography, we propose an original interpretation of the data: when Telegram channel members perceived COVID-19 as a significant threat, hate speech was primarily directed at China, reflecting a conspiratorial narrative that the virus was intentionally created by the Chinese government to attack the West. As the channel’s discussions became less focused on health concerns, hate speech shifted towards journalists and health workers. This reflected a conspiratorial narrative that the pandemic was a hoax perpetrated by domestic elites to justify draconian measures of social control. Within this narrative, journalists, doctors, and nurses were accused of being complicit by spreading allegedly false information about the risks of COVID-19 and the collapse of the healthcare system. Taken together, our findings show that hate speech is not only a product of individual prejudice but also reflects broader political and social dynamics.
By uncovering that discussions in the Telegram channel shifted their main focus from foreign adversaries (i.e., China) to domestic adversaries (i.e., journalists and healthcare workers) between March and November 2020, our findings suggest that conspiracy theories circulating in online communities can adapt to the social and political context over a relatively short period of time (i.e., months). Additionally, by showing that elements of multiple conspiracy theories (foreign-focused, domestic-focused, QAnon, and NWO) coexisted in the channel with a relative importance that varied significantly over the course of 2020, we showed that the culture of online communities of conspiracy theorists is a dynamic set of ideas made of moving parts that gain (or lose) dominance over time.
Online communities, such as the one we studied, are known for harbouring polarising opinions. Members who subscribe to conspiracy theories tend to adopt a siege mentality. In our study, we observed that the Telegram channel’s discussions centred on the theme of ‘social control’, with users passionately debating issues related to censorship and perceived control by opaque conspiratorial forces. This siege mentality, combined with a sense of isolation from mainstream society, creates an environment that is conducive to the sharing of false and implausible content, which is then spread uncritically. Whether this process of polarisation and conspiracy theorising could lead to real-world harm against the hate targets (e.g., healthcare workers and journalists) remains an open question.
For more on these findings and the nature of the study in general, we encourage you to read the full manuscript which was recently published in Social Media + Society.
Matteo Vergani is a Senior Lecturer in Sociology at Deakin University, and Senior Research Fellow at the Alfred Deakin Institute for Citizenship and Globalisation.
Alfonso Martinez Arranz is a researcher at the Faculty of Engineering and Information Technology of the University of Melbourne.
Ryan Scrivens is an Assistant Professor in the School of Criminal Justice at Michigan State University. He is also an Associate Director at the International CyberCrime Research Centre and a Research Fellow at VOX-Pol.
Liliana Orellana is Professor in Biostatistics in the Faculty of Health at Deakin University.
Image Credit: Freepik