The rise of social media usage as an everyday activity for millions of citizens has been accompanied by a discussion about the dangers of this development. Echo chambers or “filter bubbles” are often mentioned in this regard.[1] Both concepts refer to the possibility that social media users only engage with content that confirms already-held beliefs. In the resulting virtual networks, ideological frames are amplified, made disproportionately salient, and constantly reaffirmed. Because users engage only with content confirming a specific narrative and interact only with those holding similar beliefs, social media creates the illusion of widespread agreement. This can lead individuals to become trapped in a network that exchanges extremist content, to experience more intense emotional reactions, and to become polarized. It can also contribute to radicalization processes,[2] for instance, in the case of ISIS.[3]
Nevertheless, the echo chamber is a contested concept. Some critics reject the notion that the very design of social media platforms, which supposedly connect all users, intrinsically leads to secluded online communities and the development of echo chambers or filter bubbles. Others question the assumption that the algorithms behind such platforms make it more likely that users are shown extremist content through personalized recommender systems, which suggest new content based on previous user preferences.
The evidence is mixed and largely depends on the type of platform analyzed. For instance, a recent study found that the recommender systems on neither Reddit nor Gab led to greater exposure to right-wing material after users had viewed such content.[4] On YouTube, however, recommender systems prioritized right-wing material and suggested this type of content more often after users had engaged with it initially.[5] While algorithms certainly shape the content users see, their influence on creating extremist filter bubbles depends largely on the design of the individual social media application and should not be generalized. Others have criticized the concept of echo chambers more generally. For instance, it has been argued that because we are “networked individuals” embedded in multiple networks, with at least weak ties to those outside of a potential echo chamber,[6] the supposed threat cannot be as significant as proponents of the concept argue.[7] After all, it is not a real echo chamber if there is input from the outside.
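The feedback loop described above, in which a recommender system shows more of what a user has already engaged with, can be illustrated with a toy simulation. This is a minimal sketch under simplifying assumptions; the category names, weights, and engagement rule are purely illustrative and are not drawn from any of the studies cited here.

```python
import random

def recommend(weights, rng):
    """Pick a content category with probability proportional to its weight."""
    categories = list(weights)
    return rng.choices(categories, weights=[weights[c] for c in categories], k=1)[0]

def simulate(steps=1000, boost=1.0, seed=42):
    """Toy model of a preference-reinforcing recommender.

    A hypothetical user engages only with 'partisan' content; each
    engagement increases that category's weight, so the recommender
    shows it more often -- a simple self-reinforcing feedback loop.
    """
    rng = random.Random(seed)
    weights = {"news": 1.0, "sports": 1.0, "partisan": 1.0}  # equal at the start
    shown = {c: 0 for c in weights}
    for _ in range(steps):
        item = recommend(weights, rng)
        shown[item] += 1
        if item == "partisan":      # the user engages only with this category
            weights[item] += boost  # engagement reinforces its weight
    return shown

# After 1000 steps, 'partisan' dominates the recommendations even though
# all three categories started with identical weights.
print(simulate())
```

The point of the sketch is that no ideological intent is built into the algorithm: a neutral engagement-maximizing rule is enough to narrow the content mix once the user's own choices are fed back into it.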
However, an echo chamber may also be understood as a cognitive property shared by members of a virtual “imagined community” rather than a function of social media itself.[8] Here, echo chambers are related to what psychologists term confirmation bias: the human tendency to seek out information that confirms already-held beliefs and to disregard contradicting information in order to avoid cognitive dissonance. In other words, even though today’s “networked individuals” have ties to multiple networks and therefore cannot be said to be structurally confined to an echo chamber, they may nevertheless be confined to a psychological or cognitive one, because they do not pay equal attention to the information brought forward by their ties outside of the secluded echo chamber network. An echo chamber need not be understood as a room without doors, but rather as a room in which only certain doors are opened while others are kept shut by the social media users themselves (an often subconscious process).
Echo chambers can also be understood as entities transcending individual cognition. While users may subconsciously create their own echo chambers, due to confirmation bias, by choosing to follow and consume content from sources in accordance with their worldview, they would not be able to do so without other users producing and sharing this type of content. An echo chamber may, therefore, have two levels: first, a network of individuals or organizations propagating similar frames and narratives due to a similarity in worldview; and second, as a result, a customized cognitive echo chamber unique to each actor and created by the users themselves.
No individual cognitive echo chamber will be exactly the same as another, because even users within the same network and echo chamber make slightly different choices regarding whom to follow and engage with, owing to differences in personal preferences or habitus.[9] The collective echo chamber may be characterized as both constructivist and interactionist. It is constructivist because multiple actors produce and share content, whereby the individual meaning assigned to the content is transformed into an overarching production of meaning existing outside and, through the storage function of the internet, ultimately independent of individual actors. It is interactionist because the narratives and identities derived from this collective meaning production are fed back into both the individual and the shared echo chamber, ultimately shaping and re-shaping the discourse.
Social media is not necessarily the cause of such filter bubbles, but both the users’ ability to customize the content they see and the interconnectedness associated with social media use can facilitate the emergence of echo chambers. Online, we can choose our own “tribe” and the people we want to engage with. Through customization, social media enables us to create our own reality, with selected friends, news, jokes, and even a selected political reality, without having to turn on the TV or read a newspaper that might contain information causing cognitive dissonance. In essence, the very thing designed to connect us in a globalized world enables us to eliminate the parts of this world we do not want to see.
When Bateman, admittedly in a colloquial and generalizing manner, stated,[10] “once every village had an idiot. It took the internet to bring them all together”, he was referring to the interconnectivity ingrained in social media applications but also the increasing tribalization of our otherwise individualized postmodern society. Online, users choose the network they wish to belong to and if this network moves in an extremist direction, so may its members. However, it must be emphasized that both echo chambers as individual cognitive states and echo chambers as collective meaning-production entities do not come into existence in a vacuum. Echo chamber development is a process and the individual is likely to go through various stages of “captivity” in the chamber, starting with simple interaction and weakening of outside ties and leading to complete immersion.
If we accept that echo chambers exist both in individual cognition and as collective entities of meaning production, potential countermeasures are difficult to devise. It will not be enough simply to change personal recommender systems and the corresponding algorithms in order to fight self-produced echo chambers. Moreover, customization and the ability to choose the content one consumes and “follows” is part of the appeal of social media, and diminishing this possibility would likely lessen the appeal and quality of the user experience for millions of non-extremist users. While countermeasures should be approached from all angles, it might be useful for academics, practitioners, and policy makers to look beyond echo chambers and explore other avenues for programs countering and preventing violent extremism (CVE and PVE) in the online realm.
REFERENCES
[1] Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin: London
[2] Aday, S., Freelon, D. and Lynch, M. (2016). How social media undermined Egypt’s democratic transition. Retrieved from: https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/07/how-social-media-undermined-egypts-democratic-transition/
[3] Shane, S., Apuzzo, M. and Schmitt, E. (2015). Americans Attracted to ISIS find an ‘Echo Chamber’ on Social Media. Retrieved from: https://www.nytimes.com/2015/12/09/us/americans-attracted-to-isis-find-an-echo-chamber-on-social-media.html
[4] Reed, A., Whittaker, J., Votta, F. and Looney, S. (2019). Radical Filter Bubbles: Social Media Personalization Algorithms and Extremist Content. Retrieved from: https://rusi.org/sites/default/files/20190726_grntt_paper_08_0.pdf
[5] O’Callaghan, D., Greene, D., Conway, M., Carthy, J. and Cunningham, P. (2015). Down the (White) Rabbit Hole: The Extreme Right and Online Recommender Systems. Social Science Computer Review. Vol 33 (4), pp. 459-478
[6] Granovetter, M. (1973). The Strength of Weak Ties. American Journal of Sociology. Vol. 78 (6), pp. 1360-1380
[7] O’Hara, K. and Stevens, D. (2015). Echo Chambers and Online Radicalism: Assessing the Internet’s Complicity in Violent Extremism. Policy & Internet. Vol. 7 (4), pp. 401-422
[8] Anderson, B. (1991). Imagined Communities: Reflections on the Origin and Spread of Nationalism. (2nd edition). Verso: London
[9] Bourdieu, P. (1994). Structures, Habitus, Practices. Retrieved from: http://isites.harvard.edu/fs/docs/icb.topic1458086.files/Bourdieu_structure%20habitus.pdf
[10] Bateman in Singer, P. and Brooking, E. (2018). LikeWar: The Weaponization of Social Media. Houghton Mifflin Harcourt Publishing Company: New York, p. 126
Linda Schlegel holds an MA in Terrorism, Security and Society from King’s College London and a BA in Liberal Arts from the University College Maastricht. You can follow her on Twitter: @LiSchlegel
This article was originally posted on European Eye on Radicalization website. Republished here with permission from the author.