By Tom Ascott
Everyone has seen a meme, whether they know it or not. They’re everywhere on Facebook, Twitter and Instagram. The most popular ones make it off the Internet and show up in newspapers, television shows or films. You’ve almost certainly seen Pepe the Frog, and if you haven’t seen the classic ‘Woman yelling at cat’ (2019), sooner or later you will. Despite seeming like nothing more than silly phrases written on cartoons, memes are becoming a cultural touchstone.
Memes might appear as if they’re just little pranks kids play online, yet they’re anything but. In the 2016 US presidential election, many memes were made by a Russian troll farm to influence the outcome. It wasn’t an isolated incident, either—now troll farms are popping up more frequently. In November last year, an undercover reporter revealed a new one in Poland. The memes it’s producing focus on ‘the aviation and defence industries, and target key decision-makers involved in the awarding of major government defence contracts’. The memeing, it has transpired, was political.
Those memes were a form of information warfare, or what would have been thought of in the past as a psychological operation. Information warfare is often about waging an influence campaign that can change behaviour through ‘unconventional means’, such as social media. Memes play a part in a specific type of information warfare dubbed ‘memetic warfare’.
Early memetic warfare drew on a more ‘Dawkinsian’ concept of memes. In his 1976 book The Selfish Gene, Richard Dawkins coined the term ‘meme’ for a cultural product that can replicate and spread. The concept’s implications for information warfare campaigns were immediate and clear.
Meme warfare now more often refers to using memes as individual weapons of information warfare. It’s a form of disinformation that can be used to secure strategic goals. Disinformation campaigns go back to at least 1923, when the Soviet Union had an office for ‘dezinformatsiya’ campaigns, a term coined by Stalin to describe ‘false information carefully constructed with the intention to deceive’. The Internet has ushered in an age in which deception can be perpetrated on a mass scale, with the click of a mouse.
The West is desperately lagging in its memetic capability. US Marine Corps Major Michael B. Prosser proposed that NATO open a meme warfare centre. In his 2006 thesis, he looked to Dawkins’s ideas of memes as units of cultural transmission that held the potential to ‘be used like medicine to inoculate the enemy and generate popular support’. He noted that information operations, psychological operations and strategic communications weren’t using memes effectively. In the following decade, NATO never did open a meme warfare centre, but the idea didn’t go away and is now starting to gain traction again.
At the same time, across the pond, DARPA funded Dr Robert Finkelstein to investigate how it might weaponise memes. In his presentation to the Social Media for Defense Summit, Finkelstein recommended memetic warfare for all public affairs units, the Central Intelligence Agency and the US Army Civil Affairs and Psychological Operations Command.
It wasn’t until 2017 that the EU and NATO established the European Centre of Excellence for Countering Hybrid Threats. While understanding memes as information warfare falls within its remit, its purpose is more analytical than proactive. Currently, it seems, the best form of defence is awareness.
Memes resemble traditional propaganda in a few ways: a hostile government can use them to spread malicious information to its own advantage. But there are key differences, too. Because memes are a common way for people to express themselves online, it’s easy to create them without arousing suspicion that they’re instruments of information warfare. And they can be much more targeted. Traditional propaganda is aimed at large groups of people who share some spatial or political link. The way the Internet ‘fingerprints’ users allows hostile actors to draw up their own target lists, finding links that even the users themselves don’t know they share.
Authoritarian regimes may be less susceptible to memetic warfare because they have more control over their social media systems. China employs ‘20,000–50,000 Internet police and an additional quarter-million “trolls” who spread pro-Beijing material domestically and abroad’. Memetic warfare isn’t just military, but civil as well. Some 1,200 TikTok channels are run by various Chinese civil authorities such as ‘police stations, SWAT teams, traffic police, and prisons’. They’ve produced more than 13,000 videos and amassed a combined 4.8 billion views. It’s a domestic attempt by the security services to control the online narrative. A meme suggesting, say, that Chinese police assassinate high-profile individuals is unlikely to spread there, whether it’s produced internally or by the West.
Memes can also have dual functions. As well as acting as a form of information warfare, they can help normalise extreme behaviours through humour. For example, memes can serve as recruitment tools for white nationalist groups. The Christchurch mosque terrorist, Brenton Tarrant, frequently used 8chan. All chan sites require users to submit images with text, and as a result these right-wing sites are a fertile breeding ground for memes that normalise extreme behaviour and reinforce one another. Slowly, the memes seep out into more mainstream areas of the Internet, such as Twitter. That everyone on 4chan should be considered a terrorist is itself already a meme on 4chan.
One way to fight information warfare is simply to raise awareness of the problem. Ironically, it is not the young who are most radicalised by social media, so a concerted effort is needed to make everyone more aware that even memes can be harmful. Awareness that a stranger on Twitter may be a bot, a paid troll or a shill hasn’t stopped hostile actors from achieving their goals. They’re not trying to win debates online by making convincing arguments but to ‘amplify the loudest voices in that fight, deepening the divisions between us’.
Tom Ascott is the digital communications manager at the Royal United Services Institute. You can follow him on Twitter: @Tom_Ascott
This article was originally published on the Australian Strategic Policy Institute (ASPI) website. Republished here with permission from the publishers.