By Jonathan Pieslak
On 14 June 2024, the Anti-Defamation League (ADL), a US-based “anti-hate” organization focusing primarily on antisemitism, posted a provocatively titled article, “GAI Music Creation Tool Suno Has Been Weaponized to Promote Hate”. The report highlights the ways in which Suno, a generative artificial intelligence (GAI) music-creation platform, can be exploited to create songs embedded with coded lyrics and “dog whistles” promoting hateful attitudes. While it raises legitimate concerns about how GAI might be misused to create offensive content, the article’s scope and aims are rather introductory. The following post explores more deeply how GAI music operates in the domain of online extremism.
An Overview of GAI Music and the Music Industry
Assessing the impact of GAI music begins with understanding the context of the technology. Suno and its primary competitor, Udio, transformed the music landscape in 2024, releasing music-modeling tools with which users generate songs through simple text prompts. While AI is often accompanied by sensationalized claims about its impact and potential, the technology is indeed impressive.
Like Napster in 1999 and the digital-music revolution thereafter, GAI music appears to be fundamentally reshaping the industry, sending “rattles” throughout the music ecosphere as artists and record companies scramble to react. In fact, only a week after the publication of the ADL report, three giants, Universal Music Group, Sony Music Group, and Warner Music, filed a collective lawsuit against Suno and Udio, alleging “massive infringement of copyrighted sound recordings” on which their GAI models were trained. These three companies, representing over 70% of the global recorded music market, clearly perceive a present and future danger.
The major record labels, though, are far from being motivated solely by the noble intent to protect their artists. Suno and Udio developed the technology faster, and in ways that threaten and compete with the music industry’s own plans to monetize its artists through AI. Previous computer-automated music-generation tools never produced anything that could remotely compete with actual recordings, but now that they can, the industry is attempting to smother the competition.
The dust from this lawsuit will likely take years to settle, and in the meantime the technology will only improve and become more widely available. Technology has always outpaced regulation and legal action, and one might (or should) anticipate that the aspirations of tech will again prevail over music artists and the music industry. It is hard to put genies back in bottles.
GAI Music and Extremism
The concern expressed in the ADL report centers upon how Suno can be manipulated into producing music with offensive, especially antisemitic, song lyrics. Through misspellings and obviously coded words and phrases, users have created songs like “I cant wait to dye fer israel”, “Squatting for Hitler” (about exercising for racial salvation), and “My Little Chamber” (about Nazi gas chambers). There is always room for improvement in content moderation, and Suno is well-advised to continue refining its ability to block attempts to generate offensive song lyrics. That said, the platform is not an “open” forum and remains fairly adept at regulating violent and hateful song lyrics.
A closer look at the context of this music is revealing. To start, the use of GAI music is almost exclusively limited to the world of non-organizational, online right-wing and white-nationalist extremism, circulating among a small collection of websites: 4chan and Telegram, and to a lesser degree Stormfront and Kiwi Farms.
The interest in these songs is also quite limited, with only a small fraction of users creating or listening to them. Stormfront, for example, has two distinct threads on Suno and AI-generated music. The first started the day the lawsuit was filed and was active for a little over two months; it amounted to a back-and-forth between two users. The second, “Your AI Generated Songs”, began in mid-September and was active for less than a month; it involved only three users (one of whom was from the first thread). A search within Suno’s catalog likewise demonstrates limited interest in this kind of music: the Suno-created song “maybe hitler was right” has, to date, only 158 plays. The technology may have been “weaponized”, but the breadth of its reach and influence appears modest at best.
Will GAI music, given time, come to represent something more in the context of extremism than its current form of offensive parody and racist mockery expressed through coded language? Perhaps. It is doubtful, though, that organizational extremism will ever come to accept GAI music as truly emblematic of its causes.
The legitimacy and authenticity of the organization is inherently challenged by the artifice presently associated with GAI music. The music does not (yet) possess potency or currency among a broad cross-section of online extremism participants, or general music audiences for that matter. This is not to dismiss the genuine hate spewing from much of this music, but “Squatting for Hitler” will never carry the same meaning as the music of Skrewdriver, for no other reason than that it was created by a machine. Such realities reinforce the larger point often made in extremism research: it is the social connection of human-to-human interaction that defines and motivates the advancement of hateful ideology.
Turning to the jihadi world, there does not appear to have been any music or recitation produced using GAI. Using GAI music to produce jihadi propaganda would carry a tremendous risk: if uncovered, it would almost certainly stigmatize a group as illegitimate. Given that most jihadi groups have strong political and state-building ambitions (or so they claim), circulating official propaganda that was AI-generated would raise questions about resources and authenticity, especially among traditionalist and ultra-conservative jihadis. GAI music in the jihadi world will likely surface (if at all) in online chatrooms among “ji-hobbyists” and armchair individuals not affiliated with jihadi groups, much as it does presently in the far-right and white-nationalist sphere.
Even so, we should not diminish or ignore its potential. Clearly, the music industry is trying to stifle GAI music (until it can replace it with its own). One popular YouTuber’s reaction to an AI Drake song, ‘Winter’s Cold’, was to call him out: ‘Drake, this might be the best song you ever made…that you didn’t make.’
And from the world of extremism, a Stormfront user remarked, ‘I’ll imagine a song I wrote, sung a certain way, with a certain kind of music, and think “if delivered just like that, people might like it.” But the AI takes it and makes it sound a million times better than I ever imagined it could sound…Artists, musical and otherwise, express the soul of humanity. Now we’re being outdone by an idiot machine. In an instant, at the push of a button. Isn’t that terrifying?’ Maybe, but until GAI cultural products (music, literature, etc.) are warmly embraced within global culture as true art, creatively indistinguishable from human-produced art, most organizational forms of extremism are likely to refrain from using it.
Jonathan Pieslak is Professor at The City College of New York and Graduate Center, CUNY. He specializes in the cultural dimensions of extremism. jonathanpieslak.com