Feature

How journalism tiptoes around algospeak’s grocery store  

From “seggs” to “unalive,” online language is evolving to bypass automated moderation. What started as a workaround for content creators is now changing journalism itself. This is the story of algospeak: what it is, who uses it, why it’s needed, and what it hides. We spoke to audience development expert Erika Marzano and media sociologist Dr. Daniel Klug to find out.

Illustration: Evgenios Kalofolias


1 Steen et al. (2023), You Can (Not) Say What You Want: Using Algospeak to Contest and Evade Algorithmic Content Moderation on TikTok, Social Media + Society, July–September, openly accessible here.

Have you ever scrolled through TikTok and stumbled upon a rather odd misspelling of “sex”? What is “seggs”, you might ask. Well, what you saw is likely an example of algospeak. When interviewed by iMEdD, Dr. Daniel Klug, Senior Multimedia Production lecturer at the University of Applied Sciences of the Grisons, defined algospeak as “the intentional misspelling, shortening, changing, and adjusting of words in text or video captions on social media posts”.  

According to Klug’s research [Steen et al. (2023)]1, algospeak belongs to a wider umbrella of internet-based communication known as “netspeak”. Famous subcategories include “leetspeak”, which replaces letters with numbers and has been used since the 1980s to create exclusivity for online gamer communities. In fact, the word “leet” is itself a misspelling of “elite”.  

However, unlike other online languages, algospeak is clearly “used to evade content moderation, because the algorithm seems to suppress subjects that are associated with certain keywords”, stressed Klug.   

Gone are the days when user-generated content was reviewed by humans before being published on online platforms and websites. Today, content moderation, colloquially referred to as “the algorithm”, is a series of mechanisms that screen online content and remove it if it violates a platform’s community guidelines. Often informed by a country’s legislation, community guidelines typically prohibit sexually suggestive and graphic content, abuse, vulgar language, and more, even when the relevant terminology is used for educational or scientific purposes.  

In any case, “the algorithm” seems to be a black box. No one knows exactly how it works; probably not even the platforms themselves. So, from the shortening of words to the creation of phonetic similarities to the use of visual symbols and emojis, algospeak serves as a constantly evolving, masterful way of “tricking” the algorithm into not suppressing certain content. 

Algospeak serves as a constantly evolving, masterful way of “tricking” the algorithm into not suppressing certain content. 
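Though no one outside the platforms knows how these systems really work, the cat-and-mouse dynamic can be illustrated with a toy example. The Python sketch below assumes a purely hypothetical keyword blocklist and a handful of substitutions mentioned in this article; real moderation pipelines are vastly more complex and not publicly documented.

```python
# Purely hypothetical keyword-based filter; real platform moderation
# is far more sophisticated and not publicly documented.
BLOCKLIST = {"sex", "dead", "kill"}

# Illustrative algospeak substitutions drawn from the article.
ALGOSPEAK = {"sex": "seggs", "dead": "unalive", "kill": "unalive"}

def is_flagged(caption: str) -> bool:
    """Flag a caption if any token matches a blocked keyword."""
    tokens = caption.lower().split()
    return any(token.strip(".,!?") in BLOCKLIST for token in tokens)

def rewrite(caption: str) -> str:
    """Swap blocked words for their algospeak substitutes."""
    return " ".join(ALGOSPEAK.get(word.lower(), word) for word in caption.split())

caption = "Let's talk about sex education"
print(is_flagged(caption))           # the original caption is flagged
print(is_flagged(rewrite(caption)))  # "seggs" slips past the blocklist
```

The educational caption is suppressed purely because of a keyword match, while the misspelled version passes, which is exactly the trial-and-error dynamic users exploit.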

Democratizing and dictating free speech, two sides of the same coin 

On the one hand, “algospeak is democratizing the act of making up certain terms”, said Erika Marzano, Deutsche Welle’s audience development manager, in another interview with iMEdD. On the other hand, algospeak is a direct manifestation of repression by the platforms, something that even media outlets cannot escape.  

One reason could be the sheer volume of content these platforms handle: billions of posts every day. Their moderation systems simply can’t keep up. As a result, platforms often choose to restrict content that includes a sensitive keyword, preferring to deal with appeals later rather than risk allowing something they might have to remove afterwards. Such practices may be expedient, but as to whether they are ethical, Marzano commented: “we need to understand that social media companies are still commercial companies. They need to make money”, sometimes at the expense of free speech.   

We need to understand that social media companies are still commercial companies. They need to make money.

Erika Marzano, audience development manager at Deutsche Welle

Nevertheless, the suppression of free speech is not homogeneous. In his research, Klug identified some categories that tend to be suppressed more often, including sex education, sexuality, mental health, and even issues around race.  

“Arbitrariness is something that the users experienced”, said Klug. While some videos were banned, other similar or even identical content was not. With this in mind, “users use algospeak as a trial-and-error technique. Is it the textual caption, the audio language or is it the visual aspect that triggers the algorithm?”, he added.  

Dr. Daniel Klug, Senior Multimedia Production lecturer at the University of Applied Sciences of the Grisons. Photo: Petros Toufexis/iMEdD

If you don’t see LGBTQ+ content, you might see anti-LGBTQ+ content, which might come with the same hashtags. It’s just that the messaging is different

Dr. Daniel Klug

Marzano attributed the prevalent randomness in content restriction to factors such as audience age, cultural taboos, and national legislation. “Certain governments request the takedown of videos belonging to certain categories; for example, LGBTQ+. In other countries, platforms are completely free, and they themselves even have Pride initiatives,” she stated. 

But these inconsistencies have real consequences for users. They shape what people see online and, just as importantly, what they don’t. This gives rise to issues of visibility and representation among certain communities. “If you don’t see LGBTQ+ content, you might see anti-LGBTQ+ content, which might come with the same hashtags. It’s just that the messaging is different”, explained Klug.  

In this way, algospeak can be used to create visibility within a community, but only as long as users know how to use it, how to find it, and how to get around all the other content that creates contrasting visibility or conceals hate speech. The algorithm doesn’t know if you are a homophobic bigot or just want to spread awareness about LGBTQ+ exclusion; “it is technically ‘unbiased’, it needs context”, concluded Klug. 

The fate of the journalistic message lies in the eye of the “algospeaker” 

“Algospeak has completely changed the way we do journalism on social media”, said Marzano. Although media outlets still have their own websites and newsletters, where they can speak without restrictions, social media platforms are shaping the journalistic message.  

As an audience development manager at DW, Marzano described how algospeak has reshaped her workflow. “I had to train the journalists to use algospeak in how they speak, how they write their scripts, but also the way they search. News is getting more and more sourced online, and if you don’t know how to search certain topics, you’re never going to find them.” 

That challenge doesn’t stop at content creation; it extends to moderation as well. Comment sections are heavily affected by algospeak, with filters often falling short. A comment containing the word “rape,” for instance, could be inciting violence, or it could simply be a survivor sharing their story. When users instead write “grape,” community managers must be able to recognize the coded term and its context. The rise of algospeak, Marzano underlined, “goes from training journalists, to the role of the community manager, to affecting the full journalistic spectrum.” 
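The community manager’s problem can be sketched in code. The snippet below, a hypothetical Python illustration (the coded-term mapping is drawn from examples in this article, not from any real moderation lexicon), surfaces comments containing coded terms for human review instead of auto-removing them, because only a human can judge the context.

```python
# Hypothetical mapping of coded terms to their likely meanings;
# an illustrative sketch, not a real moderation lexicon.
CODED_TERMS = {"grape": "rape", "unalive": "kill / suicide", "seggs": "sex"}

def review_queue(comments: list[str]) -> list[tuple[str, str]]:
    """Surface comments containing coded terms for a human community
    manager, who can judge the context the filter cannot."""
    flagged = []
    for comment in comments:
        for coded, meaning in CODED_TERMS.items():
            if coded in comment.lower():
                flagged.append((comment, f"'{coded}' may mean '{meaning}'"))
    return flagged

comments = [
    "I was graped when I was a teenager",  # possibly survivor testimony
    "Grape juice is underrated",           # literally about fruit
]
for comment, note in review_queue(comments):
    print(note, "->", comment)
```

Both comments land in the queue, since string matching alone cannot tell survivor testimony from a remark about fruit; that judgment call is precisely the community manager’s job.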

Erika Marzano, audience development manager at Deutsche Welle. Photo: Petros Toufexis/iMEdD

One might question the effect of algospeak on the content of journalism itself. When it comes to reporting on a sexual crime or a murder and the terms “grape” and “unalive” pop up, is this not downplaying the seriousness of a story? Marzano stressed the importance of transparency. “We are totally open to answering why we use this escamotage to get our videos through”, especially to people who might not be “chronically online” or fluent in algospeak.  

In any case, media outlets cannot afford not to be on social media. According to the Reuters Institute for the Study of Journalism’s 2025 Digital News Report, 44% of 18–24-year-olds and 38% of 25–34-year-olds depend on social media and video networks for their news.  

“If we are not on social media, we’re going to lose a big chunk of the audience that doesn’t check traditional channels. At the same time, we know that we have to play the game of the platforms, otherwise our content is just not shown. It’s about learning how algorithms work and trying to adapt without compromising our values”, concluded Marzano. 

I had to train the journalists to use algospeak in how they speak, how they write their scripts, but also the way they search. News is getting more and more sourced online, and if you don’t know how to search certain topics, you’re never going to find them.

Erika Marzano, audience development manager at Deutsche Welle.

(Dis)connecting communities online 

Klug explained that algorithms can “learn” and that users have found ways to “hack” them as they go. Text filters, for example, are easy to outsmart: you can swap a “banned” word for an emoji. When audio is scanned, saying “eggplant” might get the message across, and the people who are meant to see it will know it’s not about groceries. Some even go further, visually recreating the eggplant symbol. “It’s a constant re-adapting and re-negotiation of that practice,” Klug said. 

Still, Klug doesn’t believe in eliminating content moderation altogether, even though platforms like X often falsely sell its absence as free speech. “It would be nice if there were more contextual content moderation. But again, context is not a manifest thing. It’s a social construct. I would say educating the audience is the most important thing,” he noted. 

That lack of contextual understanding extends beyond moderation itself, as it also shapes how people interact with language online. When only a few users understand the coded terms on their feeds, conversations risk becoming insular. According to a recent study, algospeak can create “echo chambers”, rendering social media paradoxically “exclusive” and divisive.  

This concern has been echoed by other creators, including Adam Aleksic, author of “Algospeak: How Social Media Is Transforming the Future of Language”, who noted that “[TikTok] is driving the mass production of identity-building labels to profit off all of us… It [creates] an echo chamber that affirms your personality”.  

It would be nice if there were more contextual content moderation. But again, context is not a manifest thing. It’s a social construct.

Dr. Daniel Klug, Senior Multimedia Production lecturer at the University of Applied Sciences of the Grisons

For Marzano, however, audience literacy is only effective if journalists themselves are literate and willing to report on the issue. She recalled a 2019 case in which German outlets discovered that LGBTQ+ hashtags and keywords were being actively restricted on TikTok. Although the platform blamed the issue on an error, it ultimately acknowledged the problem and revised its guidelines. 

There’s no definitive answer as to what comes next for algospeak. What’s clear is that we need to remain aware of its uses and benefits, but also of how it is reshaping the way we communicate and deliver journalism on social media. Algospeak can connect communities, but it can also isolate, polarize, and even make them invisible.  

Erika Marzano and Daniel Klug participated in the panel “Unaliving” language online: How modern journalism decodes “algospeak”, at iMEdD’s 2025 International Journalism Forum, along with Iris Pase, Founding Editor of Pillow Talk Scotland and David Maas, Senior Editorial Director of the International Journalists’ Network.  

Watch the full panel discussion here.