======================================================================
=                             Algospeak                              =
======================================================================

                            Introduction
======================================================================
In social media, algospeak is a self-censorship phenomenon in which
users adopt coded expressions to evade automated content moderation.
It allows users to discuss topics deemed sensitive to moderation
algorithms while avoiding penalties such as shadow banning,
downranking, or de-monetization of content. A type of netspeak,
algospeak primarily serves to bypass censorship, though it can also
reinforce group belonging, especially in marginalized communities.
Algospeak has been identified as one source of linguistic change in
the modern era, with some terms spreading into everyday offline speech
and writing.


                              History
======================================================================
The term 'algospeak', a blend of 'algorithm' and '-speak', appears to
date back to 2021, though related ideas have existed for much longer;
for example, 'Voldemorting', referencing the fictional character also
known as "You-Know-Who" or "He-Who-Must-Not-Be-Named", refers to the
use of coded expressions to avoid giving attention to objectionable
figures or attracting algorithmic attention from unwanted audiences.

The term 'algospeak' gained wider recognition in 2022 after Taylor
Lorenz featured it in an article for 'The Washington Post'.

In 2025, Adam Aleksic published 'Algospeak', the first monograph
dedicated to the phenomenon. It proposes an expanded definition which
encompasses any language change that is primarily driven by the
constraints of digital platforms.


                       Causes and motivations
======================================================================
Many social media platforms rely on automated content moderation
systems to enforce their guidelines; users have no control over these
systems, and the rules may change at any time. TikTok in particular uses
artificial intelligence to proactively moderate content, in addition
to responding to user reports and using human moderators. In
colloquial usage, such systems are called "algorithms" or "bots".
TikTok has faced criticism for enforcing its rules unevenly on topics
such as LGBTQ identity and obesity, leading to a perception that
social media moderation is contradictory and inconsistent.

Between July and September 2024, TikTok reported removing 150 million
videos, 120 million of which were flagged by automated systems.
Automated moderation may miss important context; for example, benign
communities who aid people who struggle with self-harm, suicidal
thoughts, or past sexual violence may inadvertently receive
unwarranted penalties. TikTok users have used algospeak to discuss and
provide support to those who self-harm. Interviews with nineteen
TikTok creators revealed that they felt TikTok's moderation lacked
contextual understanding, appeared random, was often inaccurate, and
exhibited bias against marginalized communities.

Algospeak is also used in communities promoting harmful behaviors.
Anti-vaccination Facebook groups began renaming themselves to "dance
party" or "dinner party" to avoid being flagged for misinformation.
Likewise, communities that encourage the eating disorder anorexia
nervosa have been employing algospeak. Euphemisms like "cheese pizza"
and "touch the ceiling" are used to refer to child pornography.

On TikTok, moderation decisions can result in consequences such as
account bans and deletion or delisting of videos from the main video
discovery page, called the "For You" page. In response, a TikTok
spokeswoman told 'The New York Times' that the users' fears are
misplaced, saying that many popular videos discuss sex-adjacent
topics.


                              Methods
======================================================================
Algospeak uses techniques akin to those used in Aesopian language to
conceal the intended meaning from automated content filters, while
being understandable to human readers. Similar forms of obfuscated
speech include Cockney rhyming slang and Polari, which were
historically used by London gangs and British gay men respectively.
Unlike those forms, however, the global reach of social media has
allowed the language to spread beyond local settings.

Techniques used in algospeak are extremely diverse. In orthography,
users may draw from leetspeak, where letters are replaced with
lookalike characters (e.g. $3X for 'sex'). Certain words or names may
be censored, or in the case of auditory media, cut off or bleeped,
e.g., 's*icide' instead of 'suicide'. Another method is
"pseudo-substitution", in which an item is censored in one form while
simultaneously present in another, as seen in videos. Some
may involve intersemiotic translation, where non-linguistic signs are
interpreted linguistically, in addition to further obfuscation. For
example, the corn emoji "🌽" signifies pornography by means of
porn→corn→🌽. Others may rely on phonological similarity or variation,
such as 'homophobic'→'hydrophobic', and 'sexy'→'seggsy' via
intervocalic voicing. On Chinese social media, users sometimes replace
sensitive terms with characters that differ only in tone. For example,
细颈瓶 (xì jǐng píng, literally "narrow-necked bottle") is used as a
stand-in for the name of General Secretary of the Chinese Communist
Party, Xí Jìnpíng.
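The substitution techniques above can be sketched in code. The mapping
below is an illustrative sample assembled from examples in this
article, not any platform's or community's actual lexicon:

```python
import re

# Illustrative algospeak substitutions, drawn from examples in this
# article; this mapping is a hypothetical sample, not a real lexicon.
SUBSTITUTIONS = {
    "sex": "seggs",               # phonological respelling
    "suicide": "s*icide",         # partial censoring
    "porn": "🌽",                 # intersemiotic: porn -> corn -> corn emoji
    "lesbian": "le dollar bean",  # respelling of the written form "Le$bian"
}

def to_algospeak(text: str) -> str:
    """Replace each sensitive term with its coded stand-in (case-insensitive)."""
    for plain, coded in SUBSTITUTIONS.items():
        # naive whole-word replacement; real usage is far more context-dependent
        text = re.sub(rf"\b{re.escape(plain)}\b", coded, text,
                      flags=re.IGNORECASE)
    return text

print(to_algospeak("This video discusses suicide prevention."))
# -> This video discusses s*icide prevention.
```

In practice such substitutions are applied by humans, inconsistently
and creatively, which is precisely what makes them hard for automated
filters to anticipate.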

In an interview study, most of the creators interviewed suspected
TikTok's automated moderation was scanning the audio as well, leading
them to also use algospeak terms in speech. Some also label sensitive
images with innocuous captions using algospeak, such as captioning a
scantily-dressed body as "fake body". The use of gestures and emojis
is common in algospeak, showing that it is not limited to written
communication. A notable example is the use of the watermelon emoji on
social media as a pro-Palestinian symbol in place of the Palestinian
flag in order to avoid censorship by Facebook and Instagram. Black
creators may simply present their light-colored palms to the camera to
stand in for white people, and flip them to stand in for black people.


                        Impact and detection
======================================================================
A 2022 poll showed that nearly a third of American social media users
reported using "emojis or alternative phrases" to subvert content
moderation.

Algospeak can lead to misunderstandings. A high-profile incident
occurred when American actress Julia Fox made a seemingly
unsympathetic comment on a TikTok post mentioning "mascara", not
knowing its obfuscated meaning of sexual assault. Fox later apologized
for her comment. In an interview study, creators shared that the
evolving nature of content moderation pressures them to constantly
innovate their use of algospeak, which makes them feel less authentic.

A 2024 study showed that GPT-4, a large language model, can often
identify and decipher algospeak, especially with example sentences.
Another study showed that sentiment analysis models often rate
negative comments incorporating simple letter-number substitutions and
extraneous hyphenation more positively.
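In simple cases, the letter-number substitutions and extraneous
hyphenation described in that study can be reversed by normalizing
text before analysis. This is a minimal sketch under an assumed
character map, not the cited study's actual method:

```python
import re

# Hypothetical map reversing common lookalike-character substitutions;
# an illustrative sample, not the mapping used in the cited study.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e",
                          "4": "a", "5": "s", "$": "s"})

def normalize(text: str) -> str:
    """Undo simple evasions before passing text to a sentiment model."""
    # remove hyphens inserted inside words, e.g. "h-4te" -> "h4te"
    text = re.sub(r"(?<=\w)-(?=\w)", "", text)
    # map lookalike characters back to letters, e.g. "h4te" -> "hate"
    return text.translate(LEET_MAP)

print(normalize("I h-4te this, it is aw-ful"))
# -> I hate this, it is awful
```

A normalization pass like this only catches mechanical substitutions;
semantic euphemisms such as "mascara" or "camping" require contextual
understanding that simple preprocessing cannot provide.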


                              Examples
======================================================================
According to 'The New York Times':

* '(to) unalive, unalived' - to kill; killed, dead.
* 'accountant' - sex worker
* 'cornucopia' - homophobia
* 'le dollar bean' - lesbian, as derived from the written form Le$bian
* 'leg booty' - the LGBTQ+ community
* 'nip nops' - nipples
* 'panini', 'panoramic' - a pandemic, especially the COVID-19 pandemic
* 'seggs' - sex

Other examples:


* 'acoustic' - autistic
* 'blink in lio' - link in bio
* 'camping' - abortion
* 'cheese pizza' - child pornography
* 'fork' - fuck
* 'grape' - rape
* 'Keep Yourself Safe' - backronym of 'KYS', i.e. 'kill yourself'
* 'music festival' - protest
* 'opposite of love' - hatred
* 'ouid' - weed
* 'Panda Express' - pandemic
* 'PDF file', 'PDF' - pedophile
* 'pew pew' - firearm
* 'regarded' - retarded
* 'sewer slide' - suicide
* 'shmex' - sex
* 'tism' - autism spectrum conditions
* 'yt' - White people, though 'yt' is also a common abbreviation for
YouTube


                              See also
======================================================================
* Dog whistle (politics) - Political messaging using coded language
* Euphemisms for Internet censorship in China
* Internet censorship


                           External links
======================================================================
* "[https://open.spotify.com/episode/67gUlcIdLrpSYj8uICV62U?go=1&sp_cid=6a617472a387d078d1676dd1c88c20e0
How Algorithms Are Changing the Way We Speak]", 'Creative Control',
a 'Fast Company' podcast episode on Spotify


License
=========
All content on Gopherpedia comes from Wikipedia, and is licensed under CC-BY-SA
License URL: http://creativecommons.org/licenses/by-sa/3.0/
Original Article: http://en.wikipedia.org/wiki/Algospeak