

"As we find things on the internet by following links from one place to another, language spreads and disseminates through our conversations and interactions." - Gretchen McCulloch, Because Internet: Understanding the New Rules of Language, Riverhead Books, 2019

"At TikTok, we prioritize safety, diversity, inclusion, and authenticity. We encourage creators to celebrate what makes them unique and viewers to engage with what inspires them, and we believe that a safe environment helps everyone do so openly." - TikTok Community Guidelines, as of February 2022

Spend enough time on TikTok and you'll start to notice patterns in the captions and video text you scroll past: phrases like "FAKE BODY," tw: d*ath, "FAKE BODY DON'T DELETE !!," unalive myself, PROP !! DON'T REPORT ME, FAKE BLOOD NOT REAL, tw: dr*gs, and FAKE KN*FE embedded within a smattering of emojis, hashtags, and tactically placed asterisks. The videos might be clickbait-y thirst traps, cinematic cosplay, viral dances, or more mundane content like clothing try-on hauls, room tours, and wild stories recounted to the camera. There's nothing fake about what's being shown on camera (with the exception of some very realistic cosplay weapons): creators' bodies might be edited or filtered, but they're still bodies. They are human bodies performing something for the camera, yet their appearances, actions, and accessories are labeled artificial and unreal to assure some unseen third spectator. That third spectator, sitting between the content on the app and the audience who watches it, is TikTok itself. Or rather, TikTok's automated, AI-driven content moderation system. This ecosystem of semi-censored language that renders content unreal and bodies alienated is a relatively new phenomenon, one that has sprung up in direct response to an ever-changing algorithmic influence which continues to shape how users interact with the app. To understand how we got here, it's important to look at how TikTok's content moderation has evolved in tandem with the app's meteoric rise in popularity over the past two years.

In March 2020, The Intercept published a series of leaked internal documents from 2019 that showed instructions for moderators to identify undesirable content from users who appeared poor, had visible disabilities, or whose bodies had "ugly" or "abnormal" shapes. Since then, anecdotal experiences with 'shadowbanning' have persisted, and accusations of discriminatory content suppression continue to be made against the app by non-white, queer, plus-size, and disabled creators. In July 2021, TikTok announced that it would begin using automated content moderation in a greater capacity to remove content identified as violating its Community Guidelines, particularly around sexual content, minor safety, violent or graphic content, and illegal activities. After this, creators began noticing videos getting taken down or suppressed for relatively innocuous activities 'misread' by the algorithm. Videos with obviously fake blood and other stage props, videos where a user lights something like a candle with a lighter, and videos where TikTokers speak openly (or joke) about experiences with mental health, death, or suicide were taken down for being graphic, inciting violence, or (as continued to be the case for fashion creators who didn't fit the skinny, white, cisgender norm) promoting sexual content. As Safiya Noble notes in her 2018 book Algorithms of Oppression: How Search Engines Reinforce Racism, "algorithmic oppression is not just a glitch in the system but, rather, is fundamental to the operating system of the web," where IRL socioeconomic and political power structures of inequality are digitally reinforced. On TikTok, videos are uploaded to the platform in formats that are then 'read' by its artificial intelligence systems, which, using the information provided by the user, learn more about content creators and their audiences by identifying textual and visual markers that may denote personally revealing demographics, and feed this data into the platform's recommendation systems. Artist Trevor Paglen recently sought to define this state of automated looking that everyday people are now enveloped by.
