In the last couple of years, adult content platforms have faced the spread of user-generated synthetic visual sex media created with generative AI algorithms, a type of content often controversially referred to as 'deepfake pornography.' In response, even before its release, OpenAI announced that its new text-to-video generation model, Sora, would not allow users to generate sexual content with the technology, in order to prevent abuse, with measures in place to 'reject text inputs requesting videos that contain … sexual content … as part of its safety measures and precautions' (OpenAI, February 2024). Likewise, back in December 2023, the streaming platform Twitch tightened its policies on broadcasts of illustrated sexual content because 'AI can be used to create realistic images' (Twitch, 2023).
But while hyper-realistic deepfake porn videos have received major news coverage for the scale and repercussions of their harms (and because of their identifiable famous victims), discussions of synthetic visual sex media overlook the use of deep-learning algorithms in the production of adult animation. While every artist must now contend with data-collection risks, since their work can be non-consensually used to train generative AI algorithms, sex media brings new concerns to this conversation. Specifically, these concerns relate to how platforms police, through moderation measures (what is acceptable, what is not), synthetic visual sex media that looks like non-AI animation.
This type of AI-generated sex media does not necessarily aim to present hyper-realistic depictions of living or fictional people, as illustrated by the popularity of AI 'anime' art on the aforementioned platforms. If the question of deepfakes has dominated the 'AI and sex' conversation, porn platforms now need to go beyond it. The use of AI to produce animated porn that is indistinguishable from non-AI animation means that porn platforms must now navigate this reality in their policies. The challenge is to avoid creating an 'AI porn' umbrella term based on the existing and emerging live-action frameworks. The spread of 'deepfake pornography,' however, cannot simply be reduced by platform moderators to a form of 'non-live-action porn.' For example, we do not have access to performers' biometric data, which means that their age or consent status cannot be verified. Another issue is that such a broad label could also rope in non-abusive computer-generated porn produced by animation workers: these artists often cannot provide such data for fictional characters, and their fanart has a long history of representing existing celebrities. Because platforms hosting adult animated content have to work around both hyper-realistic and non-realistic AI productions, their policies must take into consideration that both computer-generated animation and synthetic visual sex media exist on a spectrum of manufactured pornography.
Media governance practices imposed upon online porn content by legislators, policymakers, and moderators often inform policies later applied to the rest of the internet. Looking into how adult animation platforms articulate definitions of AI art therefore provides productive ground for thinking through future negotiations of generative art within the adult content industry at large. By their nature, these animation-based platforms cannot simply moderate along a dichotomy of hyper-realistic versus non-realistic productions, and so must produce more sophisticated policies for synthetic visual sex media.
Here, we can take the example of the platform Slushe. As a platform for sharing 2D and 3D adult art, Slushe allows content creators to upload pornographic animation, with the stated goal of being 'as unrestrictive as possible' (Slushe, Terms of Use). Like most porn platforms, Slushe does, however, draw the line at themes such as child/underage pornography and extreme violence/gore. Furthermore, Slushe does not allow content not created by the uploader, with the exception of uploads by artists' official representatives or collaborative partners, a measure that helps protect and credit individual artists. Building on the question of ownership, in March 2023, Slushe updated its policies to clarify the status of AI-generated art on the platform:
‘AI generated art is allowed on Slushe if the content complies with our standard content submission rules. Please note however that Slushe was originally intended to be a platform for artists who create original adult content, either from scratch (3D modeling, hand drawing, etc.) and by developing raw assets into an original character or scene with a unique style (using software like Daz Studio or Blender). While we recognize that AI tool users must customize parameters and judge the end result of the generator for artistic value, the uniqueness of the art is restricted by the parameters and source material of the AI tool. We wish to be respectful to the work 2D and 3D artists put into honing their skills and creating their art.’ (Slushe, Terms of Use)
This means that, to be posted on Slushe, AI-generated art (i) must be categorized and tagged as such by users at upload, (ii) cannot be featured on Slushe's Homepage, and (iii) cannot be featured as a Media Highlight in Slushe's official blog posts or social media posts. Slushe also forbids any artwork that depicts real people (in appearance, name, or any other identifiable attribute), whether AI-generated or not. By acknowledging that the labor involved in making creative sex productions is what should distinguish AI art from the rest, a platform like Slushe offers creators boundaries and tools that allow synthetic and non-synthetic visual sex media to co-exist.
Thinking through the multiple forms that computer-generated porn animation can take, from 3D modeling to non-realistic synthetic visual sex media, offers a useful starting point for mainstream porn platforms to articulate new AI policies that avoid unfairly targeting animated porn content creators.
This research is supported by a Doctoral Fellowship in AI and Inclusion from the University of Ottawa AI + Society Initiative.
Aurélie Petit is a PhD Candidate in the Film Studies department at Concordia University, Montréal. She specializes in the intersection of technology and animation, with a focus on gender and sexuality. Her thesis examines the role that U.S.-based Japanese animation online communities played in shaping contemporary sociotechnical uses of social media, in particular exclusionary practices towards women users. During the summer of 2023, she was a PhD Intern at Microsoft Research, where she worked on the limits of applying live-action governance frameworks to animated pornographic media. She is currently a Doctoral Fellow in AI and Inclusion at the AI + Society Initiative (University of Ottawa), collaborating with Professor Jason Millar and the CRAiEDL on the ethics of deepfake pornography.