Artificial-intelligence-generated, sexually explicit images of music sensation Taylor Swift made their rounds on X, formerly known as Twitter, last week before they were removed from the platform — as much as an image can be “removed” from the internet, that is. The resulting outrage called out the ethical reprehensibility of deepfake porn and raised questions about why the images became so popular.
Deepfake technology effectively maps any person’s face onto a model given to the artificial intelligence. It has its roots in the late 1990s, but only really entered the public eye in the late 2010s, courtesy of — who could’ve guessed it — Reddit pornography. The subreddit r/deepfakes, which amassed 90,000 subscribers before getting banned, allowed users to share pornographic videos with popular celebrities’ faces pasted on them.
Needless to say, deepfakes didn’t have a very positive introduction to the general public. But a few years later, they came back on the emerging platform TikTok.
By early 2021, the youth of America had spent a year inside and yearned for new media to consume. Right around when the people of the internet lost their collective sanity, deepfaked videos reared their ugly heads once more. This time, they felt much sillier, featuring performances of Presidents Barack Obama, Donald Trump and Abraham Lincoln singing “Dragostea Din Tei” by O-Zone. The internet welcomed them with open arms.
So why do we consider these 2021-era TikTok deepfakes morally okay? The simple reason is that they are so obviously fake. Obama’s teeth visibly enlarge when he opens his mouth, the flag behind Trump moves with his head and Lincoln famously preferred O-Zone’s work in the ‘90s to their 21st century releases. The idea that any of them are actually singing is laughable.
We often justify using celebrities’ likenesses in AI-generated videos or voice dubs because no reasonable person would believe they’re real. But in the case of deepfake pornography, this means of justification becomes the reason its consumers love it so much.
Nobody would ever believe the images of Taylor Swift were real — AI technology fortunately still lacks the ability to create a truly realistic depiction of a naked body. The people who circulate and drool over these pictures actually love that they’re fake because it chips away even further at the little agency women have over their bodies.
Nobody treats the subjects of porn worse than the people who consume it most. If you want a man to quit porn, you’ll have to tell him that he might get erectile dysfunction, because concerns over the mistreatment of women will fall on deaf ears. Studio pornography objectifies women and fetishizes sexual violence — a porn addict might even like it more if you tell him about how the industry operates.
AI-generated porn works the same way. The people who create, share and save these photos don’t like them just because they’re of Taylor Swift — they like them because Swift can’t control them. The “fakeness” of the images — the quality that typically calms our nerves over AI — has instead become a visible indicator that the person depicted in the photos did not consent to being in them, which is more appealing to their audience than if the images were real.
Deepfake pornography has ruined lives for years and will continue to do so until legislation is brought forth to oversee and limit its usage, but laws alone won’t change the bigoted culture that surrounds the consumption of porn. The only way to stop a repeat of last week — and of the last decade for countless women — is to call out the too-many men who fetishize the theft of agency from the women they want to sleep with.