“I am being sold against my will. I did not consent to being sexualized.” The statement, hard and emphatic, does not come from a victim of trafficking, of revenge porn, or of a hack involving the theft of intimate photos. Its author is Sweet Anita, a young English streamer with 1.9 million followers on Twitch. What she denounced in February, during an interview with NBC, is the practice that has placed her in a new category of victim: people who, thanks to AI, find themselves “starring” in pornographic content without their consent.
Anita’s is not a unique case. On the contrary: it shows how deepfakes are increasingly being used to unleash the darkest fetishes on the Internet, targeting famous singers, actresses and streamers as well as anonymous people.
A celebrity-only problem? Deepfake porn is nothing new. The problem dates back several years, and celebrities such as Scarlett Johansson, Emma Watson and Gal Gadot have been dealing with it for some time. The Black Widow actress has even come to regard “trying to protect herself from the depravity of the Internet” as a “lost cause”. As AI has evolved, offering more sophisticated and accessible tools capable of generating more realistic pieces, deepfakes have ceased to be a threat exclusive to famous singers and Hollywood stars.
The experience of the streamers. The effect of deepfakes has also been suffered by streamers such as Sweet Anita or QTCinderella, known for their content on video games and baking, but who ended up “starring” in sex videos generated with the help of artificial intelligence.
“Seeing yourself naked against your will and broadcast on the Internet feels like a violation,” she confessed, joining the complaints of other well-known women, such as Helen Mort. Deepfakes have also fueled a heated controversy in the United Kingdom after another streamer admitted to having paid for content on a page that showed videos of several colleagues.
Increasingly simple and accessible. That is the key, as pointed out by NBC News, which carried out an investigation to verify how easy it is to access deepfakes. And “easy” applies both to the porn videos themselves, which can be found with a simple Google search, and to the creators behind them, who use Discord to advertise their material for sale or to offer personalized creations.
As advances in AI have made the technology easier and more accessible to use, generating sexual deepfakes has also become more lucrative. NBC, for example, located a person who offered on Discord to create five-minute deepfakes of a “personal girl”: anyone with fewer than two million followers on Instagram. The fee: $65. “There are more and more people in the crosshairs. We will hear of many more victims who are ordinary people,” lawyer Noelle Martin told the network.
Paying by card… and subscriptions. The problem of deepfakes is serious enough to have spawned an “economy” of its own, with specialized websites where short videos are shared as a hook so that users who want to see the extended versions pay via subscription, with Visa or Mastercard, or with cryptocurrencies. Beyond the websites, there are creators who agree to prepare videos featuring a “personal girl”.
After NBC inquired about a deepfake author’s chat room, Discord took down the server for violating its content rules. The company says it expressly prohibits “the promotion or sharing of non-consensual deepfakes”. On Change.org there is also a campaign, with 52,600 signatures so far, to shut down one of the most popular pages, MrDeepFakes, created in 2018, along with the rest of the “websites dedicated to image-based sexual abuse”.
Twitch streamer Atrioc issued an apology after accidentally revealing that he watched deepfake videos of streamers Pokimane and Maya pic.twitter.com/oEjHnGS3Ax
— Dexerto (@Dexerto) January 30, 2023
Some figures for context. In the summer of 2019, Deeptrace analyzed how the volume of deepfakes circulating online had evolved. Its conclusion: there were twice as many as at the end of 2018. The most surprising thing was not that “boom”, however, but its engine: the overwhelming majority of the videos, 96%, were not political pieces but pornographic ones, many of them “starring” famous actresses or singers.
It is not the only striking figure on the table. An investigation by Genevieve Oh shows that the volume of videos uploaded last year to a popular deepfake page was nearly seven times that of 2018, rising from 1,900 to more than 13,000, which helped it exceed 16 million monthly visits. The growing use of doctored videos has already caught the attention of lawmakers.
AI and porn, beyond deepfakes. The relationship between artificial intelligence and porn goes beyond deepfakes. One of the best examples is Unstable Diffusion, a forum built around AI systems designed to generate adult visual content. Or, as it defines itself on Discord, “a server dedicated to the creation and sharing of AI-generated NSFW [not safe for work]” content, hosting categories that range from porn aimed at men or women to hentai, furry or BDSM pieces, among others.
One of the peculiarities of Unstable Diffusion is that it has shown the potential business this type of AI service can represent. By the end of 2022, when Kickstarter decided to shut down its crowdfunding campaign, it had raised more than $56,000 from 867 backers.
In Xataka: DeepNude: the controversial application that “undresses” any woman using artificial intelligence