
AI for child pornography: can it be morally acceptable?
The website of 28-year-old Rasmus* from Denmark contains thousands and thousands of images of children, mainly girls. Yet, there may not be any victims, because Rasmus produced all the images himself using artificial intelligence (AI). An international case that has also led investigators to Switzerland.
Rasmus* looks at the pictures on his computer with satisfaction. There are many, now well over 300,000. It took practice, but Rasmus has mastered the AI program, which is largely free to use, so well that no one can tell whether the photos of girls and boys in provocative poses are real or not. A win-win situation for him: there are no victims, and yet he can offer thousands of images for sale.
In June 2024, the annual meeting of paedophile crime experts is held at Europol’s headquarters in The Hague. The UK and Denmark jointly present the case of the young Dane who distributes adverts for explicit pornographic content via TikTok, YouTube, X, Discord and his own website. Most of the publicly viewable content is not illegal: although the girls in the photos look young, the material is not explicitly pornographic. For five euros a month, though, Rasmus offers premium content, up to a thousand images per month. The UK National Crime Agency, which came across the site during undercover investigations, pays for a subscription. The exclusive content turns out to be explicit child pornography. Thanks to the purchase, the police are able to identify the perpetrator: Rasmus is arrested at his home in Denmark.
An international case that also leads investigators to Switzerland
The police seize over 300,000 images, all AI-generated; around 30,000 of them constitute child pornography. The customers, almost 300 of them, live in over 30 countries worldwide. In autumn, Europol notifies fedpol that three of them live in Switzerland, and fedpol in turn informs the cantonal police concerned. The first house search takes place in the canton of Basel-Landschaft, followed by Lucerne and the city of Zurich. One of those arrested claims that he did not know that possessing AI-generated child pornography was a criminal offence.
Are the perpetrators really exploiting a loophole in the law? Not according to the Swiss Criminal Code: Article 197 clearly states that producing and possessing such pornographic material is a criminal offence even when no real people are depicted. Whether it is a drawing, a photo or an AI-generated image makes no difference: in every case it is a criminal offence.
AI distracts from the real victims
AI-generated child pornography poses major challenges for the police. The amount of material has been growing exponentially for years, and AI programs are becoming ever more accessible and user-friendly. Even if no one is exploited to produce AI-generated child pornography, it distracts from the genuine victims of sexual abuse and makes police work more difficult: many images are still real, and distinguishing real from AI-generated images is becoming increasingly complex and time-consuming. Behind every real picture is a victim, an abused child who needs to be found. This makes cooperation and the rapid exchange of information all the more important, whether between fedpol and international partners such as Europol or at national level. Because the victims have top priority – at all times.
‘In the case of child pornography, identifying the victims is particularly important, because the victims usually lead us to other perpetrators. The circle gets wider and wider. With AI-generated material, however, we often don’t even know whether there is a victim at all. That makes our work more difficult.’
Marcel, federal investigator
* Name changed