Paedophiles are using a popular new artificial intelligence (AI) platform to transform real photos of children into sexualised images, it has been revealed.
It has led to warnings to parents to be careful about the pictures of their children they’re posting online.
The images were found on the US AI image generator Midjourney, which, much like ChatGPT, uses prompts to deliver an output, although these usually consist of pictures rather than words.
The platform is used by millions and has churned out such realistic images that people across the world have been fooled by them, including users on Twitter.
An image of Pope Francis donning a huge white puffer jacket with a cross hanging from his neck sent social media users into a frenzy earlier this year.
There have also been fake images of Donald Trump’s arrest and ‘The Last Supper Selfie’ produced using the platform.
WHAT IS MIDJOURNEY AI? Midjourney is an online image generator which, much like ChatGPT, uses prompts to deliver an output. However, these outputs usually consist of pictures rather than words. Last year, the platform received a backlash when a computer-generated image won first place in a US art competition. The AI artwork, dubbed Théâtre D'opéra Spatial, was submitted by Jason Allen, who said he used Midjourney to make the stunning scenes that appear to combine medieval times with a futuristic world.
The program has recently released a new version of its software which has increased the photorealism of its images, only serving to increase its popularity further.
An investigation by The Times found that some Midjourney users are creating a large number of sexualised images of children, as well as women and celebrities.
Among these are explicit deepfake images of Jennifer Lawrence and Kim Kardashian.
Users go through the communication platform Discord to create prompts and then upload the resulting images on the Midjourney website in a public gallery.
The company says content should be ‘PG-13 and family friendly’, but has warned that because the technology is new it ‘does not always work as expected’.
Although virtual child sexual abuse images are not illegal in the US, in England and Wales content such as this – known as non-photographic imagery – is banned.
The NSPCC’s associate head of child safety online policy, Richard Collard, said: ‘It is completely unacceptable that Discord and Midjourney are actively facilitating the creation and hosting of degrading, abusive and sexualised depictions of children.
‘In some cases, this material would be illegal under UK law and by hosting child abuse content they are putting children at a very real risk of harm.’
He added: ‘It is incredibly distressing for parents and children to have their images stolen and adapted by offenders.
‘By only posting pictures to trusted contacts and managing their privacy settings, parents can lessen the risk of images being used in this way.
‘But ultimately tech companies must take responsibility for tackling the way their services are being used by offenders.’
In response to The Times’ findings, Midjourney said it would ban users who had broken its rules.
Its CEO and founder David Holz added: ‘Over the last few months, we’ve been working on a scalable AI moderator, which we began testing with the user base this last week.’
A spokesman for Discord told The Times: ‘Discord has a zero-tolerance policy for the promotion and sharing of non-consensual sexual materials, including sexual deepfakes and child sexual abuse material.’
Midjourney is churning out such realistic images that people are being fooled. An image of Pope Francis donning a huge white puffer jacket with a cross hanging from his neck sent social media users into a frenzy earlier this year
The AI was also used to show former US President Donald Trump being arrested in New York
It has generated images of historical figures taking a selfie during well-known events, such as The Last Supper
The discovery comes amid growing concern about paedophiles exploiting virtual reality environments.
Earlier this year an NSPCC investigation revealed for the first time how platforms such as the metaverse are being used to abuse children.
Data showed that UK police forces had recorded eight examples where virtual reality (VR) spaces were used for child sexual abuse image crimes.
The metaverse, which is being primarily driven by Meta’s Mark Zuckerberg, is a set of virtual spaces where you can game, work and communicate with others who aren’t in the same physical space as you.
The Facebook founder has been a leading voice on the concept, which is seen as the future of the internet and would blur the lines between the physical and digital.
West Midlands police recorded five instances of metaverse abuse and Warwickshire one, while Surrey police recorded two crimes, including one that involved Meta’s Oculus headset, which is now called the Quest.
The NSPCC said the data showed there was ‘no doubt’ that paedophiles were using the metaverse to target children.
The organisation has now called on the government to bring in new laws to regulate the metaverse and keep children safe.