Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • Zaktor@sopuli.xyz

    Absolutely agree. My comment above was focused on whether some minimal amount of CSEM would itself make similar images happen when just prompting for porn, but there are a few mechanics that likely bias a model toward creating young-looking faces in porn, and with intentional prompt crafting I have no doubt you can at least get an approximation of it.

    I’m glad to hear about the models that are intentionally separating adult content from children. That’s a good idea. There’s not really much reason an adult-focused model needs to be mixed with much other data; there’s already so much porn out there. Maybe if you want to tune something unrelated to the naked parts (like the background), or you want some mundane activity but naked, but neither of those things needs kids in them.