OP 04 November, 2024 - 04:10 PM
A Briton who used artificial intelligence to create child abuse images was jailed for 18 years on Monday.
The court sentenced Hugh Nelson, 27, after he admitted a string of sex offences, including making and distributing indecent images of children and distributing indecent pseudo-photographs of children. He also admitted encouraging the rape of a child.
Nelson took orders from people in online chat rooms for custom-made explicit images of children being harmed both sexually and physically.
Police in Manchester, in northern England, said he used software from US company Daz 3D, which has an AI function, to create the images, which he sold to online buyers and gave away for free. The force said it was a precedent-setting case for its online child abuse investigation team.
The company said the licence agreement for its 3D rendering software Daz Studio prohibited it from being used to create images that "breach child pornography or child sexual exploitation laws or are otherwise harmful to minors".
"We condemn the misuse of any software, including ours, for such purposes and we are committed to continually improving our ability to prevent it," Daz 3D said in a statement, adding that its policy was to cooperate with law enforcement where necessary.
Bolton Crown Court, near Manchester, heard that Nelson, who has a master's degree in graphics, also used images of real children for some of his computer work.
Judge Martin Walsh said it was impossible to determine whether any child had been sexually abused as a result of his images, but that Nelson had intended to encourage others to rape a child and had no idea how his images would be used once distributed.
Nelson, who has no criminal record, was arrested last year. He told police he found like-minded people online and eventually began creating images to sell.
Prosecutor Jeanette Smith said outside court that it was deeply disturbing that Nelson could take ordinary photographs of children and, using AI tools and computer software, transform them into the most depraved images to sell and distribute online.
Prosecutors said the case arose from an investigation into AI and child sexual exploitation, while police said it tested current legislation because Nelson's use of computer software is so new that it is not specifically addressed in UK law.
The case is reminiscent of similar efforts by U.S. law enforcement to combat the alarming spread of child sexual abuse images created using artificial intelligence technology — from manipulated photos of real children to graphic images of virtual children. The Justice Department recently filed what is believed to be the first federal case involving solely AI-generated images — meaning the children depicted are not real but virtual.