jeffw@lemmy.world to News@lemmy.world · 6 months ago
“CSAM generated by AI is still CSAM,” DOJ says after rare arrest (arstechnica.com)
Grandwolf319@sh.itjust.works · 6 months ago
So it’s all good as long as they have elf ears, or does that count as realistic too?
ricecake@sh.itjust.works · 6 months ago
Two things:

First, please don’t generate child-like pornography. Legal or not, it’s disturbing and gross to even think about.

Second, yes, per the law it must be “virtually indistinguishable”:

“the term ‘indistinguishable’ used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.”

If it looks like a “real” elf and not a child wearing an elf costume, it would be fine. So long as an ordinary person would know that it’s not a real child being abused, or a real child being depicted (placing a real child’s face on a compromising photo), it’s protected, albeit extremely unpleasant, speech.