This may be controversial, but I don’t care what kind of AI-generated images people create as long as it’s obvious they’re not reality. Where I worry is the creation of believable false narratives, from explicit deepfakes of real people to completely fictional newsworthy events.
I’ve read that pedophiles are more likely to act on their urges if they have access to real images. I would guess this also applies to AI-generated images, even if they don’t look 100% real, but I could be wrong on that. Whatever stops them from abusing kids is what I’m for.
I want to say research on the subject has been inconclusive overall. I’d certainly update my view given convincing evidence that fictional images lead to abuse of real children.
Of course, none of that has anything to do with the non-explicit video linked elsewhere in this thread of an adult woman using the toilet.
I agree here. I’m not worried about imaginary things except insofar as they can pass for real ones and muddy the truth.
It’s not really CSAM if there is no abuse happening, is it?