• hikaru755@feddit.de · 6 months ago

    There are legit, non-CSAM types of images that would still make these changes apparent, though. Not every picture of a naked child is CSAM. Family photos from the beach, photos in biology textbooks, even comic-style illustrated children’s books will allow inferences about what real humans look like. So no, I don’t think that an image generation model has to be trained on any CSAM in order to be able to produce convincing CSAM.

    • xmunk@sh.itjust.works · 6 months ago

      This is a fair point - if we allow a model to be trained on non-sexualizing images of minor nudity, it could likely sexualize those depictions without actually requiring sexualized images of minors to do so. I’m still not certain that’s a good thing, but I do agree with you.

      • hikaru755@feddit.de · 6 months ago

        Yeah, it certainly still feels icky, especially since a lot of that material will, in all likelihood, have ended up in the model without the original photo subjects knowing about it or consenting. But that’s at least much better than having a model straight up trained on CSAM, and at least hypothetically, there is a way to make this process entirely “clean”.