• Madison420@lemmy.world
      6 months ago

Real images that don’t have to be CSAM but can simply be of children; in theory it could train on legal sexual content and let the AI connect the dots.

    • jeremyparker@programming.dev
      6 months ago

This isn’t true. AI can generate tan people if you show it the color tan and a pale person – or green people, or purple people. That’s all AI does, whether it’s image or text generation – it can create things it hasn’t seen by smooshing together things it has seen.

And reality bears this out: AI CAN generate CSAM, even though it’s trained on that huge image database, which is constantly scanned for illegal content.