Teen boys use AI to make fake nudes of classmates, sparking police probe::Parents told the high school “believed” the deepfake nudes were deleted.

  • Otter@lemmy.ca · 1 year ago

    The other comment is right that this has been happening for a long time (with low-tech methods), and it’s also true that we can’t stop it completely. We can still respond to it:

    An immediate and easy focus would be on what they do with the images. Sharing them around is still harassment/bullying, and it should be dealt with the same way it is now.

    There’s also an education aspect to it. In the past, those images (magazines, photocopies, Photoshop) were limited in who saw them. Kids now are likely using free online tools that aren’t private or secure, and those images could stick around forever. So it could be good to highlight that:

    • Your friends and classmates may see them, and it may harm their lives. The images will likely stick around. Facial recognition algorithms are also improving, so it’s a legitimate concern that an image stored on a random site somewhere will be tied back to them.
    • The images can be traced back to the creator, and the creator can face repercussions for it (for those without empathy, this might be the better selling point).
    • Adalast@lemmy.world · 1 year ago

      To your first point, much to the benefit of humanity, and counter to popular belief, the internet is NOT forever. Between link rot, data purges, corporate buyouts, transmission compression losses, and general human stupidity, large swaths of the internet have vanished. Hell, just Macromedia selling out to Adobe ended up causing the loss of most of the popular internet games and videos for anyone now in their mid-to-late 30s (you will be missed, Flash). The odds of these specific AI-generated child porn pictures surviving even in some dark corner of the bright web are slim to none. And if they do end up surviving on the dark web, well, anyone who sees them will likely have a LOT of explaining to do.

      As for the commentary about the websites keeping the images: that is doubtful, beyond holding them in an account-bound locker for the user to retrieve. The sites don’t care, and too many images get generated every day for them to treat them as anything more than reinforcement training data.

      Speaking of reinforcement training, they may have been able to use Photoshop’s new generative fill to do this, but to actually generate fresh images of a specific peer they would have had to train a LoRA or Hypernetwork on photos of the girl so that Stable Diffusion could actually resolve her likeness. They weren’t doing that on an AI site, especially not a free one. They were probably using ComfyUI or Automatic1111 (I use both myself). Both are free, open-source, locally executed programs that let you use the aforementioned tools when generating. That means the images were confined to their local machine, then transferred to a cell phone and shared with friends.

      https://www.theatlantic.com/technology/archive/2021/06/the-internet-is-a-collective-hallucination/619320/

    • cy_narrator@discuss.tchncs.de · 1 year ago

      I think we should pressure the EU to require that any online AI photo-generation website also use AI to make sure that what was asked for is not illegal.
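
      As a rough illustration of what such a pre-generation check could look like, here is a minimal sketch. Everything in it is hypothetical (the `GenerationRequest` fields, `screen_prompt`, and the tiny keyword lists are made up for the example); a real service would rely on a trained moderation classifier and human review rather than keywords, but the gate would sit in the same place: the request is screened and rejected before any image is generated.

      ```python
      # Illustrative sketch of a pre-generation screening gate.
      # All names and rules here are hypothetical placeholders.
      from dataclasses import dataclass

      @dataclass
      class GenerationRequest:
          prompt: str
          uses_reference_photo: bool        # e.g. img2img / face swap of an uploaded photo
          subject_consent_verified: bool    # whether the depicted person's consent was verified

      # Toy policy rules; a real system would use a trained classifier, not keywords.
      SEXUAL_TERMS = {"nude", "naked", "nsfw"}
      MINOR_TERMS = {"child", "teen", "schoolgirl", "schoolboy"}

      def screen_prompt(req: GenerationRequest) -> tuple[bool, str]:
          """Return (allowed, reason), deciding before anything is generated."""
          words = set(req.prompt.lower().split())
          sexual = bool(words & SEXUAL_TERMS)
          minor = bool(words & MINOR_TERMS)

          if sexual and minor:
              return False, "sexual content involving minors is never allowed"
          if sexual and req.uses_reference_photo and not req.subject_consent_verified:
              return False, "sexual imagery of a real person requires verified consent"
          return True, "ok"

      if __name__ == "__main__":
          req = GenerationRequest("nude photo of a classmate",
                                  uses_reference_photo=True,
                                  subject_consent_verified=False)
          print(screen_prompt(req))
          # (False, 'sexual imagery of a real person requires verified consent')
      ```

      The point is only that the check runs on the request itself, so nothing illegal has to be generated before it can be caught.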