Why not also go after the software companies for allowing such images to be generated in the first place? i.e., allowing AI-generated images of naked bodies to be produced from uploaded photos of real people.
How? How could you make an algorithm that correctly identifies what nude bodies look like? Tumblr couldn’t even differentiate between nudes and sand dunes back when it enforced its new policy.