- cross-posted to:
- technology@lemmy.world
shared via https://feddit.de/post/2805371
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
deleted by creator
deleted by creator
deleted by creator
The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.
Exactly. In another thread on here recently someone said something that basically boiled down to “your protest against AI isn’t going to stop it. There’s too much corporate power behind it. So you might as well embrace it” and I just cannot get my head around that mentality.
Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and you could clearly see in some of the pics the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It’s creepy as hell.
Creepy isn’t illegal. Never has been.
I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we’re going to tolerate it.
The real issue here is what things like deepfakes can do. It’s already starting, and it’s going to keep accelerating, generating mis- and disinformation: for private citizens, celebs, and politicians. While you might say “it’s creepy, but there’s nothing we can do about people deepfaking Nancy Pelosi’s face onto their spank material,” it’s extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. Those are the real stakes here. How we react to what’s happening with regular folk and celebs is just the canary in the coal mine.
deleted by creator
I worry that the cat is out of the bag on this. The tech for this stuff is out there, and you can run it on your home computer, so barring some sort of massive governmental overreach I don’t see a way to stop it.
They can’t even stop piracy and there’s the full weight of the US copyright industry behind it. How are they going to stop this tech?
The point isn’t that it’s too late, it’s that the hype is overblown. Go to the website this article mentions and follow the instructions (explore, turn on NSFW, type name and nude) and you will VERY QUICKLY realize the technology is just shit.
You can see some resemblances and sometimes one close one, but like another poster said they just look like the same shitty fan fiction we’ve had since Photoshop came about.
Also, this could end porn blackmail once no one can tell if it’s real or not. People will start judging the person who is supplying the material rather than the person in it.
If it ends porn blackmail, it also ends photographic evidence. I think that’s significantly worse.
And sure the tech is bad if you just type a name directly into a model, but if you take the time to refine it it gets pretty good, and it’s only going to get better over time. It’s time to start thinking about a future where this tech exists.
deleted by creator
Everybody gets horny, idiot.
Please don’t call people idiots needlessly.
Does it matter if someone jerks off to J.Lo in the Fappening or some random AI-generated BS?
The issue is that this technology can be used to create pornographic material of anyone that has some level of realism without their consent. For creators and the average person, this is incredibly harmful. I don’t want porn of myself to be made and neither do a lot of creators online.
Not only are these images an affront to people’s dignity, but it can also be incredibly harmful for someone to see porn of themselves that they never made, with their face on someone else’s body.
This is a matter of human decency and consent. It is not negotiable.
As mentioned by @ram@lemmy.ca, this can also be used for other harmful things like CSAM which is genuinely terrifying.
I have to disagree (but won’t downvote!)
AI porn is creepy. In multiple ways!
But it’s also a natural evolution of what we’ve been doing as a species since before we were a species.
Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.
How about if somebody draws a crude stick figure of somebody they met on the street? Unless you’re Randall Munroe, this is probably harmless too.
Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.
Maybe a digital artist finds a few social media pictures of a person and decides to test-drive Krita and manipulates them into appearing nude.
Or, and this happened to me quite recently, you find your porn doppelganger. My spouse found mine and it ruined her alone time. And they really did look just like me! Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?
Like you, I don’t want people like this in my life. But it feels like this is one of those slippery slopes that turns out to be an actual slippery slope.
You can’t make it illegal without some serious downstream effects.
If you did, the servers would just get hosted in an Eastern European country that is happy to lulwat at American warrants.
I don’t have any answers, just more Devil’s advocate-esque questions. If there was a way to make it illegal without any collateral damage, I’d be proudly behind you leading the charge. I just can’t imagine a situation where it wouldn’t get abused, a’la the DMCA.
I’m upvoting just because I’m now wondering what xkcd fanservice would look like.
Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.
You can’t share that though so while I still think it is immoral, it is also kind of impossible to know.
Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.
Maybe a digital artist finds a few social media pictures of a person and decides to test-drive Krita and manipulates them into appearing nude.
Those would be immoral and reprehensible. The law already protects against such cases on the basis of using someone’s likeness.
It’s harmful because it shares images of someone doing things they would never do. It’s not caricature, it’s simply a fabrication. It doesn’t provide criticism - it is simply erotic.
Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?
If the goal is to make them look like you, I would imagine the law could address it. Otherwise, it is simply coincidence. There’s no intent there.
I don’t think it is a stretch or slippery slope. Just as a picture is captured by a camera, a drawing is captured by a person or a machine.
Both should be the same and it is often already the case in many jurisdictions around the world when it comes to CSAM.
All of your arguments assume profit is the motive. Are you saying as long as no profit is made that it would be okay to do all of these things? (Ex. Self use only)
No. I think that it would still be bad if it were self-use because it is ultimately doing something that someone doesn’t consent to.
If you were to use this on yourself or someone consenting, I see no issues there - be kinky all you want.
Consent is the core foundation for me.
The reason why imagining someone is different is that it is often less intentional - thoughts are not actions.
Drawing someone to be similar to someone you know is very intentional. Even worse, there is a high chance that if you are drawing someone you know naked, you never asked for their consent because you know you wouldn’t get it.
How is ai pedophile stuff worse than actual pedophile stuff?
How is ai pedophile stuff worse than actual pedophile stuff?
It’s not worse - it’s just as bad.
I’m sorry, in one of these scenarios children are not being raped. How in the name of FUCK is it just as bad?
That person just can’t grapple with any nuance, as they are afraid to let the sentence “AI child porn is less bad” come out of their mouths.
I don’t like grading evil for this very reason, so thank you for catching me doing that. I will refrain from doing it.
That said, AI CSAM could enable other forms of abuse through blackmail. I can also see very harmful things happening to a child or teenager because people may share this material in a targeted way.
I think both are inhumane and disgusting.
I mean maybe calling it evil is part of the problem ?
There are degrees in everything. Punching somebody is less bad than killing somebody.
The number of victims matters.
Btw its totally humane because we invented the shit.
I mean maybe calling it evil is part of the problem ?
I call it evil because it is intentional and premeditated.
There are degrees in everything. Punching somebody is less bad than killing somebody.
Trying to put everything on degrees is bound to show ignorance and imply that certain things are more acceptable than others.
I don’t want to hurt people with my ignorance and I do not want to tell someone that what they experienced is less bad than something else. They are bad and we’ll leave it at that.
Btw its totally humane because we invented the shit.
I am working with this definition : “Characterized by kindness, mercy, or compassion”. There is a difference between human-made and humane.
You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn’t consent to, that’s a huge problem.
Both for the people whose images were used to train the model and for the people whose images are generated using the models.
Non-consent is non-consent.
This is how you get the feds involved.
Let’s not forget that these AI aren’t limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.
Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn’t happen, meaning those resources won’t be used to save real children in actual danger.
On the other hand, this could be used to create material that did not need new suffering. So it might reduce the need for actual children to be abused for the production of it.
Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.
Sure they do, but if they have to consume would you rather a real child had to suffer for that or just an Ai generated one?
Neither. I would have mental health supports that are accessible to them.
Of course we don’t want either, but it comes across as if you’re dismissing a possible direction toward a solution, in favor of the one that is definitely worse (real-life suffering), out of a purely emotional knee-jerk reaction.
Mental health support is available and real CSAM is still being generated. I’d suggest we look into both options; advancing ways therapists can help and perhaps at least have an open discussion about these sensitive solutions that might feel counter-intuitive at first.
It’s (rightfully) currently illegal, but that doesn’t stop people. Keep it illegal, increase punishment drastically, make AI-created material a grey area.
It’s already the worst crime around and people still do it. Maybe it’s not the punishment we need to focus on.
I’m not sure increasing punishment is actually an effective manner of combating this. The social implications of being a child predator are likely to have a more deterrent effect than the penal system imo (I don’t have data to back that).
I, personally, am an advocate for making treatment for pedophiles freely, easily, and safely accessible. I’d much rather help people be productive, non-violent members of society than lock them up, if given a choice.
That’s a fair point. And I believe AI would be able to combine legal material to create illegal material. Although this still feels wrong, if it excludes suffering in the base material and reduces future (child) suffering, I’d say we should at least do research on it. Even if it’s controversial, we need to look at the rationale behind it.
As someone who personally wouldn’t care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn’t like me may have the option to generate AI porn of me having sex with a child. Now there’s fake “proof” I’m a pedophile, and I get my life ruined for sex I never had, for a violation of consent I never actually committed. Even if I’m vindicated in court, I might still be convicted in the court of public opinion.

And people could post faked porn of me and send it to companies to try to say “Evergreen5970 is promiscuous, don’t hire them.” Not all of us have the luxury of being able to pick and choose between companies depending on whether they match our values; some of us have to take what we can get, and sometimes that would include companies that would judge you for taking nude photos of yourself. It would feel especially bad given I’m a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn’t do.
Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.
And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans get fooled by whether an AI-generated image was taken by a human or generated by AI, and AI doesn’t detect AI-generated images with a perfect accuracy rate. So the question becomes “how can we trust any image anymore?” Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might try to tweak algorithms to prevent people from generating any porn involving minors, but there’ll probably always be some floating around with those guardrails turned off.
I’m also very wary of dismissing other peoples’ discomfort just because I don’t share it. I’m still worried for people who would care about someone making AI porn of them even if it was just to masturbate with and kept private.
deleted by creator
This doesn’t even feel like an article - more like one long advertisement. The second paragraph of the article launches into a review of the “Erect Horse Penis - Concept LoRA”
Which I had not heard of before, so it was even insightful.
The tech isn’t there yet. There are so often distracting flaws around the hands/feet. The AI doesn’t really know what a human is; it’s just endlessly recombining existing material.
As much as I loathe having to reveal this to you, the shapeliness of the hands should be semi-negligible to most people who would love to have an image created from the statement “I want to see Billie Eilish’s boobs”.
Agree that was a strange take. Can you usually tell it’s AI/fake? Yes. Is it still achieving the goal of the creator/user? Yes.
You underestimate how important the feet are to many.
I’m not into feet specifically, but when I ask for “Veronica Mars in a string bikini” I don’t want to get “Veronica Mars with unattached toes.” It’s distracting AF.
Doesn’t happen with real models, or even human-made hentai.
deleted by creator
I’d be surprised if in a year or two AI still struggles with hands and feet
Rob Liefeld could barely draw hands/feet and managed a successful career as a comic book artist.
It already doesn’t if you take the time to use tools like LoRAs, ControlNet, and inpainting to guide the output.
So about the same amount of work as Photoshopping a celeb’s head onto a naked body.
Got it. I’m terrified of all the poorly made fanfic. Even perfectly made fanfic will never have the effect of a real photo or the real thing.
Just because the AI produces a model with Billie Eilish’s face and a naked body does not mean you’ve seen her nude for real.
It’s exactly the same as drawing a naked lady and then drawing Billie Eilish’s face on it.
If that really gets you off and really violates her autonomy in some way, I’d be interested to hear how. It’s not currently illegal to draw real people in fictional scenarios, is it?
Implicit in this statement is that people who’re inclined to generate a visual simulacrum of a real person for fantasy purposes actually care if it’s real. By definition, it’s not. If “real” was an issue to them, they probably wouldn’t bother with it.
Key word is yet.
Yeah some body parts are a little weird today, but what about tomorrow, next week, next month, next year?
I really haven’t given this much attention, but the last time I did, maybe 6-8 months ago, most of the photos had hands that were the stuff of nightmares. Looking again today, at least from a quick 10 minutes of browsing, they have improved significantly. They are still far from perfect, but a handful are very good, most are passable, and a few are still nightmare fuel.
Just discovered 404media and am looking forward to how they perform in the space.
From a cursory scan, I did appreciate their deep dive and investigative approach in the article. I’ll be looking for their articles in the future.
Furry transformation and ass expansion, can you do that for me HAL69000?
Actually, make him turn into a donkey girl before expanding dat butt, so we can make ass jokes.