One thing to consider: if this turned out to be accepted, it would make it much harder to prosecute actual CSAM, because people could claim “AI generated” for real images.
I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren’t equal. In a binary way they are, but with more precision they’re pretty far apart. But I’m not arguing against it, I’m just not super clear how I feel about it yet.
So long as the generation is done without using actual minors as model examples, there’s nothing technically illegal about having sexual material of what appears to be a child. You would then have a mens rea question and a content question: what actually defines, in a visual sense, a child? Could those same features equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since by all appearances, out of context, she looks to be a child?
The problem is that the only way to train an AI model is on real images, so the model can’t exist without crimes having been committed and suffering caused.
Real images, yes, but they don’t have to be CSAM, just ordinary images of children. You could theoretically train the sexual part on legal adult content and let the AI connect the dots.
This isn’t true. AI can generate tan people if you show it the color tan and a pale person – or green people, or purple people. That’s all AI does, whether it’s image or text generation – it can create things it hasn’t seen by smooshing together things it has seen.
And this is borne out in reality: AI CAN generate CSAM even though it’s trained on that huge image database, which is constantly scanned for illegal content.
It is illegal in Canada to have sexual depictions of a child, whether it’s a real image or you’ve just sat down and drawn it yourself. The rationale is that the behavior escalates, and looking at images leads to wanting more.
It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer, which I feel great about. There’s no legitimate reason to have a sexualized image of a child, whether it’s computer generated, hand drawn, or whatever.
This article isn’t about Canada, homeboy.
Also, that theory is not provable and never will be. Morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.
Similarly, you didn’t actually offer a counterpoint to any of my points.
I’m a professional artist and have no issue banning AI-generated CSAM. People can call it self-expression if they want, but that doesn’t change the real-world consequences of it.
Allowing AI-generated CSAM basically creates camouflage for real CSAM. As AI gets more advanced, it will become harder to tell the difference. The scum making real CSAM will be emboldened to make even more, because they can hide it amongst the increasing amounts of AI-generated versions, or simply tag it as AI generated. Now authorities will have to sift through all of it, trying to decipher what’s artificial and what isn’t.
Identifying, tracing, and convicting child abusers will become even more difficult as more and more of that material is generated and uploaded to various sites with real CSAM mixed in.
Even with hyper-realistic paintings, you can still tell it’s a painting. Anime loli stuff can never be mistaken for real CSAM. Do I find that sort of art distasteful? Yep. But it’s not creating an environment where real abusers can distribute CSAM and have a higher chance of getting away with it.
I guess my question is, why would anyone continue to “consume” – or create – real CSAM? If fake and real are both illegal, but one involves minimal risk and zero children, the only reason to create real CSAM is for the cruelty – and while I’m sure there’s a market for that, it’s got to be a much smaller market. My guess is the vast majority of “consumers” of this content would opt for the fake stuff if it took some of the risk off the table.
I can’t imagine a world where we didn’t ban AI-generated CSAM. Like, imagine being a politician and explaining that policy to your constituents. It’s just not happening. And I get the core point of that kind of legislation – the whole concept of CSAM needs the aura of prosecution to keep it from being normalized, and normalization would embolden worse crimes. But imagine if AI made real CSAM too much trouble to produce.
AI-generated CSAM could put real CSAM out of business. If possession of fake CSAM had a lesser penalty than the real thing, the real stuff would be much harder to share, much less monetize. I don’t think we have the data to confirm this, but my guess is that most pedophiles aren’t sociopaths and recognize their desires are wrong, and if you gave them a way to deal with it that didn’t actually hurt chicken, that would be huge. And you could seriously throw the book at anyone still going after the real thing when AI content exists.
Obviously that was supposed to be children not chicken but my phone preferred chicken and I’m leaving it.
I try to think about it this way. Simulated rape porn exists, and yet terrible people still upload actual recordings of rapes to porn sites. And despite the copious amounts of the fake stuff available all over the internet… rape statistics haven’t gone down and there are still sexual assaults happening.
I don’t think porn causes rape btw, but I don’t think it prevents it either. It’s the same with CSAM.
Criminally horrible people are going to be horrible.
It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.
Just to play devil’s advocate:
What about hentai where little girls get fondled by tentacles? (Please please please don’t let this be my most upvoted post)
I can downvote to prevent that if you like
Yeah, no. The commenter was talking about an actual child, not a cartoon one. That is a different discussion entirely, and a good one too, because artwork is part of freedom of expression. An artwork CAN be made without hurting or abusing anyone. We know full well that humans have the creative capability to come up with something without that something ever having existed beforehand. It implies that humans can come up with CSAM without ever having seen any.
And yet, it is still actually illegal in every state. CSAM of any kind, in any medium, is legally identical. Hand-drawn stick figures with ages written under them are enough for some judges/prosecutors.
Honestly, I am of the firm belief that the FBI should set up a portal that provides user-account-bound access to their seized materials. This may seem extreme and abhorrent, but it provides MANY benefits.
They would be able to eliminate the black market for it by providing free, legal access to already existing materials, so no more children would be harmed in the production of “new materials.”
They could mandate that accounts can only be made by those actively pursuing mental health treatment for their illness. It is a mental illness long before it is a crime.
They would be able to monitor who is accessing it and from where, and could coordinate efforts with mental health providers to give better treatment.
They could compile statistical data on the prevailing patterns of access to get a better analytical understanding of how those with this illness behave, so they can better police those who still use extra-legal avenues.
Always keep in mind that this is a mental illness. Oftentimes it is rooted in the person’s own traumatic past. Many were themselves victims of sexual abuse as children and are as much victims as the children they abuse. I am not, in ANY way, absolving them of the harm they have done, and they absolutely should repent for it. What I am attempting to articulate is that we need, as a society, to avoid vilifying them into boogeymen so we can justify hate and violence. They are people, they are mentally ill, they can be treated, and they can be healthy. It is no different from something like BPD, Malignant Narcissism, or Munchausen by Proxy. All can do real harm, and all should face the consequences of that harm, but those three are so normalized at this point that unless the abuse results in death, most people will handwave the actions and push for treatment. I feel we have gotten too lax on those (and others) and far too harsh on the rest. All mental illnesses deserve ardent and effective treatment.
Nay, I just replied to you in the context of that commenter. The other commenter was talking about real-life children, so your point about hentai is irrelevant to him. I do know the legal definition of CSAM concerns the end result and not the act, and hence why I said that yours is a different discussion entirely.
Edit: Sorry, I read it again and I think I didn’t get my point across very well. I think your point about artwork falls into the debate about the definition of CSAM. Why? Because the word abuse implies an abusive act is being done, but the current definition states that what matters is the end result only. This poses a problem, in my opinion, because it touches slightly on your freedom of expression. By the current definition, art has its limits.
What he probably means is that for a “photo,” an actual act of photography must be performed, while “artwork” can be fully digital. Now, legal definition aside, the two acts are indeed different even if the resulting “image” is a bit-for-bit equivalent. A computer could just output something akin to a photograph even though no actual act of photography has taken place. I say “legal definition aside” because I know the legal definition only looks at the resulting image. Just trying to convey the commenter’s words better.
Edit to clarify a few things.
This would also outlaw “teen” porn, since those performers are explicitly trying to look more childlike, as well as models who merely appear to be minors.
I get why people think it’s a good thing, but all censorship has to be narrowly tailored to the content, lest it be too vague or overly broad.
And nothing was lost…
But in seriousness, as you said, they are models who are in the industry, verified, etc. It’s not impossible to have a whitelist of actors, and if anything there should be more scrutiny on the unknown “actresses” portraying teenagers…
Except jobs, dude. You may not like their work, but it’s work. That law ignores verified age, and that’s a not-insignificant part of my point…