- cross-posted to:
- stable_diffusion@lemmy.dbzer0.com
AI-generated child sex imagery has every US attorney general calling for action::“A race against time to protect the children of our country from the dangers of AI.”
They’re not pictures of real people, proceeding against it on that basis undermines the point and makes them look like idiots. It should be banned on principle but ffs there’s got to be a way that doesn’t look reactionary and foolish.
Except when they are pictures of real people doing a body swap
That isn’t at all what an AI-generated image is. People have been doing that for more than 50 years.
🤦♀️ I obviously mean the replaced portions of the body are AI-generated, like Photoshop and various other tools have been doing.
That’s been possible since before Photoshop and certainly is possible after
That’s been possible since before Photoshop
possible before, easy as 1 command now.
Shouldn’t that already be covered under revenge porn laws? At least the distribution side of it.
But aren’t these models built from source material? I imagine if you want CP AI, you need actual CP to train it, no? That definitely would be a problem.
Removed by mod
No, you can use a genetic algorithm. You have your audience rate a legal, acceptable work. You present the same work to an AI and ask it to manipulate traits, and provide a panel of works to your audience. Any derivative that the audience rates better than the original is then given back to the AI for more mutations.
Feed all your mutation and rating data to an AI, and it can begin to learn what the audience wants to see.
Have a bunch of pedophiles doing the training, and you end up with “BeyondCP”.
deleted by creator
My question is where they got the training data for a sufficiently advanced CP image generator. Unless it’s just AI porn with kids’ faces? Which is still creepy, but I guess there are tons of pictures that people post of their own kids?
Manga, manhwa, CG sets, etc. of shota/loli. Sprinkle in some general child statistics for height, weight, etc. And I’m sure social media helped as well, with people making accounts for their babies, for God’s sake.
Plenty of “data”, I’m sure, to train up an AI.
Wouldn’t put it past some sick fucks to feed undesirable content into AI training
It’s unacceptable in any form.
Humans have been raping kids since our inception. Childhood is a relatively modern concept that young adults are now a part of. It’s an ugly and pervasive subject that needs further study to reduce child harm.
Nobody here said anything otherwise.
Whatever you wanted to say, I do not understand your point.
It’s not directly about protecting images of real people; it’s about preventing pedophilia.
Pedophilia IS AN ADDICTION!!! Fueling it with anything, even AI, will worsen the ADDICTION!
No one chooses to be a pedophile - as far as we know it’s just the unluckiest possible sexual attraction.
Stigmatizing it won’t help anyone - those people need help and everything that doesn’t hurt real children until they get themselves that psychological help is good in my book
Wtf are you talking about.
They may not be choosing it, but accepting that they exist and ignoring them is a way of saying you accept pedophilia.
They should know that what they are doing is wrong and that there is help (at least in France there are processes to try and help them; it’s treated about the same as an addiction, just with other content).
Of course they know that what they do is wrong
But thoughts aren’t criminal, and if they manage to control their attraction in a way that hurts no one (like this AI-generated stuff, or the child sex dolls, or other things that are very creepy to almost everyone else), then they should totally be normal members of society
I’m not saying that they should be proud of it and see it as something normal - it’s definitely a psychological issue - but letting them have things that help them satisfy their urges without hurting anyone should be a good step toward helping them face their sexuality instead of suppressing it
I think it should totally be possible to access things like this, but it should also be necessary to link to a helpline like the one you mentioned (Germany has something like that), similar to how a suicide helpline is necessary if someone mentions that subject
Well, it’s not as “simple” as just satisfying their urges.
With an addiction, the more someone consumes the drug, the more they want that drug.
For some, pedophilia may not directly impact the people around them; however, allowing access to such things may worsen their condition and direct them more toward such content, instead of replacing it with adult pornography.
Help should be available, as you said, with a number to call for advice or help. France I think also has such a number; I’m not sure if it’s still online, I just saw an article from some years ago talking about it.
Are you saying they’re “born this way?” Has that been demonstrated anywhere?
Isn’t AI generated better than content sourced from real life? It could actually drive a reduction in child sexual abuse instances due to offenders leveraging alternative sources.
I wonder what it’s trained on 🤮
Based on my understanding of how current diffusion models work, you actually don’t need to train it on CP. As long as it knows what humans look like without clothes and what children look like even when fully clothed in abayas and such, it can make the connection and generate CP when asked to.
Just to be clear, I’m totally against any form of CP and CSAM. Just explaining how the tech works.
I see what you’re saying, but AI has yet to offset regular porn production at all. There’s no reason I see to think accepting AI CP would do anything but normalize it and make it more accessible, possibly increasing demand for the real stuff.
Also, the AI models need to be trained on something…
Alternatively it could become an indoctrination pipeline
This feels like a double edged sword.
No it won’t.
Fueling the addiction will harden the person’s mental health problem.
One of the ways to help a person with such an addiction is to replace it with adult pornography.
Fueling it with more of that content won’t do any good.
Mind linking the study you’re referring to here?
The majority of pedophiles never offend, and most of the people in jail for child abuse are plain old rapists with no particular interest in kids per se; they’re just an easy target.
This is the same old “violent games make people violent” argument all over again.
Wtf.
I’m not talking about all of them being violent or causing harm to all the children in the world.
Everyone is different, and only a small % is violent.
And no, it’s not the same as violent games, nor was there anything in my argument pointing to that, wtf!
If you want a study or whatever, here: https://www.cairn.info/revue-l-information-psychiatrique-2011-2-page-133.htm It’s a bit old; things may have changed.
You mind quoting the part where it talks about the role of adult pornography and how fueling this “addiction” makes it worse?
It was in a news/investigation journal on TV at one point. I can’t link it because I don’t remember when it aired or which journal it was.
They talked about people being put in some sort of helping home. There they helped the people get away from such content by replacing it with adult content.
This was pretty recent, if I remember well, so the article above being from 2012 may not have all the studies and information.
deleted by creator
There is no part to quote about it worsening; there are plenty of studies on addiction to pornography, and addiction to child abuse material is the same kind of thing.
However, child abuse material is in a way different content from adult pornography, and each can have its own level of addiction.
Then why do you link that if it has almost nothing to do with what we’re discussing here?
I’m not sure I follow you; why wouldn’t it be linked?
It’s clearly stated there that psychological help exists.
Doesn’t that answer your earlier question asking for a link?
If you want more links, go look on Wikipedia; there are hundreds of them.
I won’t bother if you’re not going to read them.
In the United States, there are significantly greater dangers to kids than AI porn. Hunger, poverty and the climate crisis come to mind.
If we are refusing to address these for ideological reasons (e.g. because it’s “socialism”), then the established system itself is a threat to kids.
Priorities.
Pandora’s digital box has been opened. And I don’t think this one ends with everything going back in the box.
This is the best summary I could come up with:
On Wednesday, American attorneys general from all 50 states and four territories sent a letter to Congress urging lawmakers to establish an expert commission to study how generative AI can be used to exploit children through child sexual abuse material (CSAM).
In particular, open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability.
Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation’s top prosecutors.
Establishing a proper balance between the necessity of protecting children from exploitation and not unduly hamstringing a rapidly unfolding tech field (or impinging on individual rights) may be difficult in practice, which is likely why the attorneys general recommend the creation of a commission to study any potential regulation.
In the past, some well-intentioned battles against CSAM in technology have included controversial side effects, opening doors for potential overreach that could affect the privacy and rights of law-abiding people.
Similarly, the letter’s authors use a dramatic call to action to convey the depth of their concern: "We are engaged in a race against time to protect the children of our country from the dangers of AI."
The original article contains 960 words, the summary contains 225 words. Saved 77%. I’m a bot and I’m open source!
A lot of people here really defending child pornography
It’s an obvious overreach.
An AI generated image is essentially the solution to a math problem. Say the images are/become illegal. Is it then also illegal to possess the input to that equation? The input can be used to perfectly replicate the illegal image after all. What if I change a word in the prompt such that the subject of the generated image becomes clothed? Is that then suddenly legal?
I understand the concern, but it’s just incredibly messy to legislate what amounts to thought crimes.
Maybe we could do something to discourage distribution, but the law would have to be very carefully worded to prevent abuse.
And everyone pointing that out gets downvoted into oblivion, rofl. I hate the internet and the sick degenerates on it.