There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go with new AI being able to put anyone’s face into a porn movie too.
It seems to me that a whole new set of worldwide guidelines and laws need to be put into effect asap.
How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?
It’s not OK to make CSAM.
The origin of CSAM does not make it acceptable.
The SA in CSAM is sexual abuse. Who is being sexually abused in order to make a drawing?
I am not an expert in any field relating to any of this by any means, but we can all agree that CSAM is unequivocally reprehensible. Thus, many people will have severe issues with anything that normalizes it even remotely. That would be my knee-jerk response anyway.
Yes, but it’s wrong for very different reasons and severities. Murder vs murder porn, if you will. Both are bad and gross, but different, and that matters.
But that’s irrelevant to my question, which no one actually answered.
I am curious about people’s take on the difference between human creativity from memory vs AI “creativity” from training. The porn aspect is only relevant in that it’s an edge case that makes the debate meaningful.
Under current law you can’t copyright AI art, but you can copyright art that’s based on a person’s combined experiences. That seems arbitrary to me, and I’m trying to understand it better.
I did answer your question. The answer is no.
There sure are a lot of groomers around here throwing out downvotes.
There are also pedos pretending to be against AI generated child porn to cover their tracks.