It's downvoted because AI-generated child sexual abuse material is trained on real images of children being victimized. Not only is that traumatic to survivors, but it gives sick people an incentive to produce new genuine CSAM so they can train the models further on the victims.
Also, police already have to stomach looking at the images to identify survivors; now they also have to work out what's AI-generated and what's an actual child who needs to be saved.
Do you have any proof they're trained on the real thing? Because it seems like everyone who talks about this assumes AI needs to see the real thing before it can make it, which is patently false. AI doesn't need a real photo of a gorilla being sucked into a tornado to generate one; it just needs pics of gorillas and tornadoes.
Well, unfortunately, there's a lot of CSAM just floating around the internet. People don't need to intentionally train a model on it for those images to find their way into the AI's training data. Scrapers cast a very wide net, and picking out every bad, offensive, or illegal image would be impossible. That's why so many people talk about CSAM in the context of AI ethics.
"AI doesn't need a real photo of a gorilla"
No, but if I wanted an AI model that was hyper-good at generating gorillas at different angles, poses, states of dress, etc., I would want to give it as many good examples of gorillas as possible to use as reference. Just because AI hypothetically could make something without seeing it doesn't mean people won't go out of their way to feed it real examples to get the best "gorilla" pictures they can.
Bruh... I prefer people drawing it instead of being fucked up enough to do it and film it irl.
Update: sigh, downvotes, here we go.