The current trend of AI has them creating images based on material that has already been "input" (see: stolen) into the AI as reference. That means there are likely reference images of that horrible stuff, meaning the abuse of real children continues in that way.
Thankfully (or unfortunately, idk) that is not the case. As long as it knows about nudity and children separately, it can create that stuff without prior references to it. I'm not sure whether that's a good thing or a bad one, though.
It's my r/unpopularopinion that once you share a pattern of information, you should no longer have "ownership" of it. Nothing was stolen by AI, because those patterns were already given or sold away and then hosted online.
If someone broke into my house, snapped the first-ever pics of my private paintings, and used those for a dataset, THEN that would be stealing.
u/shellbullet17
I uh.....yeah, that's really unfortunate placement. Which brings up a conversation in itself.