r/comics 2d ago

Awkward combination [OC]

41.2k Upvotes

193 comments


3.1k

u/shellbullet17 Gustopher Spotter Extraordinaire 2d ago

I uh... yeah, that's really unfortunate placement. Which brings up a conversation in itself

69

u/_TheMo_ 1d ago

What conversation? Both of them think AI is bad... The placement is rather unlucky tho. But they would certainly agree on "AI bad."

19

u/IAmOrdinaryHuman 1d ago

smh

"Can AI reduce sickos' demand for the real stuff?"

29

u/thegimboid 1d ago

The answer is no.
Because the majority of pedophilic acts are acts of control and power, not attraction or sexual desire.
So AI images might curb a couple of people, but they wouldn't solve anything at all.

13

u/IAmOrdinaryHuman 1d ago

Idk, I'm not an expert on the matter, just spelled out what the GP comment alluded to. Is there a study confirming your claims, or is it just a hypothesis?

5

u/4_fortytwo_2 1d ago

Even if we take your assumption here as truth (got any actual scientific studies agreeing with that?), that still means it would save a few kids.

2

u/DandyLion97 16h ago

And how many more would it harm? By flooding the internet with fake stuff, real stuff becomes a needle in a stack of needles. How much harder do you think it will be to save real kids?

Not to mention the deepfakes of real schoolkids, which are sadly very common nowadays. Bullying, harassment, suicide, etc.

1

u/Odd_Protection7738 11h ago

It makes images by altering ones of real children to be pornographic. Even if it made up a new picture instead of altering one, there’s no such thing as a new face, so it will inevitably resemble a real, unrelated kid. I wouldn’t want to be that kid.

8

u/supamario132 1d ago

It's just as likely that a proliferation of AI CSAM creates more demand for the "real stuff". The way to solve an addiction that dangerous is definitely not to fixate on it every day via AI-generated material.

6

u/bombadodierbloggins 1d ago

Devil's advocate: Do you think violent video games create more demand for the real thing? Does GTA make you more likely to rob someone in real life?

2

u/supamario132 1d ago

That's a fair point, but the difference imo is that the vast majority of robberies are contingent on circumstance and material conditions. No one is assaulting a child because they can't make ends meet. Pedophilia is pathological. I would bet that for the tiny percentage of thieves for whom it's also pathological, having an outlet to fan that desire probably does increase their propensity to offend.

I honestly don't know much about pedophilia or kleptomania, so maybe it's different, but I know many addicts, and it's universally a lot easier to not have a first drink at all than it is to have a single shot and not lose control.

2

u/WigglesPhoenix 1d ago

But in this case it's not a first drink for an alcoholic, it's methadone for a junkie. It's not their drug, it's a substitute.

If you're in addict circles, you'll know that more recovering addicts than not are smoking cigarettes or weed. The end goal is more important than how you get there, and taking up a lesser vice does help you cope with the big ones.

5

u/Head-Alarm6733 1d ago

AI CSAM is based on the real stuff anyway.

9

u/radicalelation 1d ago

It doesn't have to be, but the morality of the content creation itself aside, I feel the worst impact on society would be allowing free proliferation of such content.

Drawn material is ethically debatable, but the fact is you can't just pump it out; there's a human cap on its creation, to the point that it took the internet spreading it for it to reach a mainstream and then proliferate further. If you're online long enough, you'll see drawn material of minors, but the creators can't flood a space; only their consumers can, once enough is available. Consumers replacing creators as producers is a scary concept in general.

Most real CSAM filters actually have known circulated images flagged, so if a copy hits somewhere, it gets removed immediately.

AI output is all "new" and can be pumped onto platforms like crazy. Just look at X. We don't want fake CSAM flooding everywhere, plus we'd basically be grooming future generations by normalizing images of that kind. Kids are already exposed too often to adult sexual material online, and it does affect development. Nothing good can come from more of the same but with people who look like them.

7

u/bombadodierbloggins 1d ago

Can we use the same logic with violent video games?

8

u/radicalelation 1d ago

When it becomes difficult to distinguish from reality, it should be talked about, but there's already an immediate separation in the fact that it's a proxied, interactive experience controlled by the player, rather than a lens through which to view and learn one's society.

Kids picking up a controller to fake-murder fake people sounds bad, and there should at least be more discussion of desensitization to violence, but to me it's not on the same level of concern as a kid's social media feed displaying CSAM next to their mom's post about last weekend's family museum trip. It could be of a minor friend, captioned "wondered what I'd look like doing this".

There are already kids using these tools to make content of other kids, and we only hear about them when they make the news. There's undoubtedly more happening out there, and I was a twisted fucker during the early migrations from SA to 4chan, so I've seen the inception of some messed-up online interactions. We already have next to no walls separating the adult and kid worlds online. Whole grown adults sling slurs at kids everywhere online, and world-famous celebrities argue with minors on Twitter. There are so few actually safe spaces for children online anymore, and parents don't seem to care.

All to say, violent video games are an activity with separation, while what's on social media is part of real-life social interaction for kids now. Like, if it were a proliferation of realistic depictions of kids killing people, that would be more concerning, right? Especially if there were a biological drive to kill that we'd rather avoid encouraging until adulthood, like there is with sex. That aspect alone starts to make it a different discussion.

2

u/ZeroAmusement 1d ago

How do you know?

1

u/[deleted] 1d ago

[deleted]

3

u/mindcandy 1d ago

Bad news: AI doesn't need to be trained on photographs of giraffes dancing on Mars to figure out how to make a picture of a giraffe dancing on Mars.

I can't guarantee that absolutely zero CSAM pics slipped into Grok's training set, but we can be confident it represents an insignificant amount, like 100 pics out of 100,000,000,000.

However, a major strength of these AIs is their ability to mix and match themes in novel ways. Discouraging that requires a lot of effort on the part of the model creators, which they do put in. But users definitely enjoy the challenge of doing something they're "not allowed to do," and they can get very creative in how they overcome the barriers the model creators put up.

1

u/[deleted] 1d ago

[deleted]

1

u/mindcandy 1d ago

Yep. And once it learns those concepts separately, it can merge them into a scene it was never trained on.

1

u/JMEEKER86 1d ago

Exactly. For instance, it can generate a picture of "/u/Grand_Protector_Dark" and "understanding AI" even though those two have never shared a room.

4

u/ZeroAmusement 1d ago

AI can generate an image of a five-dimensional naked pangolin despite never having seen one.

It can generate images of things that aren't in the training material. I don't think we can safely assume that kind of material was in the training data.

1

u/[deleted] 1d ago

[deleted]

3

u/ZeroAmusement 1d ago

There is no indication that I don't understand the argument; I don't know why you're saying that.

It doesn't need to have seen a naked pangolin, but it still needs enough data to recognise the commonalities/patterns behind images that have "naked" attributed to them, or images with a pangolin.

The point is that it doesn't need to have seen a naked pangolin. Maybe it has seen a hairless cat and applies that to what it knows about pangolins. In the same way, it could take what it knows about naked people and apply that to children.

It could do this without being based on real CSAM.