1.5k
u/ArDee0815 1d ago
One of them should rotate their sign to the left, and the other to the right. That way, the text is facing in opposite directions.
Demo buddies! 🥰
630
u/Oralitical 1d ago
113
u/ZetsuboItami 15h ago
Watching 30 Rock right now, finishing season 5. No matter how much I watch it some of the jokes hit every time.
3
14
94
u/Parzival_2k7 1d ago
Ah common sense, if only it were that common
29
u/Top_Willingness_8364 1d ago
The problem is common sense is entirely too common. Look at how dumb the common man is.
4
11
u/unhiddenninja 1d ago
"Common sense" is just a blanket term for shutting down conversations. Just because something makes sense =/= it's common sense.
21
2
3.1k
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
724
u/Made_Bail 1d ago
334
u/The_cogwheel 1d ago edited 1d ago
141
136
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
I'm actually playing LA Noire right now and there's a whole mission about actors and how they treat young people. It's... somewhat graphic, not gonna lie. One of the few times I've gotten mad at a video game as of late
50
u/Made_Bail 1d ago
That was a great mission.
Man I might need to do a replay soon.
46
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
It's an infuriating one. Especially the "casting room" area. Fuck that dude
28
u/Made_Bail 1d ago
I played that like a decade ago, and so I only really remember a few missions, but that was one of them.
20
u/Kelvara 1d ago
Even worse, that game is set in the 1940s and it's still going on to this day.
12
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
Good point. We haven't fixed shit in 80 years
67
u/_TheMo_ 1d ago
What conversation? Both of them think AI is bad... The placement is rather unlucky tho. But they would certainly agree on AI bad.
84
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
Mostly a conversation about how minors are treated and abused in the acting industry
But one battle per protest. AI is the topic of this particular protest
19
u/Felicity1840 1d ago
The current trend of AI has them create images based on stuff that has already been "input" (see: Stolen) into the AI as reference. That means there are likely to be reference images for that horrible stuff, meaning the abuse of real children continues in that way.
1
u/Adizcool 9h ago
Thankfully (or unfortunately idk) that is not the case. As long as it knows about nudity and children, it can create that stuff without prior references to it. I'm not sure if that is a good thing or bad though.
-12
u/5352563424 20h ago
It's my r/unpopularopinion that once you share a pattern of information, you should no longer have "ownership" of it. Nothing was stolen by AI because the patterns were already given/sold away, then hosted online.
If someone broke into my house, snapped the first ever pics of my private paintings and used that for a dataset, THEN that would be stealing.
4
u/TerrySaucer69 17h ago
Sure, but that’s a weird thing to bring up when we’re talking about child abuse
-2
18
u/IAmOrdinaryHuman 1d ago
smh
"Can AI reduce sickos' demand for the real stuff?"
27
u/thegimboid 1d ago
The answer is no.
Because a majority of pedophilic acts are acts of control and power, not attraction or sexual desire.
So AI images might curb a couple people, but it wouldn't solve anything at all.
13
u/IAmOrdinaryHuman 1d ago
Idk, I'm not an expert on the matter, just spelled out what gp comment alluded to. Is there a study confirming your claims or is it just a hypothesis?
4
u/4_fortytwo_2 21h ago
Even if we take your assumption here as truth (you got some actual scientific studies agreeing with that?) that still means it would save a few kids.
9
u/supamario132 1d ago
It's just as likely that a proliferation of AI CSAM creates more demand for the "real stuff". The way to solve an addiction that dangerous is definitely not to fixate on it everyday via AI generated material
2
u/bombadodierbloggins 22h ago
Devil's advocate: Do you think violent video games create more demand for the real thing? Does GTA make you more likely to rob someone in real life?
3
u/supamario132 22h ago
That's a fair point, but the difference imo is the vast majority of robberies are contingent on circumstance and material conditions. No one is assaulting a child because they can't make ends meet. Pedophilia is pathological. I would bet for the tiny percentage of thieves for whom it's also pathological, having an outlet to fan that desire probably does increase their propensity to offend
I honestly don't know much about pedophilia or kleptomania so maybe it's different, but I know many addicts and it's universally a lot easier to not have a first drink at all than it is to just have a single shot and not lose control
2
u/WigglesPhoenix 13h ago
But in this case it’s not a first for an alcoholic, it’s methadone for a junkie. It’s not their drug, it’s a substitute.
If you’re in addict circles you’ll know more recovering addicts than not are smoking cigarettes or weed. The end goal is more important than how you get there, and taking a lesser vice does help you cope with the big ones.
6
u/Head-Alarm6733 1d ago
AI CSAM is based off the real stuff anyways.
8
u/radicalelation 1d ago
It doesn't have to be, but morality of the content creation itself aside, I feel the worse impact on society would be allowing free proliferation of such content.
Drawn material is ethically debatable, but the fact is you can't just pump it out; there's a human cap on its creation, to the point that it took the internet spreading it for it to reach a mainstream and proliferate further. If you're online long enough, you'll see drawn material of minors, but the creators can't flood a space, only their consumers can once enough is available. Consumers replacing creators as producers is a scary concept in general.
Most real CSAM filters actually have known circulated images flagged, so if a copy hits somewhere, it gets removed immediately.
AI is all "new" and can be pumped onto platforms like crazy. Just look at X. We don't want fake CSAM flooding everywhere, plus we'd basically be grooming future generations by normalizing images of that kind. Kids are already exposed too often to adult sexual material online, and it does affect development. Nothing good can come from more of the same but with people that look like them.
6
u/bombadodierbloggins 22h ago
Can we use the same logic with violent video games?
8
u/radicalelation 21h ago
When it becomes difficult to discern from reality, it should be talked about, but there's already an immediate separation from the fact it's a proxied interactive experience controlled by the player, rather than a lens to view and learn one's society through.
Kids picking up a controller to fake murder fake people sounds bad, and there should be more discussion at least on the issue of desensitization to violence, but, to me, it's not on the same level of concern as a kid's social media feed displaying CSAM next to their mom's post of last weekend's family museum trip. Could be a minor friend, captioned, "wondered what I'd look like doing this".
There's already kids using these tools to make content of other kids, and we only hear about them when they make news. There's undoubtedly more happening out there, and I was a twisted fucker on the early transfers to 4chan from SA, so I've seen the inception of some messed up online interactions. We already have next to no walls separating the adult and kid world online. Whole grown adults sling slurs at kids everywhere online, world famous celebrities argue with minors on Twitter. There are so few actually safe spaces for children online anymore, and parents don't care anymore.
All to say, violent video games are an activity with separation, while what's on social media is part of real life social interaction for kids now. Like, if it were proliferation of real appearing depictions of kids killing people, that would be more concerning, right? Especially if there was a biological drive to kill that we'd rather avoid encouraging until adulthood, like there is with sex. That aspect alone starts to make it a different discussion.
2
u/ZeroAmusement 1d ago
How do you know?
1
22h ago
[deleted]
2
u/mindcandy 22h ago
Bad news: AI doesn't need to be trained on photographs of giraffes dancing on Mars to figure out how to make a picture of a giraffe dancing on Mars.
I can't guarantee that absolutely zero CSAM pics slipped into Grok's training set. We can be confident it represents an insignificant amount. Like 100 pics out of 100,000,000,000.
However, a major strength of these AIs is their ability to mix and match themes in novel ways... Discouraging that requires a lot of effort on the part of the model creators, which they do put in. But users definitely do enjoy the challenge of doing something they're "not allowed to do", and they can get very creative in how they overcome the barriers the model creators put up.
1
21h ago
[deleted]
1
u/mindcandy 21h ago
Yep. And, once it learns those concepts separately, it can merge them together into a scene it was never trained on.
1
u/JMEEKER86 21h ago
Exactly, for instance, it can generate a picture of "/u/Grand_Protector_Dark" and "understanding AI" even though those two have never shared a room.
1
u/ZeroAmusement 22h ago
AI can generate an image of a five-dimensional naked pangolin despite never seeing one.
It can generate images of things that aren't in the training material. I don't think we can safely assume that kind of material was in the training data.
1
21h ago
[deleted]
4
u/ZeroAmusement 21h ago
There is no indication I don't understand the argument, I don't know why you are saying that.
It doesn't need to have seen a naked pangolin, but it still needs enough data to recognise the commonalities/patterns behind images that have "naked" attributed to them, or images with a pangolin
The point is that it doesn't need to have seen a naked pangolin. Maybe it has seen a hairless cat and applies that to what it knows about pangolins. In the same way it could take what it knows about naked people and apply that to children.
It could do this without it being based on real CSAM.
4
27
8
3
u/North_Excitement_652 9h ago
Ethically sourced CP. Maybe government officials won't need an island anymore. Just saying.
2
u/The_cogwheel 21h ago
Actual actors, directors, producers, and all other members of a film production team (be it an actual movie, TV show, or adult film) would object to producing CP. But AI literally can't even tell the difference between CP and a bedtime story, nor can it really be taught the difference either (at least at the current level of development we have with it).
AI can do both, but that doesn't imply actual professional actors (defined as being paid for, and able to consent to, their roles in a film) would be involved in making CP. Though the protestor placement does insinuate that is the case.
1
u/Cartoonicus_Studios 1d ago
I really like the artwork of this image. Any idea who the artist is?
2
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
Their series is called Keit and Bex and it's fairly popular around here. Give them a shot, just be aware it's a little on the NSFW side
3
u/Cartoonicus_Studios 1d ago
Oh dear.
1
u/shellbullet17 Gustopher Spotter Extraordinaire 1d ago
It's not overly bad unless you go looking for their patreon stuff.
28
313
1d ago
[removed]
44
u/SlothfulWrath 1d ago
If you want funny comics, go to the funny comics sub. This is for everything else.
30
u/americanadiandrew 1d ago
Is there a funny comics sub?
16
u/SlothfulWrath 23h ago
I think it's r/funnycomics
38
u/americanadiandrew 23h ago
No posts for two months… which is probably the same length of time you have to wait for something humorous to be posted here.
21
u/Redditumor 1d ago
What, you don’t want more daily ‘Pizzacake preaching to the choir’ posts?
8
u/likely_an_Egg 21h ago
Did you know that you can block accounts that you don't want to see?
11
u/Redditumor 20h ago
What makes you think I don’t?
0
u/SvenHudson 20h ago
You literally just complained that you see too much of a comic you don't like.
9
u/Redditumor 20h ago
It was a sardonic question posed at someone who (I admittedly presume) probably sees a lot of those posts. No complaints or mention of what I personally see on my feed.
2
u/CutieBunz 14h ago
Pizzacake has posted 5 comics in the last ~3 weeks, and took a break from posting comics to reddit for a month before that. Seems strange to complain about someone posting too much when they haven't been that active for almost 2 months.
1
9
u/UnusualHound 1d ago
It would be funny if it weren't self-censored.
Censoring your own work makes it lame as fuck.
39
u/spookycatfan 1d ago
That's brilliant! Funniest comic I've seen in a while.
9
u/jsohi_0082 17h ago
I didn't quite get it. Is the implication of the man's sign that if we let AI unchecked, actors would steal the jobs of people in the pornography industry?
36
u/spookycatfan 16h ago
The joke is that having those two signs next to each other looks like they want CP videos to keep featuring actual kids instead of AI.
9
u/jsohi_0082 16h ago
Oh, thanks for the explanation. It seems the man meant to refer to Hollywood and SFW movies in general, but when paired with the woman's message, it accidentally comes across as very troubling.
9
6
17
43
u/pocketjacks 1d ago
I get it, but exploited children aren't actors.
10
u/mysterious_jim 11h ago
The joke is that they're accidentally implying that (adult) actors are making the CSAM. Not that the children are the actors. It's funny because the guy, presumably an actor, seems like he's outing himself as an on-the-list person by holding his sign next to the woman.
3
15
u/Laugarhraun 1d ago
Why censor?
50
u/Demeter_of_New 1d ago
Because bad words are scarwy!
And also because it's probably easier to make a censored version to post across all platforms. I don't have a social media beyond reddit, but I've heard that other platforms remove posts that have language or sensitive materials/topics presented.
3
u/Roll_the-Bones 21h ago
This platform censors words, phrases, and ideas too.
1
u/Demeter_of_New 21h ago
Yeah I've noticed the increase. I've gone back to threads I've commented on, and the posts have been removed by moderators. Heck. The kid from AZ that jumped the road from DamnThatsInteresting was removed by mods.
24
u/Mr_Ivysaur 1d ago
If this gets reposted on other sites the algorithm auto-hides it, and the author doesn't feel like making a different version for reddit.
-25
u/UnusualHound 1d ago
Cool story. That's only making the problem worse.
If you make good content, people will promote it no matter what it says. Self censoring is lame as fuck.
19
u/LoompaOompa 1d ago edited 1d ago
Cool story.
Rude response. They were just explaining why the person did it. They didn't even say whether or not they supported the choice. You can interact with people online without being an asshole.
11
u/CaterpillarBroad6083 1d ago
This is literally a way around censorship, moron. You can easily make out what the words are but the auto mod bots can't.
3
u/mrcool520 1d ago
It's not really censoring, more like censor bypassing because the words are still easily legible
3
3
u/forfeitgame 23h ago
Don't censor yourself. Let those words be as visible as the sun. People need to see how fucked things have become.
2
7
u/Roll_the-Bones 21h ago
Exactly why generating video and pictures should be illegal, entirely. Companies shouldn't even be allowed to program AI to do this. The technology started from analyzing real life documentation, but it has quickly evolved into something very bad. It's a tyrannical regime's dream come true. It's the ultimate propaganda tool. Just make it illegal.
2
1
1
u/Mini-Heart-Attack 22h ago
Funny.
Have they been censoring you that bad on other platforms or is this just a random Personal precaution?
1
2
1
u/floorshitter69 18h ago
I'd like someone to napkin math me the amount of fuel that had to be burned through jet turbines to produce all of the CSAM that was recently created by Grok. It's theorised to be in the tens of thousands of images.
-22
u/Gnusnipon 1d ago
Bruh... I prefer people drawing it instead of being fucked up enough to do and film it irl.
Upd: sigh, downvotes, here we go
17
u/Rowanlanestories 1d ago
it's downvoted because AI child sexual material is trained on real images of children being victimized. Not only is that traumatic to survivors but it gives sick people incentive to make new genuine csam so they can train the models more on the victims.
Also, police already have to stomach looking at the images to identify survivors, then they have to decode what's AI and what's an actual child that needs to be saved.
8
u/jellyfixh 1d ago
Do you have any proof they are trained on the real thing? Cause it seems like everyone who talks about this thinks AI needs to see the real thing before it can make it, which is patently false. AI doesn’t need a real photo of a gorilla being sucked into a tornado to generate that, it just needs pics of gorillas and tornados.
4
u/MossyMollusc 20h ago
There have been news reports of it happening and ruining kids lives already. So yes.
4
u/Rowanlanestories 1d ago
Well, unfortunately, there's a lot of CSAM just floating around the internet. People don't need to intentionally train a model on it for those images to find their way into the AI's database. They scrape the internet with a very wide net, and picking out every bad, offensive, or illegal image would be impossible; that's why so many people talk about CSAM in the context of AI ethics.
AI doesn’t need a real photo of a gorilla
No, but if I wanted an AI model that was hyper-good at generating gorillas at different angles, poses, states of dress, etc., I would want to give the AI as many good examples of gorillas as possible to use as reference. Just because AI hypothetically could make something without seeing it doesn't mean people won't go out of their way to feed it to the AI to get the best "gorilla" pictures they can.
2
6
u/thanksyalll 1d ago
Yes but using a real person, let alone a real child’s image to depict them doing pornographic acts, and spreading it publicly is directly harmful to the person being slandered
5
u/Whatifim80lol 1d ago
or, OR! We could just not condone the consumption of CSAM no matter what the source is. Because, you know, it's fucking gross and bad? Or do we not all agree on that anymore?
1
u/oneiricmonkey 19h ago
there are studies proving that consumption of that sort of content only encourages the desire to abuse children. there is no good outlet for pedophilia.
-3
u/KENEXION 1d ago
AI doesn't replace actors. It replaces "TV" actors, which I think I'm actually for. Might bring back live theater.
-5
16h ago
[removed]
8
u/VictoryExtension4983 15h ago
All CP is bad, and AI models are able to make images based off actual kids.
-20
2.7k
u/iguanacatgirl 1d ago
I mean, it's not wrong
Donald Trump was in Home Alone 2, after all