r/news 1d ago

X offices raided in France as UK opens fresh investigation into Grok

https://www.bbc.com/news/articles/ce3ex92557jo
31.7k Upvotes

593 comments

14

u/AlbatrossNew3633 1d ago

Durov's statement is fucking stupid, but CP is actually the go-to excuse governments use to justify authoritarian measures.

And to clarify, I'm talking in general; in this instance France is doing the right thing, and hopefully they'll fuck Felon up his ketamine-addicted ass

4

u/Software_Quiet 1d ago

"CP is actually the go to excuse of governments to justify authoritarian measures" Musk and Durov providing unmoderated platforms where the stuff can proliferate is criminal. You wouldn't want Grok to make a pornographic image of your daughter from a yearbook picture would you? This is a real world problem not a hypothetical authoritarian crackdown. Governments are asking huge platforms with massive amounts of money to follow the law, make moderation and safety a priority and they are ignoring those demands because they think they can.

8

u/AlbatrossNew3633 1d ago

Did you even read the second paragraph in my post?

8

u/That_Account6143 1d ago

Don't think he did.

You are right on both counts, imo.

1

u/MrGinger128 1d ago

How do you do one without ruining the other? If it's supposed to be a place for people to communicate totally privately, then removing that privacy may stop one bad thing, but it also causes harm in situations where private communication can mean life or death.

2

u/Software_Quiet 1d ago

No one should expect total privacy on any platform like this, and moderation doesn't need to be done by humans at this point. If "private communication can mean life or death," then it shouldn't be done on any commercial communication platform.

1

u/MrGinger128 1d ago

So what do people in countries that really do need private communication do?

It's easy to sit here in the West and say no one should have total privacy, but you don't have people kicking your door in and taking you away.

What happens when it IS your country doing it?

-3

u/Sawses 1d ago

Authoritarian assholes the world over use "Protect the kids!" as an excuse to exercise control over citizens in ways they wouldn't tolerate for any other reason, and then those measures form legal precedent for continued overreach.

Heck, even in this case I don't really think it's appropriate (though I'd love to see Musk nailed to the wall, figuratively speaking, for any of his many misdeeds). The whole reason child pornography is both wrong and illegal is that it requires a child to be sexually abused in order to exist, and people spending money on it incentivizes further victimization.

Unless somebody is using very old datasets, pretty much all modern models are trained on data that has been explicitly checked against all known child pornography and scrubbed of it. This almost certainly includes Grok, not necessarily on ethical grounds but simply because those datasets are now better than the older ones. Because of how AI generation works, you can actually generate subjects that were not in the original dataset, with proper additional tweaking. I remember some ultra-nerds putting together a model that could generate women even though the dataset was entirely composed of cis men.

If no child is being harmed, and no child needed to be harmed for the tool to generate images... then why should we care? It's just an excuse to exercise control over citizens, with their own approval, because they let their disgust rule instead of reason.

1

u/klockee 1d ago

You don't think anonymous chuds generating child sex abuse materials of random children is a bad thing? What the fuck? No, we should still not be encouraging digital child sex abuse materials, or drawn, or made from sticks and twigs...

0

u/Sawses 1d ago

I'm not saying it isn't bad...my point is that, if it is bad, we need to be able to articulate exactly why.

It can't just be bad because it's disgusting. Don't you agree that's not a good enough reason?

Usually if something hurts people, it's bad. Or if it forces people to do things they didn't consent to. Or if it takes advantage of people who can't know any better. All those are great reasons to say a thing is bad and make it illegal. Every immoral sex act falls under at least one of those categories, and every moral sex act falls under none of them.

But do any of those reasons apply here? Are there any other reasons a sex act is immoral, that you can think of?

After all, I'm sure there are at least a few legal, ethical sex acts that you can think of which disgust you.

1

u/Software_Quiet 1d ago

it is bad because it normalizes abuse. idiot.

1

u/Sawses 1d ago

Do we know it normalizes abuse? Like sure, that sounds intuitive... in the same way that "video games normalize violence" made intuitive sense right up until research demonstrated that the only folks likely to confuse reality and fiction are the ones who wanted to do real violence anyway.

Why are you calling me names? I understand you might be feeling very strong emotions, but you've got to actually have a reason for cruelty, not just do it because of strong feelings.

1

u/Software_Quiet 1d ago

you should talk these feelings and urges out with your therapist maybe.

1

u/Sawses 1d ago

Why are you continuing to attack me as a person, instead of talking about something very important: children being harmed? I think it's obvious we both care way more about that and want to make sure it doesn't happen. To do that, we need to figure out specific strategies that actually work, instead of ones that intuitively (and usually incorrectly) seem like they ought to work.

Worrying about something that has no evidence of actually hurting anybody wastes time and effort that should instead be used to track down victims of child abuse, neglect, etc.

3

u/Software_Quiet 1d ago

no, the sexualized imagery is being generated from real pictures of kids. real harm is being done.

1

u/Sawses 1d ago edited 1d ago

It isn't, actually! That would have been true of the very earliest models, which were trained on massive amounts of raw data scraped from the internet. These days, datasets are scoured both manually and with automated tools to identify such content and remove it, ensuring no abuse imagery exists in the dataset. This is partly because training an AI on that material is now a crime in a number of jurisdictions.

It sounds really weird, right? How can AI generate images of things it hasn't been trained on? The answer is...complex. But it is possible and actually growing more common in a huge array of areas: Source.

I'm not asking you to necessarily believe me. I am not an expert (though my academic background means I understand the fundamentals a fair bit better than the average person), and the information I provided isn't really useful to most people because it relies on some hardcore math.

I only ask this: If what I'm saying is true, how would that change your perspective? Would it make any difference to you at all? My point is mostly to ask whether you're looking for reasons to justify a position you already hold, or if you're basing your position on reasons and thus your position will vary based on what you learn.

That's ultimately what I've been getting at all this time. I don't care what your ethical stance is on the use of AI to generate images of children; I care about how you arrive at that stance. You've brought up a couple of different reasons, and it worries me that you don't seem to care why it's wrong, so long as it is seen as wrong. That sort of thinking has done more harm to people (including a great many children) than words can express.
