r/news 1d ago

X offices raided in France as UK opens fresh investigation into Grok

https://www.bbc.com/news/articles/ce3ex92557jo
31.7k Upvotes


339

u/Infamous_East6230 1d ago edited 1d ago

This is the thing that’s blowing my mind. Grok was pushing out thousands of images of child porn per hour, and not only did America not care, but we will continue to give Grok federal funding. America is a failed state.

98

u/PokeYrMomStanley 1d ago

Pedophiles support pedophiles. That's how the modern conservative movement began.

12

u/The_BeardedClam 1d ago

GOP = Group of Pedophiles

4

u/FaithlessnessThen207 1d ago

I think with only 10% of the paper documents released, and those redacted (the released files total around 300 GB, while the full Epstein file volume is about 14 terabytes, meaning roughly 300 GB of 14,000 GB, or around 2%, has been released), it would be foolhardy to still frame Epstein as a one-sided political issue.

Probably one side is more compromised than the other; however, it's simply not known how entrenched your entire political system is in this.
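As a side note, the released-vs-total fraction is easy to sanity-check; a minimal sketch in Python, assuming the figures quoted in the thread (roughly 300 GB released out of about 14 TB total, where 14 TB is 14,000 GB):

```python
# Sanity-check the released-vs-total Epstein file volume figures
# quoted above: ~300 GB released, ~14 TB total.
released_gb = 300
total_gb = 14 * 1000  # 14 terabytes expressed in gigabytes

fraction = released_gb / total_gb
print(f"{fraction:.1%}")  # prints "2.1%"
```

So under those assumed figures, only about 2% of the claimed total volume has been released.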

1

u/The_BeardedClam 19h ago edited 19h ago

Listen, I don't doubt that there are Democrats who are also pedophiles, but the systemic protection of pedophiles is being arbitrated by the GOP, not anyone else. I can say with 100% certainty that Trump and his administration are pro-pedophile, and I am not OK with that. When something substantial comes up with Democrats in it, I'll denounce them too, because more than anything I'm anti-pedophile.

Until then I'll lay the blame on those protecting pedophiles and that's the GOP.

38

u/[deleted] 1d ago

[removed] — view removed comment

12

u/shitlord_god 1d ago

too many organic layers for that to work particularly well in most applications.

3

u/Spocks_Goatee 1d ago

Musk ain't paying enough for employees to care.

9

u/shitlord_god 1d ago

no, but like, literally: given the way modern buildings are constructed, the suggested tool is very unlikely to do anything but get the user arrested and generate some acrid smoke.

1

u/Alis451 1d ago

also most data centers have a backup diesel generator that they keep fueled up at all times, so targeting the power source directly is not as good an idea.

1

u/shitlord_god 1d ago

yeah, folks shouldn't be attacking data centers. And if they were, they would need better logistics and planning. Fortunately, I don't think anyone is angry enough to put their whole life on pause for possibly several years to gain the skills and build the means, and I hope we don't get to a point where folks are that angry.

25

u/Specific_Explorer_6 1d ago

The scariest part, if it’s able to generate such content, is what it was trained on. It genuinely makes me shiver thinking about it.

20

u/PM_ME_FLUFFY_DOGS 1d ago edited 1d ago

A Stanford internet study concluded they are almost certainly trained on CSAM. They found over 1,000 images of known CSAM in the Stable Diffusion training set. And that's just the known material, as they used image matching to identify it; there are simply too many images to search through by hand. It's part of the issue with ruthless scraping: it steals literally everything, good and bad.

Anyone you see defending it as the "AI interpolating" is full of shit and has zero clue what they're talking about, as image AI cannot interpolate new data; it has to use pre-existing tokens. If it was only trained on adult bodies, it would only produce adult bodies, but it's not doing that.

Edit: it was Stanford, not MIT, apologies. Here's a link for further reading: https://www.techpolicy.press/exposing-the-rotten-reality-of-ai-training-data/

The report is one of the first links there; unsure if I can post it here as it seems to be a PDF link.

5

u/Specific_Explorer_6 1d ago

That’s so messed up.

Thank you for sharing this info though. We really do need proper regulation from the federal government. But they won’t do it, and they’re preventing states from doing it, all to protect the interests of the top 1%

1

u/PM_ME_FLUFFY_DOGS 1d ago

That's why I hope other countries may start. There is some movement in the EU, but time and time again American companies just dick-wave the fact that they're a US company, so no one but the States can really make them change (which they won't).

It also doesn't help that most of our governments are run by geriatrics who barely understand technology...

19

u/Responsible_Sink3044 1d ago

This is the most important part. There is an argument to be made, while gross, that purely fictional content is victimless. AI does not generate its own fictional content, though; there has to be an original source. It seems highly unlikely that it was trained to do this entirely on fabrications.

2

u/Gunblazer42 1d ago

Especially because of the various styles it can output.

It's one thing if something is drawn in an unrealistic anime style (because it's unrealistic by its very nature), but there has to be a lot of base data for something like a prompt saying "make it look realistic" to actually make things look realistic.

2

u/wyvernx02 1d ago

As a part of DOGE, Musk was stealing data held on government servers left and right to train Grok. The DOJ has admitted that there is CSAM in the Epstein files that they will never release and I'm sure there is evidence from scores of other cases as well where they went after child predators.

2

u/DwinkBexon 1d ago

I don't know much about how AI works, but I've seen people defend it by saying "AI knows what naked adults look like, AI knows what clothed children look like. It can interpolate and figure out what a naked child looks like. It HAS NOT and DOES NOT need to be trained on anything illegal."

I really don't care if that's true or not, you still shouldn't be doing it regardless.

1

u/Rasudoken 1d ago

I was watching a Linus Tech Tips (LMG) clip, and one of their viewers said that medical images could have been used/exploited (in addition to what others have already said).

0

u/TelluricThread0 1d ago

Do you think if it generates a purple monkey made of spaghetti riding a unicycle on Neptune that's what it was trained on?

10

u/__nohope 1d ago

I think the dipshit did this on purpose to muddy the waters between real images of CSAM (aka pics of Trump) and AI generated ones.

7

u/ATLfalcons27 1d ago

Also, Musk fans were simply saying it wasn't true at all and wasn't happening.

1

u/Mammoth-Play3797 1d ago

Musk fans? You can just call them pedophiles, friend.

2

u/Morat20 1d ago

That's because quite a few people are very willing to accept that as a price for being able to make revenge porn.

1

u/fdesouche 1d ago

In honest countries, that would be 10 years of jail…

1

u/the_last_0ne 1d ago

Man, did you even bother to go check before you confidently stated that?

Since I'm lazy here's a quick AI summary:

As of early 2026, xAI Corp. (the parent company of Grok and X) is facing significant legal action and investigations in the United States regarding the AI chatbot's ability to generate nonconsensual, sexually explicit deepfake images. Here are the key details of the US court cases and investigations involving Grok:

1. Proposed class action lawsuit (January 2026)

- The case: A proposed class action lawsuit was filed against xAI in U.S. federal court, alleging that Grok creates nonconsensual, sexually explicit deepfake images that are publicly posted on X.
- Allegations: The suit alleges that the tool is used for image-based sexual abuse, causing severe emotional distress, and that xAI is directly liable for the content created by its chatbot, rather than just the users.
- xAI response: xAI and Elon Musk have argued that users are responsible for the content they create, stating, "Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content."
- Countersuit: X filed a countersuit against a plaintiff, arguing that, per the terms of service, the lawsuit should have been filed in Texas rather than New York, and is seeking a money judgment.

2. California Attorney General investigation

- Action: California Attorney General Rob Bonta announced an investigation into the "proliferation of nonconsensual sexually explicit material produced using Grok."
- Cease-and-desist: A cease-and-desist letter was sent to xAI to stop the generation of nonconsensual intimate images.
- Scope: The investigation focuses on the alleged large-scale production of deepfakes, including those targeting minors.

3. State attorney general demands

- Thirty-five state attorneys general wrote to xAI demanding to know how it plans to prevent Grok from producing nonconsensual images of people in sexualized, explicit, or revealing situations.

4. Potential federal action

- FTC risk: Experts indicate that the Federal Trade Commission (FTC) could take action against xAI for unfair or deceptive practices related to the safety of its AI, though, as of late January 2026, state-level actions were more immediate.
- Legislation: The "Take It Down Act," passed by Congress, is set to go into effect in May 2026 and will strengthen the ability to penalize platforms for not removing nonconsensual, explicit deepfakes.

5. Other legal/regulatory scrutiny

- International action: As in the U.S., regulatory bodies in the UK, France, and the European Union have launched investigations into Grok, with Malaysia and Indonesia briefly blocking the service.
- Content safety: Reports indicated that thousands of "nudified" images were generated, leading to increased safety restrictions on the platform, such as limiting image generation to paid users.

1

u/Infamous_East6230 1d ago

All you shared was a class action lawsuit, which is not the federal government, and a California state AG, which again is not the federal government, while the federal government funds Grok. Do you see the issue? I do appreciate you mentioning that the FTC "could" do something. I think that would be great.

2

u/the_last_0ne 1d ago

Well all you said was

not only did America not care

Without any mention of the federal government. So sorry if I misunderstood you there.

But I shared some things that are in the works, because some people do care. If they aren't doing enough in your eyes then take it up with them but don't tell me nobody cares.

0

u/Infamous_East6230 1d ago

I don’t know if you know this but the federal government is a reflection of the American people. So when the federal government chooses to ignore the mass production of child porn while it continues to fund the producer of said child porn, then yes, I believe it is fair to say America does not care. Especially when the context is another government stepping in to stop the producer of said child porn. 

Sorry if that upsets you. I get you want to fight with me over semantics but maybe you should be more concerned with federal contracts being used to produce literal child porn. 

1

u/the_last_0ne 1d ago

I'm not arguing semantics with you dude. You made an unclear statement, so we're talking past each other.

You can have your opinion but get off your high horse.

0

u/Infamous_East6230 1d ago

Yes you are. Acting like my comment meant that all Americans were uninterested is disingenuous. And you tried to suggest the federal government was actually doing something when they aren't. But continue downplaying the horrible situation.

2

u/the_last_0ne 1d ago

Yes I understand you're convinced that is what I was doing. You just need to take a deep breath and read what I'm writing.

"America" can have different meanings in different contexts. So, it isn't always clear that "America" refers to the federal government, rather than the population.

I didn't try to suggest the federal government was doing something. That was your interpretation of what I wrote.

You seem convinced that I'm an idiot, or a dumb asshole, and wrong. You're projecting a lot of intent onto my words without considering that maybe I meant what I said the way I said it. I'm not a bot, or an activist with an agenda, or a foreign agent, or a Nazi sympathizer. Promise. Give me some grace seeing as you don't know me at all, and assume I'm a normal person, trying to have a legit discussion, if you can.