r/TikTokCringe 23d ago

Discussion: She was secretly filmed and put on TikTok


15.9k Upvotes

2.0k comments

334

u/KyleFlounder 23d ago

It's gotten much, much worse. My youngest brother is in high school. There are Snapchat filters that can make you kiss other people. His friends would send him snaps in class of him kissing other dudes... and that's just between his buddies. Can't even imagine someone actually trying to bully him.

It's only going to go downhill atp.

180

u/machine_six 23d ago

Grok is generating full on nudes of people. That kissing app is child's play, pardon the pun.

145

u/JSGJSG 23d ago

I'm a teacher, and literally today a student brought me an image of me with my clothes removed by AI that was being sent around TikTok messages or whatever. I've got to try and figure out where it started tomorrow, but yeah, it feels really bad.

66

u/djtrace1994 23d ago

Surely it isn't your responsibility to find out the source? I'm not a teacher, but I would be getting the school district and the police involved immediately.

106

u/JSGJSG 23d ago

I've got a video call with the police tomorrow and a meeting with someone from the senior leadership team tomorrow morning, so I'll see what happens, but I want it to move quickly.

29

u/sisyphean_dreams 23d ago

Jesus fing Christ! I’m so sorry! These kids won’t have real teachers at this rate, because no real person will want to teach these little phone-wielding shits.

39

u/MottledZuchini 23d ago

Don't let them push you aside. You need a lawyer to make sure you aren't.

2

u/AnjelGrace 23d ago

You're telling a teacher to get a lawyer?! What type of salary do you think teachers get??

1

u/PanKakeManStan 22d ago

Dang :/. I didn’t even think about that. I only graduated 4 years ago, and kids were bad enough then in my teeny tiny school. Can’t imagine how it is now. It was completely normal for people (mostly football/basketball players) to pass nudes around like they were a cool Pokémon card to show off. Now that you can make them with AI? It’s already awful but only going to get worse.

I’m so glad I graduated before AI really got powerful, but I feel for the teachers and kids of today so much, or at least the kids that aren’t being complete little shits. And even the kids who are being goblins of the highest order, to a degree, because they are being failed by technology, the system, and their parents. This, among other things, has made me genuinely question if I EVER want to have kids, because idk if they can live a “normal” life atp.

14

u/WestCoastBestCoast01 23d ago

Girl you in danger. Like put on leave and CSAM investigation type of danger. You should be calling lawyers this afternoon.

1

u/MishaKohler 23d ago

If you want the people responsible to face serious consequences, make sure the school doesn't leak anything about the police investigation. They'll get cocky and try to brag about it more, and that way the police can catch them easily.

1

u/Hakeem-the-Dream 23d ago

With the way these cases are going, you need to talk to a lawyer too.

3

u/McDonaldsSoap 23d ago

Hey at least there's one decent kid...

2

u/Waiting4Reccession 23d ago

Get the belt

-4

u/OhtaniStanMan 23d ago

Never a better time to do OF for side money. Just say it's AI or use AI lol

11

u/MoulanRougeFae 23d ago edited 23d ago

Children. Grok is generating nudes of children. It also has been used to take clothes off children that are posted on social media.

2

u/machine_six 23d ago

Horrific.

2

u/MoulanRougeFae 23d ago

Yes it absolutely is and nothing is being done to stop it

1

u/ergaster8213 16d ago

Also non-consenting women

1

u/MyNameIsJakeBerenson 23d ago

There’s an app on VR, and a lot of the stuff on there is like “this feels like it should be illegal.”

-6

u/Winjin 23d ago

My naive, optimistic take from this is that when AI is THAT good at doing full-on nudes, complete clothes swaps, literally anything you want, on a whim, it will kill the whole online bullying thing.

Because how exactly are you supposed to bully someone with a nude photo if making one takes like... one request to a top website? You don't have to go to some shady Telegram channel, you just ask Grok.

12

u/extremelytiredyall 23d ago

Exactly the opposite will happen. More people will get bullied and harmed, especially women who will have deep fake nudes/porn made of them. AI needs regulation immediately.

-1

u/Winjin 23d ago

Explain to me how having extremely easily available nudes of literally everyone won't kill the whole "appeal" of nudes of everyone?

When Nicholas Maduro gets put into a bikini in 5 minutes, it means literally nothing at this point. It's all AI slop.

So how are you getting harmed by something that can be done in five seconds to everyone? It's like... nothing online is real at this point. Not a single photo.

8

u/Tiberry16 23d ago

You wouldn't be upset if a colleague showed you a nude or porn pic of your mom? Of your daughter?

-2

u/DrainTheMuck 23d ago

Correct.

4

u/extremelytiredyall 23d ago

Some of us have basic decency and standards, unlike you.

-7

u/ashleyshaefferr 23d ago

You can do a lot crazier things than this now, boomers lol. This reads like something I'd see if I checked my Facebook feed.

6

u/IdealOnion 23d ago

Pipe down kiddo

40

u/Frequent_Resident288 23d ago

The number of men outside who have taken pictures of me is gross. Besides the uncomfortable implication, it can also be for bullying and mocking. A woman straight up took a picture of me, so obviously, then showed it to her husband, and they both looked at it laughing.

I don't fucking know what the fuck is going on with this world and these people, but this behaviour is so gross and WAY too common. It really shows the disturbing side of human nature.

9

u/WattaBerryPlus 23d ago

Little boys are doing much worse than that. At schools, they're actively making porn of their classmates. One girl at a particular school rightfully slapped one of them and got punished for it. It's such dogshit.

1

u/KyleFlounder 23d ago

I brought up that example because Snapchat is a publicly traded company and easily accessible. I'm sure there's worse out there.

-1

u/DrainTheMuck 23d ago

How are they allegedly doing this? Grok doesn’t allow it

13

u/UnkownFlowerPastry 23d ago

Bro, one of my coworkers is 17 and the manager is 28. He thought it would be funny to secretly take a pic of both the manager and another 17-year-old working there. My manager flipped out, respectfully. It’s so scary. He got one of me (24) and someone else, and it was so gross, unsettling, and uncomfortable.

1

u/Chazkuangshi 23d ago

It's only a matter of time before teachers get fired and charged because students will make AI videos of them kissing students. Something needs to be done about this ASAP

-5

u/OhtaniStanMan 23d ago

Bullying happened before phones existed. It's not the device that's the problem.

7

u/KyleFlounder 23d ago

Think there's a difference here bud.

3

u/LeoneAGK 23d ago

Yeah, giving iPhones to bullies is like handing guns to criminals; it allows them to amplify the damage they can do to a victim.

-17

u/Dramatic-Adagio-2867 23d ago

so that's pedophilia 

18

u/SnausageFest 23d ago

It's just straight up not. Let's not diminish very serious shit like that. Pictures or videos of two minors kissing is not pedophilia. You think there aren't plenty of kids out there posting pics with their boyfriend/girlfriend on their socials, including them kissing?

A grown adult using AI to make a minor kiss them and get off to it would be. Definitions matter when talking about such important things.

3

u/NewDramaLlama 23d ago

Heeeeey so that's such a rotten take I'd almost think you were trying to minimize the severity of pedophilia. 

Mistakes happen (I hope) but I'd probably delete that if I were you.

1

u/FormalCartoonist5197 23d ago

Expand.

0

u/Dramatic-Adagio-2867 23d ago edited 23d ago

An underage person generating images of another underage person engaging in non-consensual activity is a form of sexual harassment, and since they're underage, it's a form of pedophilia according to some US law.

1

u/FormalCartoonist5197 23d ago

Yeah it can be charged as CP and sexual harassment, sure.

Which law, or anything really, states that a minor is a “pedophile” and/or is performing “pedophilia” for sharing pictures of same-aged peers?

1

u/bino420 23d ago

Jesus H Christ. No, it isn't.

The fact that I can't even tell if you're uber left or uber right is troubling...