773
submitted 1 year ago* (last edited 1 year ago) by ada to c/main

Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to offer your opinion on the fact that we do things in a way you don't agree with, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw, and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they're not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.

I'm sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn't a line I'm willing to walk. I have defederated lemmynsfw and won't be reinstating it whilst that community is active.

[-] kardum 37 points 1 year ago* (last edited 1 year ago)

i had no problem distinguishing the models on the community from children.

maybe it's more difficult in some cases without looking for the onlyfans link or something similar from the model somewhere in the post, but that's just human anatomy.

that's why the guy at the gas station asks for my ID card, because it is not always super clear. but apparently clear enough for reddit admins and PR people from ad companies.

i agree playing into the innocent baby aspect is probably not great for sexual morals and i wouldn't recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.

i get it, if this instance wants to be sexually pure and removed from evil carnal desires tho. that's kind of cool too for sure.

[-] ada 17 points 1 year ago* (last edited 1 year ago)

i had no problem distinguishing the models on the community from children.

You didn't see the content I saw. Content that was reported as CSAM by someone on this instance, who also thought it was CSAM.

maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.

Again, a community focused on models where that is the only way you can tell they're not underage is a community focused on appealing to people who want underage models. That is a hard no.

Spin it how you like, but I am not going to be allowing material that is easily mistaken for CSAM.

[-] kardum 17 points 1 year ago

I thought about this some more and I can feel a lot more sympathy for your decision now.

It must be horrible to get a user report about CSAM and then see a picture which could really be CSAM at first glance.

Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.

It is great if admins from other instances are willing to deal with these horrific reports, just to give their users a bigger platform, but this service is not something that can be taken for granted.

I'm sorry for coming across as ignorant; I just hadn't really considered your perspective.

[-] NuMetalAlchemist 23 points 1 year ago

"Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity."

Then they shouldn't be doing it. If seeing something that looks even slightly off-putting causes this level of over-reaction, Ada doesn't need to be moderating a community for marginalized/at-risk people. I myself am a CSA survivor, and seeing my trauma being equated to some legal adults playing pretend is fuckin' bullshit. Seeing my trauma being equated to drawn pictures is fuckin' bullshit. My trauma being equated to AI generated shit is fuckin' bullshit. I'll tell you one thing, as a CSA kid, one thing I cannot stand is someone making decisions on my behalf. To protect me. Fuck you, I'll fuckin bite anyone that tries to take away my free agency again.

[-] ada 8 points 1 year ago

I myself am a CSA survivor

FYI, so am I

[-] NuMetalAlchemist 17 points 1 year ago

Cool, welcome to the real world where one size does not fit all. We handle our trauma differently. But I don't subject others to my hangups. I don't use it as a cudgel to squash dissent. Your trauma is not your fault, but it is your responsibility, not ours, to deal with.

[-] ada 5 points 1 year ago
[-] NuMetalAlchemist 14 points 1 year ago

AKA you couldn't think of a response that didn't make you sound hateful. Look, I don't have anything against you personally, Ada. We probably agree on 99.9% of shit. But you are definitely not well suited to admin. And now all the trolls on the fediverse know exactly what legal content to spam your inbox with to make you uncomfortable. Emotional moderators make for short-lived communities.

[-] ada 8 points 1 year ago

I've been moderating and community building for literal decades. I think I'll be ok

[-] NuMetalAlchemist 7 points 1 year ago

Well, I'll be here watching the flames rise.

this post was submitted on 23 Jul 2023

Blahaj Lemmy Meta

2353 readers

Blåhaj Lemmy is a Lemmy instance attached to blahaj.zone. This is a group for questions or discussions relevant to either instance.

founded 2 years ago