this post was submitted on 23 Jul 2023
4 points (100.0% liked)

Blahaj Lemmy Meta

2276 readers

Blåhaj Lemmy is a Lemmy instance attached to blahaj.zone. This is a group for questions or discussions relevant to either instance.

founded 2 years ago

Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to tell us you don't agree with how we are doing things, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they're not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.

I'm sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn't a line I'm willing to walk. I have defederated lemmynsfw and won't be reinstating it whilst that community is active.

top 50 comments
[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

the same community (adorableporn) is also on reddit btw with 2.2m subscribers.

i have no grand moral opinion on this type of content. for me it is the same as femboy content for example, where people also push for a youthful, girly aesthetic.

as long as the content is made by consenting verified adults, i don't care.

it's like adults cosplaying with japanese school uniforms or calling your partner "mommy" or "daddy".

probably not the best move in terms of sexual morals for sure, in the grand scheme of things tho this is just how people express their sexuality i guess.

[–] [email protected] 0 points 1 year ago (1 children)

it’s like adults cosplaying with japanese school uniforms or calling your partner “mommy” or “daddy”.

No, it's not, because no one mistakes those things for actual underage children.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

i had no problem distinguishing the models on the community from children.

maybe it's more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that's just human anatomy.

that's why the guy at the gas station asks for my ID card, because it is not always super clear. but apparently clear enough for reddit admins and PR people from ad companies.

i agree playing into the innocent baby aspect is probably not great for sexual morals and i wouldn't recommend this comm to a local priest or a nun, but this type of content thrives on pretty much every mainstream platform in some shape or form.

i get it, if this instance wants to be sexually pure and removed from evil carnal desires tho. that's kind of cool too for sure.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (2 children)

i had no problem distinguishing the models on the community from children.

You didn't see the content I saw. Content that was reported as CSAM by someone on this instance, who also thought it was CSAM.

maybe it’s more difficult in some cases without looking for the onlyfans link or sth similar of the model somewhere in the post, but that’s just human anatomy.

Again, a group focused on models for whom that is the only way you can tell they're not underage is a group focused on appealing to people who want underage models. That is a hard no.

Spin it how you like, but I am not going to be allowing material that is easily mistaken for CSAM.

[–] [email protected] 0 points 1 year ago (2 children)

I thought about this some more and I can feel a lot more sympathy for your decision now.

It must be horrible to get a user report about CSAM and then see a picture which really could be CSAM at first glance.

Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity.

It is great if admins from other instances are willing to deal with these horror reports, just to give their users a bigger platform, but this service is not something that can be taken for granted.

I'm sorry for coming across as ignorant, I just did not consider your perspective that much really.

[–] [email protected] 0 points 1 year ago (1 children)

I totally get that and definitely don't blame Ada for defederating (although I don't think it's likely it was actually CSAM, nor that the community it was on is Inherently Problematic, as long as everyone in the posts is 18+, people's kinks are none of my business).

The thing I don't get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all. That seems like a design flaw in Lemmy, instance mods have no power to moderate content on off-instance communities, so why would they be notified of reports? That seems like it would clutter mod-logs for no reason and cause unnecessary drama (as happened here). Like if every subreddit post report immediately went to the Site Admins, that would be Terrible.

Though if Lemmy really is built like this for whatever reason, I would probably have done the same thing. I wouldn't want to have to be Subjected to everything that could be reported on an NSFW instance, there's probably some Heinous Shit that gets posted at least Occasionally, and I wouldn't want to see all of it either. I just think it's Really Stupid that lemmy is built this way, we need better moderation tools

[–] [email protected] 1 points 1 year ago

The thing I don’t get is why reports made by blahaj users on lemmynsfw communities would go to the blahaj moderators at all.

Reports go to the admins of the instance the reporter is from, to the admins of the instance the reported account is from, and to the admins of the instance that hosts the community the post was made to. The report also goes to the moderators of the community that the content was posted to.

Each instance only gets a single report, however many of those boxes it ticks, and that report can be dealt with by admins or moderators.

However, the results federate differently based on who takes the action. For example, me deleting content from a lemmynsfw community doesn't federate; it just removes it from my instance. A moderator or an admin from lemmynsfw removing lemmynsfw content, on the other hand, will federate out.
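For anyone trying to picture that fan-out, here's a minimal sketch of the routing rule described above. It's a hypothetical illustration in Python, not Lemmy's actual (Rust) code, and all of the names in it are made up:

```python
# Hypothetical sketch of the report fan-out described above.
# Not Lemmy's real implementation; names are invented for illustration.

def instances_to_notify(reporter_instance: str,
                        reported_account_instance: str,
                        community_instance: str) -> set[str]:
    """Return the instances whose admins receive a copy of the report.

    Each instance gets a single report, no matter how many of the
    three roles (reporter's home, reported account's home, or the
    community's home) it happens to fill.
    """
    return {reporter_instance, reported_account_instance, community_instance}


# Example: a blahaj.zone user reports a post made by a lemmynsfw.com
# account in a lemmynsfw.com community. Each instance is notified once.
print(instances_to_notify("blahaj.zone", "lemmynsfw.com", "lemmynsfw.com"))
# -> {'blahaj.zone', 'lemmynsfw.com'}  (set order may vary)
```

And per the removal behaviour described above, a removal done by the reporter's own admins stays local, while a removal by the community's home instance federates out.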

[–] [email protected] 1 points 1 year ago (1 children)

"Even if every user report was wrong from now until infinity, that initial CSAM suspicion, because of the false user report, probably makes moderating just a soul-crushing activity."

Then they shouldn't be doing it. If seeing something that looks even slightly off-putting causes this level of over-reaction, Ada doesn't need to be moderating a community for marginalized/at-risk people. I myself am a CSA survivor, and seeing my trauma being equated to some legal adults playing pretend is fuckin' bullshit. Seeing my trauma being equated to drawn pictures is fuckin' bullshit. My trauma being equated to AI generated shit is fuckin' bullshit. I'll tell you one thing, as a CSA kid, one thing I cannot stand is someone making decisions on my behalf. To protect me. Fuck you, I'll fuckin bite anyone that tries to take away my free agency again.

[–] [email protected] 0 points 1 year ago (1 children)

I myself am a CSA survivor

FYI, so am I

[–] [email protected] 2 points 1 year ago (1 children)

Cool, welcome to the real world where one size does not fit all. We handle our trauma differently. But I don't subject others to my hangups. I don't use it as a cudgel to squash dissent. Your trauma is not your fault, but it is your responsibility, not ours, to deal with.

[–] [email protected] 0 points 1 year ago (1 children)
[–] [email protected] 1 points 1 year ago (1 children)

AKA you couldn't think of a response that didn't make you sound hateful. Look, I don't have anything against you personally, Ada. We probably agree on 99.9% of shit. But you are definitely not well suited to admin. And now all the trolls on the fediverse know exactly what legal content to spam your inbox with to make you uncomfortable. Emotional moderators make for short-lived communities.

[–] [email protected] 0 points 1 year ago (1 children)

I've been moderating and community building for literal decades. I think I'll be ok

[–] [email protected] 1 points 1 year ago

Well, I'll be here watching the flames rise.

[–] [email protected] 1 points 1 year ago (1 children)

Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick if not enough information had been provided for that, but fortunately I at least have never encountered CSAM.

I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least were, I'm not on platforms where these are popular).

Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.

That's why I didn't use pornhub, for example, before every user had to verify themselves in order to post. Before that I only read erotica or looked at suggestive drawings.

I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that without context could be CSAM could make this volunteer work very mentally taxing. This is how NSFW works tho.

Without context, any pornographic material featuring real humans could in truth be some piece of evidence for a horrible crime.

[–] [email protected] 0 points 1 year ago (1 children)

Context always matters. I always check if adult material has actually been made by consenting adults. I would feel sick if not enough information had been provided for that, but fortunately I at least have never encountered CSAM.

If I can't tell, if I have to look something up because the people I'm looking at look like they're underage, then it doesn't matter what the answer is, because the issue is that it looks like CSAM even if it's not. And a community designed in a way that attracts people looking for underage content is not a space I'm willing to federate with.

[–] [email protected] 1 points 1 year ago (1 children)

Isn't it kind of shitty to tell an adult woman she can never be attractive or sexy because she looks too young? Do you truly believe that said person should never be allowed to find love, because it's creepy? Is she supposed to just give up because you think her body is icky?

[–] [email protected] 0 points 1 year ago (1 children)

I've covered this many times already.

The issue isn't individuals that happen to look younger than they are. The issue is with a community gathering sexual content of people that appear to be children.

The community that initiated this isn't even the worst offender on lemmynsfw. There is at least one other that is explicitly focused on this.

[–] [email protected] 1 points 1 year ago

So we can rely on you to ban any twink community on this instance, right? Cause the whole idea behind twinks is looking smooth, young, and pubescent. So it is a community that glorifies boys that look underage. You feelin icky about that one? Or is that "different"

[–] [email protected] 0 points 1 year ago (1 children)

I feel like the people getting upset over this are leaning on these hypotheticals of "young looking adults just wanting to be able to make porn equally" and "technically the community did nothing wrong".

The problem is that just ignores the fact that pedophiles would definitely use communities like that as a "foot in the door", and such a community would naturally have a lot of closeted pedophiles. The issue isn't young looking adults making porn, the issue is that a community based around the youngest possible looking adults is naturally gonna attract and encourage pedophiles.

It's like they say, "all it takes is allowing one nazi in your bar for it to rapidly turn into a nazi bar".

[–] [email protected] 1 points 1 year ago

I mean yeah, but I think the solution here is just age verification. If you're posting nsfw OC, you should have to verify age with mods, and if you're posting nsfw from online, you should be able to prove they're of age if prompted (like, if it's a famous pornstar, they should be verified on pornhub or onlyfans or something so it's easy to check whether they're of age).

Like, I have small tits, I'd like to be able to post nsfw without people insinuating I'm pedo-baiting or that people attracted to me are intrinsically pedophilic. Just have strictly enforced age-gates and ban anyone being creepy

[–] [email protected] 2 points 1 year ago (2 children)

I guess Trans Littles can just go fuck off then? One of the biggest Trans comics artist is openly a little. Why are we in the business of regulating what consenting adults do?

[–] [email protected] 0 points 1 year ago (1 children)

No one is looking at a little and thinking that they're physically 15.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

I wrote a comment but got more aggressive than I intended. My overall point though is there are young looking adults, there are old looking kids. Making a sweeping statement like you did is just wrong

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Young looking adults also aren't the issue.

The issue is a community that focuses heavily on models that are framed to look like they're not adults.

Not adults roleplaying. Not adults that incidentally happen to look younger than they are.

[–] [email protected] 1 points 1 year ago (1 children)
[–] [email protected] 1 points 1 year ago (1 children)

Again, the issue is a community with models that are framed to look like they're not adults.

There is no scenario where something that can be mistaken for CSAM will have a space here.

[–] [email protected] 1 points 1 year ago

And again, these are adults on an instance that was explicitly designated for NSFW works. Defederating was entirely within your right but these justifications seem really poorly thought out, and could have unintended consequences.

Should we shun non-consensual play? Should we defederate from anything that shows BDSM? Because I can't see any reason why your justifications wouldn't apply to them.

[–] [email protected] 0 points 1 year ago (1 children)

Don’t be disingenuous. Genuine consent practices also consider that not everyone else consents to witnessing their play, so they don’t do it where it’s not welcomed. And it’s not welcomed on Blahaj Zone, in this case. That’s all.

[–] [email protected] 1 points 1 year ago

Excuse me, but you're the one being disingenuous. An NSFW instance had what!? Porn!? Stop the fucking presses. Are we going to defederate from all porn instances or just the ones you find icky? Where can I post my objection to having to be subjected to porn at all?

[–] [email protected] 0 points 1 year ago (1 children)

I am very disheartened by the number of people replying here who read “a community skirting the line trying to look like CSAM” and felt the need to go purposefully seek out that community to look through its images.

[–] [email protected] 1 points 1 year ago

Probably because the community in question isn't trying to "skirt the line" and just posts popular pornstars that range from 18 to the mid twenties. I thought it was a kink community until someone finally linked the lemmynsfw post and it's actually just a community for cute pornstars.

Calling it CSAM-adjacent just means that nobody's comfortable actually looking at it to figure out what's going on, and it's hugely exaggerated.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (2 children)

I think both instance admins have a valid stance on the matter. lemmynsfw appears to take reports very seriously and if necessary does age verification of questionable posts, something that likely takes a lot of time and effort. Blahaj Lemmy doesn't like the idea of a community that's dedicated to "adults that look or dress child-like". While I understand the immediate (and perhaps somewhat reactionary) concern that might raise, is this concern based in fact, or in emotion?

Personally I'm in the camp of "let consenting adults do adult things", whether that involves fetishes that are typically thought of as gross, dressing up in clothes or doing activities typically associated with younger ages, or simply having a body that appears underage to the average viewer. As the lemmynsfw admin mentioned, such persons have the right to lust and be lusted after, too. That's why, as a society, we decided to draw the line at 18 years old, right?

I believe the concern is not that such content is not supposed to exist or be shared, but rather that it's collected within a community. And I think the assumption here is that it makes it easy for "certain people" to find this content. But if it is in fact legal, and well moderated, then is there a problem? I don't believe there is evidence that seeing such content could change your sexual preferences. On the other hand, saying such communities should not exist could send the wrong message, along the lines of "this is weird and should not exist", which might be what was meant by "body shaming".

I'm trying to make sense of the situation here and possibly deescalate things, as I do believe lemmynsfw's approach to moderation otherwise appears to be very much compatible with Blahaj Lemmy. Is there a potential future where this decision is reconsidered? Would there be some sort of middle ground where admins from both instances could meet and come to an understanding?

[–] [email protected] 0 points 1 year ago (2 children)

is this concern based in fact, or emotion?

Ada was clear in another comment thread that yes, emotion was absolutely involved in her decision. That isn’t a bad thing. Why is there a social attitude that decision-making is only valid if it’s cold and unfeeling?

Personally I’m in the camp of “let consenting adults do adult things”

Me too. I don’t think anyone is arguing against that. Anyone can still access LemmyNSFW’s content elsewhere, Blahaj Zone simply isn’t going to relay it anymore because some of it is incompatible with Ada’s goals in nurturing this community.

But if it is in fact legal, and well moderated, then is there a problem?

Yes. Legality has nothing to do with acceptability. This instance already bans lots of content that doesn’t actually violate any laws. It’s a judgment call.

[–] [email protected] 1 points 1 year ago (1 children)

The reason I brought up emotion in my reply was because I've felt that the lemmynsfw admins have been able to explain their decision quite reasonably and seemed to be open to conversation, whereas Ada was set on one goal and, upon finding disagreement, wasn't in the right mindset to continue a constructive conversation. Which, to be fair, due to the nature of the content, is understandable.

If the content that the Blahaj Lemmy admins are concerned about is limited to certain communities, and part of the issue is the concentration of content in said communities in the first place (at least, as I speculated in my original reply), then I don't quite understand why blocking these communities only isn't something that was considered, rather than defederating the entire instance. I do respect Blahaj Lemmy's decision not to want to host such content. Or is there some technical limitation that I'm not aware of?

[–] [email protected] 1 points 1 year ago (1 children)

I don’t quite understand why blocking these communities only isn’t something that was considered, rather than defederating the entire instance

Because I am not ok federating with a space that is ok with content that looks like CSAM. "It's technically legal" isn't sufficient in this case.

[–] [email protected] 2 points 1 year ago (1 children)

But whether it's technically legal is exactly what does or doesn't make it CSAM. "Looking like" is going to be highly subjective, and I don't understand how the admins of the other instance are supposed to handle reports, other than to verify whether or not it actually is the case.

Are petite looking people not supposed to make explicit content while dressing up cute? Should a trans man not share explicit pictures of himself, because he might look like an underage boy? Do we stop at porn that gives the appearance of someone being young? What about incest or ageplay? Like, what if you or someone else was made sufficiently uncomfortable by some other kind of porn? How do you decide what is and isn't okay? How do you avoid bias? What would you be telling a model when they ask why you removed their content?

Apologies for going on with this when I'm sure you're already sick of dealing with this. I had just felt like some of the points I brought up (like in my original reply) were entirely overlooked. Putting effort into an (attempted) thought-out reply doesn't mean I get to receive a response I was hoping for, but I was at least hoping for something you hadn't already said elsewhere.

[–] [email protected] 1 points 1 year ago

but I was at least hoping for something you hadn’t already said elsewhere.

There is no more to this. I don't have a list of endless reasons.

The reason is that it looks like CSAM and appeals to folk looking for CSAM. I'm a CSA survivor myself. A space that appeals to folk looking for CSAM isn't a community that I'm willing to share space with.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (1 children)

Why is there a social attitude that decision-making is only valid if it’s cold and unfeeling?

Probably because everyone agrees that we don't make the best decisions when emotional? In fact we tend to make our worst decisions when emotional? There's a pretty significant difference between society judging people for being emotional, and society disapproving of emotional decisions. Because people making significant choices when they aren't thinking clearly is pretty obviously a bad idea.

Yes. Legality has nothing to do with acceptability. This instance already bans lots of content that doesn’t actually violate any laws. It’s a judgment call.

And yet teen porn is one of the most popular categories around. This sounds like a subcategory confined to a single community, and precisely what the block function is for. There's a pretty big difference between Exploding Heads and a single disliked community.

Edit: After finally seeing a link to the lemmynsfw discussion, it's not a kink community or anything fringe. It's literally a community around cute pornstars.

[–] [email protected] 1 points 1 year ago (1 children)

Yeah, see, it’s that conflation of “emotional” and “not thinking clearly” that bothers me. Those aren’t the same thing, despite the dominant cultural narrative to the contrary. Sometimes they go together, sometimes they don’t.

[–] [email protected] 1 points 1 year ago

Are they not..? I mean, thinking clearly and intense emotions genuinely don't go together. Crimes of passion, riots after sports games, getting "carried away" in the heat of the moment. Temporary insanity being an actual legal defense.

There's a reason that a lot of good advice on handling intense emotions is all about taking a minute to step back and breathe, clarify what you're feeling, accept it, and then express it safely. There's nothing wrong with being emotional, but arguing that there's nothing wrong with making decisions while emotionally charged is just really not a good idea. The fact that the acronym for managing intense emotions is STOPP should be a bit telling.

[–] [email protected] 1 points 1 year ago (1 children)

Reminds me of a lot of the debates around kink at pride/ddlg kink stuff. The latter is really not my thing and makes me uncomfortable, but I recognise that that's a personal thing between me and my partners that I can't, and shouldn't, police among others.

There are also ethical debates to be had on porn in places like Lemmy/pornhub/etc. -- we can't know that the person has consented to being posted, or that they have recourse to get it taken down and stop it being spread if they do not.

Then there's the realpolitik of, regardless of ethics, whether it's better to have porn of this type in visible, well moderated communities, or whether it's better to try to close off ethically dubious posting.

It's one I don't really have squared off in my head quite yet. Similarly with kink at pride; I've read about the historic importance of kinksters and recognise that, but at the same time I want there to be a space where queer kids can be involved with pride without being exposed to kink. Is that just prudish social norms talking? Idk; I'm still working it through.

[–] [email protected] 1 points 1 year ago

For what it's worth, I feel like while society has become more socially accepting of people being different (imperfectly, but we have), at least in the US we've become more and more prudish when it comes to sex itself. The changing era has led to a reduction in exploitation and in behavior that used to be viewed as sketchy but not that big of a deal (kids inheriting porn mags, sexual harassment, imbalances in power); now sketchy behavior is quickly called out.

That said, I feel like a lot of hard conversations have been completely avoided because they'd be awkward and uncomfortable and instead we just pretend they aren't there.

Like in theory, anyone under 18 in the US can't legally see so much as a titty (unless it's art), read sexually explicit material, or see a movie or tv show with explicit content. And then, literally nobody wants to talk to teenagers about sex. I watched a reddit thread eat itself alive because a dad was furious that his wife had bought their daughter a dildo; he had confiscated the daughter's laptop after catching her looking at them and had asked his wife to deal with it. People were calling for the mom to be reported for sexual abuse, while actual women were being attacked for sharing their own experiences as teens. Things just seem a little crazy.

People are so uncomfortable with the concept that they want to disappear anything that reminds them that 18 isn't actually a magical division between childhood and adulthood. And then you have this thread, where lemmynsfw was banned because a community sharing "cute" pornstars, all of them actual professional adults, was a step too far. Idk, it seems exactly like Australia's whole thing where they started banning pornstars in their late twenties for having small tits, as part of a project to "fight" child porn.
