this post was submitted on 08 Jun 2024

Lemmy.ca's Main Community


The sole moderator doesn't even follow their own rules: https://lemmy.ca/post/22741340?scrollToComments=true

I'll just say it - it's a Russian propaganda community. Is there any reason this community needs to exist on Lemmy.ca? Is there a rule against blatant astroturfing / propaganda / misinformation? I don't think the 5 rules in the sidebar are going to be enough to stop an army of trolls:

> 1. No bigotry - including racism, sexism, ableism, homophobia, transphobia, or xenophobia.
> 2. Be respectful. Everyone should feel welcome here.
> 3. No porn. Use the NSFW tag when needed.
> 4. No Ads / Spamming.
> 5. Bot accounts need to be flagged as such in their settings.

Maybe time to get ahead of it?

[–] [email protected] 0 points 5 months ago (8 children)

Your past comments about anti-censorship have helped me form my thoughts about what makes the most sense on Lemmy. Banning certain ideas here that aren't hateful or malicious is not a decision that should be taken lightly.

Reminding people that they can block communities and users themselves is a good idea.

I imagine there's a pre-existing taxonomy for online moderation, one that distinguishes misinformation from low-quality posts from hate speech, that we might be able to use. If I find something useful, I'll share it.

I don't know if there's any value to this, but would a stickied thread about the community in question, where people can hopefully describe their objections a little better, help? Like, what beyond being pro-Russian and having low-quality information do you find problematic about the community? Is deception involved? If the content is just YT videos that others (i.e., not the community mod) have created, I don't see that as deception. Even though I would block it, I wouldn't be in favour of banning a pro-Donald Trump community (as an example).

If 50, 75, or 90% of users on an instance block a community, would that hinder moderators' ability to catch instances of misinformation that do pop up? (I'm just asking because some have raised that as a concern, and I don't know whether it's legitimate.)

[–] [email protected] 0 points 5 months ago (3 children)

> Like, what beyond being pro-Russian and having low-quality information do you find problematic about the community?

All I'm going to say is: does having this community/content around on Lemmy.CA (the de facto Canadian instance) make this a better community? Is it going to attract the audience we want on the site? Is this the type of content we want to expose that audience to?

[–] [email protected] 0 points 5 months ago (1 children)

I don't think the lemmy.ca admins or most of its users want the instance to take on the responsibility/experience of being an instance where there's a prescribed view of acceptable and unacceptable (banned) content, above and beyond objectively objectionable material. Curb appeal as an argument doesn't sway me. But if curb appeal, or who we're attracting, is a concern, I'd point out that most of the posts in that community are heavily downvoted, so to some extent Lemmy's existing checks and balances are working as intended to limit newcomers' exposure to a less popular community.

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago)

I can respect this take. I do worry that burying problematic content isn't enough these days, though. Even if only 2% of this site's visitors see the content, all it takes is one person believing there's a demonic child-trafficking ring, and then you have someone shooting up a pizza joint. Not everyone who uses the internet has all their faculties, and I think that's an argument for going further than just burying the content. (I suspect we'll start seeing more pressure on YouTube and Facebook to go further than they have with problematic content like this, too.)

Edit: I also think that as platforms have become stricter about their community guidelines, the effectiveness of grand, overt disinformation campaigns has diminished, so bad actors are shifting to subtler, softer campaigns.
