Hello World,
following feedback we have received in the last few days, both from users and moderators, we are making some changes to clarify our ToS.
Before we get to the changes, we want to remind everyone that we are not a (US) free speech instance. We are not located in the US, which means different laws apply. As written in our ToS, we are primarily subject to Dutch, Finnish and German law. Additionally, it is at our discretion to further limit discussion that we don't consider tolerable. There are plenty of other websites out there hosted in the US that promote free speech on their platforms. You should be aware that even free speech in the US does not cover true threats of violence.
Having said that, we have seen a lot of comments removed with reference to our ToS that were not actually intended to be covered by it. After discussion with some of our moderators, we have determined that there is both some ambiguity in our ToS and a lack of clarity about what we expect from our moderators.
We want to clarify that, when moderators believe certain parts of our ToS do not appropriately cover a specific situation, they are welcome to bring these issues to our admin team for review, escalating the issue without taking action themselves when in doubt. We also allow for moderator discretion in many cases, as we generally don't review each individual report or moderator action unless it is specifically brought to admin attention. This also means that content permitted by the ToS can still violate community rules and therefore result in moderator action. We have added a new section to our ToS to clarify what we expect from moderators.
We generally aim to avoid content that organizes, glorifies or suggests harming people or animals, but we are limiting the scope of our ToS to building the minimum framework within which we can all have discussions, leaving a broader area for moderators to decide what is and isn't allowed in the communities they oversee. We trust the moderators' judgement, and in cases where we see gross disagreement between moderators' and admins' criteria, we can have a conversation and reach an agreement, as in many cases the decision is case-specific and context matters.
We have previously asked moderators to remove content relating to jury nullification when it was suggested in the context of murder or other violent crimes. Following a discussion in our team, we want to clarify that we are no longer asking moderators to remove content relating to jury nullification in the context of violent crimes when the crime in question has already happened. We will still consider suggestions of jury nullification for crimes that have not (yet) happened to be advocacy of violence, which violates our terms of service.
As always, if you stumble across content that appears to violate our site or community rules, please use Lemmy's report functionality. Especially when threads are very active, moderators will not be able to review every single comment. Reporting content and providing accurate reasons for reports helps moderators deal with problematic content in a reasonable amount of time.
People in the US have a justifiable revulsion to its rapacious healthcare system, which leads to the outright un-aliving of a large segment of the population. One might argue that it's a silent genocide of the underprivileged. This incident has highlighted that sentiment in a way that may effect real change, and in that sense his untimely demise may lead to positive health outcomes. Suppressing the expression of that anger could have the opposite outcome.
Okay, but to be clear: the admins panicked against the very real possibility of police shutting them down, and took a moment to make certain that Lemmy.World can continue to exist to serve this or any needs of people across the world. The bans have already expired, the mod who did it apologized and said that no new ones would be forthcoming, the ToS have been clarified, etc. Yes there was "suppression", but for like 1-3 days, and it's already over?
Unless you mean that people should be free to advocate for future murders, and I would argue that there are other (e.g. anarchy) instances for that, but Lemmy.World is free to do as they please, and to restrict such on their own hardware.
I am saying that "suppression" seems too harsh a word here imho, when the ToS now clearly delineates the line between what is and is not allowed on the LW instance, and thereby in the communities located on it. Isn't that a success then, to define the parameters within which the instance is allowed to discuss these matters (again, with police being a very real external factor that definitely exists - and can come down HARD on those who would want to FAFO)? So kinda the opposite of "suppression", then? Well, according to some manner of using the term, at any rate - not everything is allowed, but definitely not nothing along these lines either.
Like, surely you've seen the veritable FLOOD of posts and comments in just about every community imaginable across the entire Fediverse lately, alternately either promoting this guy as a hero or decrying him as a terrorist? That's not "suppression" in my book - again, the jury nullification matter was, days ago, but that's already over? The nuances here are important, b/c we cannot have true freedom without taking responsibility for keeping this place alive & kicking & not shut down.
Has this "very real possibility" happened to any social media platform in America?
(1) irrelevant, bc I was discussing their fears, but really the admins can do whatever they wish, at any time, for any reason. We are free to cry about it, or leave, but it's theirs to do with whatever they will.
(2) Fox News in the USA had to pay a nearly billion-dollar settlement for its misinformation. OAN was shut down. More importantly, CSAM exists and people fear having it on their servers; regardless of the direct ethical implications, there are some extremely strong legal ones as well. Piracy websites likewise get taken down constantly, even for "only" sharing links to where content is hosted "elsewhere". I am sure that if someone legitimately wanted to know the direct answer to your question (not me), they could perhaps spend some time searching for it?
But again, aside from you changing the subject of this conversation, it's irrelevant in the first place, bc the fear itself is real.
This is not TikTok, btw.
You're not wrong about it being a genocide of undesirables.
Please don't give centrists another reason to root for CEO's.
Fucking burnnnnn