this post was submitted on 01 Feb 2025

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 2 years ago

a place for it

[–] [email protected] 0 points 6 days ago (1 children)
[–] [email protected] 0 points 5 days ago* (last edited 5 days ago)

fuck man, this was bad enough that people outside the sneerverse were talking about this around me irl

[–] [email protected] 0 points 2 weeks ago
[–] [email protected] 0 points 2 weeks ago (2 children)
[–] [email protected] 0 points 1 week ago

Reached by the Chronicle, Yudkowsky said he would not read the letter for the same reason he refused to read the manuscript of the Unabomber, Ted Kaczynski.

Not even the MST3k version?

https://groups.google.com/g/alt.tv.mst3k/search?q=MiSTing%3A+Unabomber+Manifesto

[–] [email protected] 0 points 1 week ago (1 children)

Well now I am screaming about this. I am trying very, very hard to do it in a way that you won’t immediately dismiss - Audere

I'm not gonna read that - EY

also lol the amount of rat sanewashing

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (2 children)

They didn't care much for marine life when they abandoned a tugboat, leaving it to leak fuel and crap into the sea:

https://archive.ph/1UcWZ

[–] [email protected] 0 points 1 week ago

lol they wanted to grow algae or bacteria for food while their living quarters were slightly dirtier than an average frat house. that would go well

[–] [email protected] 0 points 1 week ago

how green of them

[–] [email protected] 0 points 2 weeks ago (1 children)

Almost nostalgic to see a TREACLES sect still deferring to Eliezer's Testament. For the past couple of years the Ratheology of old, with the XK-class end-of-the-world events and the alignment of AI, has been sadly^1^ sidelined by the even worse phrenology and nrx crap. If not for the murder cults and sex crimes, I'd prefer the nerds reinventing Pascal's Wager over the JAQoff lanyard nazis^2^.

1: And it being sad is in and of itself sad.

2: A subspecies of the tie nazi, adapted to the environmental niche of technology industry work.

[–] [email protected] 0 points 2 weeks ago (1 children)

A subspecies of the tie nazi

OBJECTION! Lanyard nazis include many a shove-in-a-locker nazi

[–] [email protected] 0 points 2 weeks ago

Counter-objection: so do all species of the nazi genus.

[–] [email protected] 0 points 3 weeks ago (1 children)
[–] [email protected] 0 points 2 weeks ago

Thanks for the link.

I forgot how frustrating these people are. I'd love to read these comments but they're filled with sentences like:

I take seriously radical animal-suffering-is-bad-ism[1], but we would only save a small portion of animals by trading ourselves off 1-for-1 against animal eaters, and just convincing one of them to go vegan would prevent at least as many torturous animal lives in expectation, while being legal.

Just say "I think persuading people to become vegan is better than killing them"?

Why do you need to put a little footnote[1] to some literal fiction someone wrote about human suffering to make a point?

Screw it, here's that footnote:

For a valid analogy between how bad this is in my morality and something that would be equally bad in a human-focused morality, you can imagine being born into a world with widespread human factory farms. Or the slaughter and slavery of human-like orcs, in case of this EY fiction [link omitted].

Exqueeze me? You have to resort to some shit somebody made up to talk about human exploitation?

[–] [email protected] 0 points 3 weeks ago (1 children)

Lots of discussion on the orange site post about this today.

(I mentioned this in the other sneerclub thread on the topic but reposted it here since this seems to be the more active discussion zone for the topic.)

[–] [email protected] 0 points 3 weeks ago

came here to post this!

I loved this comment:

=====

[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]

The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

As relevant here:

  1. While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...

  2. Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.

  3. Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.

  4. The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:

  5. It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)

TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
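Point 1 in the quoted comment is just geometric decay of confidence: chaining "almost certain" steps multiplies their probabilities, so a long enough chain of 95% implications is barely better than a coin flip. A minimal sketch (the function name is illustrative, and the independence assumption is the best case):

```python
# If each implication in a chain holds with probability p, and the
# steps are independent, the whole chain holds with probability p**n.
# Correlated errors can only make this worse in the relevant sense:
# the conclusion is never more certain than the weakest compound bound.
def chained_confidence(p: float, n: int) -> float:
    """Probability that n independent steps, each with confidence p, all hold."""
    return p ** n

# ten steps at 95% each leave roughly 60% confidence in the conclusion
print(round(chained_confidence(0.95, 10), 3))
```

Which is the commenter's point: A -almostcertainly-> B repeated ten times is not A -almostcertainly-> J.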

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

Dozens of debunkings! We don't need citations when we have Bayes' theorem!!

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

To be fair, I also don't believe any stories about a blackmail pedo ring run by Yudkowsky. And with all the bad stuff going around, why make up a blackmail ring, like he is the Epstein of Rationality. (Note this is a different thing than: people committed statutory rape and it was covered up, which, considering the Dill thing, is a bit more believable.)

[–] [email protected] 0 points 3 weeks ago

Yes, I agree.

[–] [email protected] 0 points 3 weeks ago

Stop stop I can only update my priors so fast!

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)
[–] [email protected] 0 points 3 weeks ago (1 children)

As for privacy well, Brent Dill has been running around UFO discords telling people he’s in hiding because the rats want to kill him. Make of that what you will.

[–] [email protected] 0 points 3 weeks ago (1 children)
[–] [email protected] 0 points 3 weeks ago

it was how I found his twitter actually!

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago)

Ziz was originally radicalised by the sex abuse and abusers around CFAR.

Deleted comments from the thread: https://www.lesswrong.com/posts/96N8BT9tJvybLbn5z/we-run-the-center-for-applied-rationality-ama

[–] [email protected] 0 points 3 weeks ago (1 children)
[–] [email protected] 0 points 3 weeks ago

College Hill is very good and on top of this shit.
