this post was submitted on 07 Sep 2024
1 points (100.0% liked)

SneerClub



top 7 comments
[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (2 children)

there were bits and pieces that made me feel like Jon Evans was being a tad too sympathetic to Eliezer and others whose track record really should warrant a somewhat greater degree of scepticism than he shows, but i had to tap out at this paragraph from chapter 6:

Scott Alexander is a Bay Area psychiatrist and a writer capable of absolutely magnificent, incisive, soulwrenching work ... with whom I often strongly disagree. Some of his arguments are truly illuminatory; some betray the intellectual side-stepping of a very smart person engaged in rationalization and/or unwillingness to accept the rest of the world will not adopt their worldview. (Many of his critics, unfortunately, are inferior writers who misunderstand his work, and furthermore suggest it’s written in bad faith, which I think is wholly incorrect.) But in fairness 90+% of humanity engages in such rationalization without even worrying about it. Alexander does, and challenges his own beliefs more than most.

the fact that Jon praises Scott's half-baked, anecdote-riddled, Red/Blue/Gray trichotomy as "incisive" (for playing the hits to his audience), and his appraisal of the meandering transhumanist non-sequitur reading of Allen Ginsberg's Howl as "soulwrenching" really threw me for a loop.

and then the later description of that ultimately rather banal New York Times piece as "long and bad" (a hilariously hypocritical pair of adjectives for a self-proclaimed fan of some of Scott's work to use), and the slamming of Elizabeth Sandifer as an "inferior writer who misunderstands Scott's work" for, uh, correctly analyzing Scott's tendencies to espouse and enable white supremacist and sexist rhetoric... yeah, it pretty much tanks my ability to take what Jon is writing at face value.

i don't get how, after so many words being gentle but firm about Eliezer's (lack of) accomplishments, he puts out such a full-throated defense of Scott Alexander (and the subsequent smearing of his """enemies"""). of all people, why him?

[–] [email protected] 0 points 2 months ago

Meditations on Moloch is “soul-wrenching”, apparently. Jesus fucking Christ.

In what world do these people grow up? “Oh my God, conflict exists between interests and values, things are hard, not every problem is tractable”.

There used to be a refrain that “Moloch” is effectively Siskind’s word for capitalism, because he can’t bring his libertarian heart to name what everybody understands. But that’s wrong, because Siskind’s view is no more than the shallowest Burkeanism. And the worst thing about every single anti-Utopian is that they all assume everybody else feels as mugged by imperfection as they do.

[–] [email protected] 0 points 2 months ago

"Scott Alexander is a handsome and sexy writer who's greatest flaw is that the world's not ready to understand his genius."

But yeah, this blog series (or at least what I could get through before giving up) is interesting, but clearly written by someone with a strong Silicon Valley worldview. Burning Man and AI-generated header images and all.

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago)

I never hated my parents. They said they loved me, and the cultural wisdom of the science-fictional literature said your parents genuinely love you no matter what you disagree about

"it is said that learning from others is wisdom. but those piddly people around you, pah, what could they know? I shall instead go right to the source, the many works of ~~entertainment~~ ~~idea-exploration~~ ~~drug binge output~~ fine, detailed, highly-accurate scifi literature that man has produced in checks notes the two or three decades that have preceded my birth. for I am yud, and I am very clever!"

[–] [email protected] 0 points 2 months ago (1 children)

... "Coming of Age" also, oddly, describes another form of novel cognitive dissonance; encountering people who did not think Eliezer was the most intelligent person they had ever met, and then, more shocking yet, personally encountering people who seemed possibly more intelligent than himself.

The latter link is to "Competent Elites", a.k.a. "Yud fails to recognize that cocaine is a helluva drug":

I've met Jurvetson a few times. After the first I texted a friend: “Every other time I’ve met a VC I walked away thinking ‘Wow, I and all my friends are smarter than you.’ This time it was ‘Wow, you are smarter than me and all my friends.’“

Uh-huh.

Quick, to the Bat-Wikipedia:

On November 13, 2017, Jurvetson stepped down from his role at DFJ Venture Capital in addition to taking leave from the boards of SpaceX and Tesla following an internal DFJ investigation into allegations of sexual harassment.

Not smart enough to keep his dick in his pants, apparently.

Then, from 2006 to 2009, in what can be interpreted as an attempt to discover how his younger self made such a terrible mistake, and to avoid doing so again, Eliezer writes the 600,000 words of his Sequences, by blogging “almost daily, on the subjects of epistemology, language, cognitive biases, decision-making, quantum mechanics, metaethics, and artificial intelligence”

Or, in short, cult shit.

Between his Sequences and his Harry Potter fanfic, come 2015, Eliezer had promulgated his personal framework of rational thought — which was, as he put it, “about forming true beliefs and making decisions that help you win” — with extraordinary success. All the pieces seemed in place to foster a cohort of bright people who would overcome their unconscious biases, adjust their mindsets to consistently distinguish truth from falseness, and become effective thinkers who could build a better world ... and maybe save it from the scourge of runaway AI.

Which is why what happened next, explored in tomorrow’s chapter — the demons, the cults, the hells, the suicides — was, and is, so shocking.

Or not. See above, RE: cult shit.

[–] [email protected] 0 points 2 months ago

Reading about the hubris of young Yud is a bit sad, a proper Tragedy. Then I have to remind myself that he remains a manipulator, and that he should be old enough to stop believing in, and promoting, magical thinking.
