this post was submitted on 03 Jun 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(page 2) 50 comments
[–] [email protected] 0 points 5 months ago

just read this https://www.wheresyoured.at/were-watching-facebook-die/ and realized that zucc burned enough money on the metaverse to push 15 or so new pharmaceuticals from bench to market

behold, effective allocation of resources under capitalism

[–] [email protected] 0 points 5 months ago (3 children)

Vitalik Buterin:

A few months ago I was looking into Lojban and trying to figure out how I would translate "charge" (as in, "my laptop is charging") and the best I could come up with is "pinxe lo dikca" ("drink electricity")

So... if you think LLMs don't drink, that's your imagination, not mine.

My parents said that the car was "thirsty" if the gas tank was nearly empty, therefore gas cars are sentient and electric vehicles are murder, checkmate atheists

That was in the replies to this, which Yud retweeted:

Hats off to Isaac Asimov for correctly predicting exactly this 75 years ago in I, Robot: Some people won't accept anything that doesn't eat, drink, and eventually die as being sentient.

Um, well, actually, mortality was a precondition of humanity, not of sentience, and that was in "The Bicentennial Man", not I, Robot.

In the I, Robot story collection, Stephen Byerley eats, drinks and dies, and none of this is proof that he was human and not a robot.

[–] [email protected] 0 points 5 months ago

Why are techbros such shit at Lojban? It's a recurring and silly pattern. Two minutes with a dictionary tells me that there is {seldikca} for being charged like a capacitor and {nenzengau} for charging like a rechargeable battery.

[–] [email protected] 0 points 5 months ago (5 children)
[–] [email protected] 0 points 5 months ago

it is true that he would vanquish Derek Guy by just sucking so much

[–] [email protected] 0 points 5 months ago

🙃

Not a cult just a following. Like Andrew Tate for nerds

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago)

Nobody drills down to the heart of the matter: why would this prove you are not a cult leader, and what does it say about somebody like Charles Manson, who also dressed funny and was sometimes disrespected by his cult members (he was anti-drugs, and his followers, well, not so much)? Does that make him not a cult leader either?

Lol at the 'where can I join your cult?' in the replies. Also, there's literally text written on the walls, not a very crazy-person look at all.

[–] [email protected] 0 points 5 months ago

Incredible Richard Stallman vibe in this picture (this is a compliment)

Person replying to someone saying they are not a cult leader by comparing them to another person often seen as a cult leader.

[–] [email protected] 0 points 5 months ago

But he didn’t include punctuation! This must mean it’s a joke and that obviously he’s a cult leader. The thief of the funny hat (a very patriarch-like thing to have) should count himself lucky that EY is too humble to send the inquisition after him.

Bless him, he didn’t even get angry.

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (3 children)
[–] [email protected] 0 points 5 months ago* (last edited 5 months ago)

Similar vibes in this crazy document

EDIT: it's the same dude who was retweeted

https://situational-awareness.ai/

AGI by 2027 is strikingly plausible. GPT-2 to GPT-4 took us from ~preschooler to ~smart high-schooler abilities in 4 years. Tracing trendlines in compute (~0.5 orders of magnitude or OOMs/year), algorithmic efficiencies (~0.5 OOMs/year), and “unhobbling” gains (from chatbot to agent), we should expect another preschooler-to-high-schooler-sized qualitative jump by 2027.

Last I checked ChatGPT can't even do math, which I believe is a prerequisite for being considered a smart high-schooler. But what do I know, I don't have AI brain.

[–] [email protected] 0 points 5 months ago (1 children)

Straight line on a lin-log chart, getting crypto flashbacks.

[–] [email protected] 0 points 5 months ago (2 children)

I think technically the singularitarians were way ahead of them on the lin-log chart lines. Have a nice source (from 2005).

[–] [email protected] 0 points 5 months ago

I can't get over that the two axes are:

Time to the next event.

Time before present.

And then they have plotted a bunch of things happening with less time between. I can't even.

[–] [email protected] 0 points 5 months ago (1 children)

How am I ever going to work again, knowing that page is on the internet? Instead of Timecube, it's time squared.

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago)

I'm just amazed that they hate lin charts so much that the "Countdown to SIN - lin" chart is missing.

E: it does seem to work when I go directly to the image, but not on the page. No, human! You have a torch, look down, there is a cliff! Ignore the siren cries of NFTs at the bottom! (Also, look behind you, that woman with her two monkey friends is about to stab you in the back for some reason.)

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (4 children)

i really, really don't get how so many people are making the leaps from "neural nets are effective at text prediction" to "the machine learns like a human does" to "we're going to be intellectually outclassed by Microsoft Clippy in ten years".

like it's multiple modes of failing to even understand the question happening at once. i'm no philosopher; i have no coherent definition of "intelligence", but it's also pretty obvious that all LLMs are doing is statistical extrapolation on language. i'm just baffled at how many so-called enthusiasts and skeptics alike just... completely fail at the first step of asking "so what exactly is the program doing?"

[–] [email protected] 0 points 5 months ago

The y-axis is absolute eye bleach. Also implying that an "AI researcher" has the effective compute of 10^6 smart high schoolers. What the fuck are these chodes smoking?

[–] [email protected] 0 points 5 months ago (1 children)

this article/dynamic comes to mind for me here, along with a toot I saw the other day but don't currently have the link for. the toot detailed a story of some teacher somewhere speaking about ai hype: they put googly eyes on a pencil or something to make it personable and made it "speak", then broke it in half the moment people were even slightly "engaged" with the idea of a person'd pencil. the point of it was that people are remarkably good at seeing personhood/consciousness/etc in things where it just outright isn't there

(combined with a bit of en vogue hype wave fuckery, where genpop follows and uses this stuff, but they're not quite the drivers of the itsintelligent.gif crowd)

[–] [email protected] 0 points 5 months ago (2 children)

https://mander.xyz/post/13749821 that's probably this; not sure if it ever happened

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (2 children)

Transcript: a post by Greg Stolze on Bluesky.

I heard some professor put googly eyes on a pencil and waved it at his class saying "Hi! I'm Tim the pencil! I love helping children with their homework but my favorite is drawing pictures!"

Then, without warning, he snapped the pencil in half.

When half his college students gasped, he said "THAT'S where all this AI hype comes from. We're not good at programming consciousness. But we're GREAT at imagining non-conscious things are people."

[–] [email protected] 0 points 5 months ago (1 children)

Same with when they added some features to the UI of GPT with the GPT-4o chatbot thing. Don't get me wrong, the tech to do real-time audio processing etc. is impressive (but it has nothing to do with LLMs, it was a different technique), but it certainly is very much smoke and mirrors.

I recall when they taught developers to be careful with shipping small UI changes without backend changes, because to non-insiders that feels like a massive change while the backend still needs a lot of work (so the client thinks you are 90% done when only 10% is done). But now half the tech people get tricked by the same problem.

[–] [email protected] 0 points 5 months ago

i suppose there is something more "magical" about having the computer respond in realtime, and maybe it's that "magical" feeling that's getting so many people to just kinda shut off their brains when creators/fans start wildly speculating on what it can/will be able to do.

how that manages to override people's perceptions of their own experiences happening right in front of them still boggles my mind. they'll watch a person point out that it gets basic facts wrong or speaks incoherently, and assume the fault lies with the person for not having the true vision or what have you.

(and if i were to channel my inner 2010s reddit atheist for just a moment, it feels distinctly like the way people talk about the Christian Rapture, where the flaws and issues you're pointing out in the system get spun as personal flaws. you aren't observing basic facts about the system making errors, you are actively in ego-preserving denial about the "inevitability of ai")

[–] [email protected] 0 points 5 months ago (1 children)

They're just one step away from "Ouija board as a Service"

[–] [email protected] 0 points 5 months ago (1 children)

Ouija Board, Sexy Lady Voice Edition

[–] [email protected] 0 points 5 months ago

Either the sexy voice or the voice used in commercials aimed at women and children. (I noticed a while back that they use the same tone of voice, and that tone of voice now lowkey annoys me every time I hear it.)

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (1 children)

Not a sneer, just a feelsbadman.jpg, because I know peeps who have been sucked into this "it's all Joever.png" mentality (myself included, for various we-live-in-hell reasons; honestly I never recovered after my cousin explained to me what nukes were while we were playing in the sandbox at 3).

The sneerworthy content comes later:

1st) Rats never fail to impress with the appeal-to-authority fallacy, but 2nd) the authority in question is Max "totally unbiased, not a member of the extinction cult, and definitely not pushing crank theories for decades" fuckin' Tegmark, roflmaou

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (1 children)

"You know, we just had a little baby, and I keep asking myself... how old is he even gonna get?"

Tegmark, you absolute fucking wanker. If you actually believe your eschatological x-risk nonsense and still produced a child despite being convinced that he's going to be paperclipped in a few years, you're a sadistic egomaniacal piece of shit. And if you don't believe it and just lie for the PR, knowingly leading people into depression and anxiety, you're also a sadistic egomaniacal piece of shit.

[–] [email protected] 0 points 5 months ago

Truly I say unto you, it is easier for a camel to pass through the eye of a needle than it is to convince a 57-year-old man who thinks he's still pulling off that leather jacket to wear a condom. (Tegmark 19:24, KJ Version)

[–] [email protected] 0 points 5 months ago (1 children)

for the sneerclub fans, this vid about MS Satoshi was pretty funny. All the Adam Something videos are entertaining for a couch sneer https://www.youtube.com/watch?v=dv4H4trnssc

[–] [email protected] 0 points 5 months ago (1 children)

Always a good sign when your big plan is virtually identical to L. Ron Hubbard's big plan in the 70s.

[–] [email protected] 0 points 5 months ago (1 children)

At least fewer people (allegedly) died in the remake plan.

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (1 children)

for now

crypto in general has caused a pretty large number of casualties, depending on how you count

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (4 children)

Certainly, but I was specifically mentioning the woman who boarded the Scientology boat and has never been seen or heard from since, and who Scientology claims is fine, stop asking about her, she is doing great, better even! Allegedly. Like specific boat-cult-related things, not just a drill to the head or locking down a hospital like the cryptocurrency deaths you hear about. (And, even more indirectly than the hospital deaths, the slow cooking of the planet and the upcoming climate disaster. Welcome to the coolest summer of your life!)

[–] [email protected] 0 points 5 months ago (1 children)

about crypto deaths, i was thinking more in terms of sanctions evasion, which funds iranian, nk, and more recently russian weapons programs, plus the continued existence of the iranian morality police and such

[–] [email protected] 0 points 5 months ago

Sure, those too; I wasn't trying to make an all-inclusive list. The shit is bad.

[–] [email protected] 0 points 5 months ago (3 children)

This gem from 25-year-old Avital Balwit, Chief of Staff at Anthropic and researcher of "transformative AI at Oxford’s Future of Humanity Institute", discussing the end of labour as she knows it. She continues:

"The general reaction to language models among knowledge workers is one of denial. They grasp at the ever diminishing number of places where such models still struggle, rather than noticing the ever-growing range of tasks where they have reached or passed human level. [wherein I define human level from my human level reasoning benchmark that I have overfitted my model to by feeding it the test set] Many will point out that AI systems are not yet writing award-winning books, let alone patenting inventions. But most of us also don’t do these things. "

Ah yes, even though the synthetic text machine has failed to achieve a basic understanding of the world generation after generation, it has been able to produce ever larger volumes of synthetic text! The people who point out that it still fails basic arithmetic tasks are the ones who are in denial, the god machine is nigh!

🐍

[–] [email protected] 0 points 5 months ago* (last edited 5 months ago) (1 children)

Many will point out that AI systems are not yet writing award-winning books, […]

Holy shit, these chucklefucks are so full of themselves. To them, art and expression and invention are really just menial tasks which ought to be automated away, aren’t they? They claim to be so smart but constantly demonstrate they’re too stupid to understand that literature is more than big words on a page, and that all their LLMs need to do to replace artists is to make their autocomplete soup pretentious enough that they can say: This is deep, bro.

I can’t wait for the first AI-brained litbro trying to sell some LLM’s hallucinations as the Finnegans Wake of our age.

[–] [email protected] 0 points 5 months ago

Many will point out that magic eight balls are not yet writing award-winning books, let alone patenting inventions. But most of us also don’t do these things.

[–] [email protected] 0 points 5 months ago (1 children)

I maintain that people not having to work is a worst-case scenario for silicon valley VCs who rely on us being too fucking distracted by all their shit products to have time to think about whether we need their shit products
