Spooky stuff that helps explain a lot of the dysfunction flowing out from Microsoft.

[–] [email protected] 0 points 1 week ago (2 children)

It seems that happens to management at every company, to varying degrees. I swear there must be a source for all this shit, like Forbes or something.

A side note:

"... It's all hallucination.”

prone to hallucination.

No, just no.

Everything generative AI produces is a hallucination.

Some of it may correlate with reality, but it is still hallucination.

LLMentalist
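
To make that point concrete, here's a toy sketch (hypothetical tokens and made-up probabilities, nothing like a real model's scale or training): the sampling step that produces an output matching reality is the exact same step that produces one that doesn't. "Correlates with reality" is a property of the learned statistics, not of any checking stage in the generation process.

```python
import random

# Toy "language model": a distribution over next tokens given a context.
# The probabilities are invented for illustration; a real LLM learns them from text.
NEXT_TOKEN = {
    ("The", "capital", "of", "France", "is"): {"Paris": 0.7, "Lyon": 0.2, "Mordor": 0.1},
}

def sample_next(context: tuple[str, ...]) -> str:
    """Sample a next token from the learned distribution.
    Note there is no step that compares the output against the world:
    'Paris' and 'Mordor' come out of the identical mechanism."""
    dist = NEXT_TOKEN.get(context, {"...": 1.0})
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights, k=1)[0]

context = ("The", "capital", "of", "France", "is")
print(sample_next(context))  # usually "Paris", sometimes not -- generated the same way either way
```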

[–] [email protected] 0 points 1 week ago

I think he's underestimating the intentionality at play here. The dynamic he's describing (and describing very well!) has been evident since the first chatbot, ELIZA. I don't believe that Saltman and friends don't know about this dynamic, and I'll give them the benefit of the doubt that they didn't think we had AGI in the 80s with basic text templates.
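
For anyone who hasn't seen how basic those templates were, here's a minimal ELIZA-style sketch (the rules below are invented for illustration, not Weizenbaum's actual DOCTOR script): match a keyword pattern, echo part of the input back through a canned template, and people still read a mind into it.

```python
import random
import re

# Illustrative keyword rules: a regex pattern mapped to reassembly templates.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     ["Why do you say you are {0}?", "Did you come to me because you are {0}?"]),
    (re.compile(r"\bbecause (.+)", re.IGNORECASE),
     ["Is that the real reason?", "What other reasons come to mind?"]),
]
FALLBACKS = ["Please go on.", "Tell me more.", "I see."]

def respond(utterance: str) -> str:
    """Pick the first matching rule and fill its template with the captured text."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    return random.choice(FALLBACKS)

print(respond("I feel like nobody listens to me"))
# e.g. "Why do you feel like nobody listens to me?"
```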

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago)

I don't like how people use "hallucinations" to refer to the output of neural networks, but you know what, it is all hallucination. It's hallucination on our part, looking at arbitrary sequences of tokens and seeing meaningful text. It's pareidolia, and it's powerful.