This post was submitted on 15 May 2024
275 points (96.3% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



 
all 15 comments
[–] [email protected] 8 points 6 months ago (2 children)

Okay, so I'm guessing here, and even if my guess is right, I'm not familiar enough with the bio-sciences to really be sure.

Are these neurons?

And is the joke "These neurons are all over the place so it's no wonder my thoughts are too"?

Genuine question. I'm more of an engineer than a biologist.

[–] [email protected] 7 points 6 months ago

These are the neurons from a cubic millimeter of brain tissue.

[–] [email protected] 9 points 6 months ago (1 children)

Yes, it's a reference to an article that was posted here on Lemmy the other day: https://lemmy.world/post/15229790

[–] [email protected] 1 points 6 months ago

Neat! I missed this!

[–] [email protected] 13 points 6 months ago

r/cablegore

[–] [email protected] 51 points 6 months ago (6 children)

I have a ~~theory~~ ~~hypothesis~~ notion that hallucination in artificial neural networks is not a failure mode unique to ANNs, but an inherent property of any neural network, artificial or biological.

Essentially, I posit that a neural network by itself is incapable of maintaining coherence without a rigid external framework, such as consistent feedback during training for an ANN, or the laws of physics for a biological one.

This would explain why people start tripping balls in sensory deprivation chambers. And it provides a counterargument to any thought experiment or philosophy that involves a disembodied brain vividly hallucinating reality.
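A minimal sketch of that grounding idea, assuming a toy recurrent network in NumPy (the network size, gain, and drive signal are illustrative choices, not anything from the thread): two copies of the network that share an external input tend to stay together, while two free-running copies drift apart.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
# Gain > 1 puts a random tanh network in its chaotic regime.
W = rng.normal(scale=1.5 / np.sqrt(n), size=(n, n))

def step(h, external=0.0):
    # One update; `external` is the grounding signal (0 = free-running).
    return np.tanh(W @ h + external)

# A strong shared input, standing in for "reality"/the laws of physics.
signal = 3.0 * np.sin(np.linspace(0, 8 * np.pi, 200))

h0 = rng.normal(size=n) * 0.1
driven_a, driven_b = h0, h0 + 0.01   # nearly identical starting states
free_a, free_b = h0, h0 + 0.01

for t in range(200):
    driven_a = step(driven_a, signal[t])
    driven_b = step(driven_b, signal[t])
    free_a = step(free_a)            # no external input at all
    free_b = step(free_b)

print("driven runs differ by:", np.linalg.norm(driven_a - driven_b))
print("free runs differ by:  ", np.linalg.norm(free_a - free_b))
```

In this toy run the shared input keeps the driven copies close, while without it the tiny initial difference blows up and the activity is entirely self-generated, which is roughly the "no coherence without a rigid external framework" picture.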

[–] [email protected] 16 points 6 months ago (2 children)

On the one hand, that's a cool insight and I can get behind it. It's kind of similar to deaf people talking "weird". On the other hand, I don't think it has anything to do with LLMs. There, "hallucination" is just a cool word for "it's trained to say things that sound like they fit the context, not to be correct".
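That distinction is easy to demonstrate with a toy model. A minimal sketch, assuming a tiny bigram generator (the corpus and helper names are invented for the example): every adjacent word pair in the output was seen in training, so the text sounds right, yet the recombined sentences can be flatly wrong.

```python
import random
from collections import defaultdict

corpus = ("the mitochondria is the powerhouse of the cell . "
          "the nucleus is the control center of the cell . "
          "the ribosome is the site of protein synthesis .").split()

# Learn which words follow which: the entire "training" step.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(start, length=10, seed=3):
    random.seed(seed)
    out = [start]
    for _ in range(length):
        out.append(random.choice(follows.get(out[-1], ["."])))
    return " ".join(out)

print(generate("the"))
# Possible output: "the nucleus is the powerhouse of the cell . the site"
# Locally it always fits the context; nothing ever checks that it's true.
```

The model represents only what tends to follow what, not truth, which is exactly the "sounds like it fits the context, not correct" point.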

[–] [email protected] 5 points 6 months ago

Is that not the same thing? If hallucinations are basically unrestricted activity, and "hallucinations" in LLMs are the result of insufficient restrictions in training or prompts, then are they not real hallucinations?

[–] [email protected] 0 points 6 months ago

Thing is, our brains could work the exact same way… only they're constantly being trained, and have enough neurons that many clusters can be dedicated to very specific contexts.

Today is _____

Well, given the context that my phone says 1:20 and it's dark, that it was Tuesday when I fell asleep, and that Wednesday comes after Tuesday… plus all the training that lets me understand that context in the first place: there are 24 hours in a day, a new day starts at 12:00, and 1 comes after 12, but only in our timekeeping system.
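Spelled out as code, that fill-in-the-blank is just a function of context features plus learned rules. A throwaway sketch; the function and its inputs are hypothetical, purely to restate the comment:

```python
DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

def fill_in_today(fell_asleep_on: str, clock_hour: int, is_dark: bool) -> str:
    """'Today is ____', predicted from the same cues the comment lists."""
    # Learned rule: a dark sky plus a small-hours clock reading means
    # midnight has passed, so the day has rolled over.
    rolled_over = is_dark and clock_hour < 5
    idx = DAYS.index(fell_asleep_on)
    return DAYS[(idx + 1) % 7] if rolled_over else fell_asleep_on

print("Today is", fill_in_today("Tuesday", clock_hour=1, is_dark=True))
# -> Today is Wednesday
```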

[–] [email protected] 1 points 6 months ago

It's either a counterargument or the best support for a disembodied brain hallucinating everything.

[–] [email protected] 3 points 6 months ago

Ironically, that's not too far from Aldous Huxley's theory about human perception:

https://en.wikipedia.org/wiki/Mind_at_Large

Essentially, reality is, or contains, all the properties of hallucination, but our brain filters it; psychedelic drugs in some way dilute or remove that filter.

So the human brain by default filters out the "hallucination" mode of thought until we open that up, whereas ANNs begin with it at baseline and then need rigor added to reduce the "hallucination".

[–] [email protected] 7 points 6 months ago

That's really interesting! I guess I'll incorporate this into my worldview now.

[–] [email protected] 2 points 6 months ago

Interesting comment. Thanks 🙂

[–] [email protected] 3 points 6 months ago

You're right, from now on we should give all AI advancements a body.