this post was submitted on 28 Jun 2025
961 points (94.7% liked)

Technology


We are constantly fed a version of AI that looks, sounds and acts suspiciously like us. It speaks in polished sentences, mimics emotions, expresses curiosity, claims to feel compassion, even dabbles in what it calls creativity.

But what we call AI today is nothing more than a statistical machine: a digital parrot regurgitating patterns mined from oceans of human data (the situation hasn’t changed much since it was discussed here five years ago). When it writes an answer to a question, it literally just guesses which token (a word or fragment of a word) will come next in the sequence, based on the data it’s been trained on.
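To make that "guessing" concrete: at every step the model turns the text so far into a probability for each possible next token, picks one, appends it and repeats. Here is a minimal sketch of that loop; the tiny vocabulary and hand-written probabilities are invented purely for illustration, whereas a real model learns its distributions over tens of thousands of tokens from training data.

```python
import random

# Toy next-token table: for each context word, a hand-made probability
# distribution over possible next words. A real LLM learns billions of
# weights that produce a distribution like this at every step.
NEXT_TOKEN_PROBS = {
    "the":  {"cat": 0.5, "dog": 0.3, "moon": 0.2},
    "cat":  {"sat": 0.6, "ran": 0.4},
    "dog":  {"ran": 0.7, "sat": 0.3},
    "moon": {"shone": 1.0},
    "sat":  {"quietly": 1.0},
    "ran":  {"away": 1.0},
}

def sample_next(word: str) -> str:
    """Pick the next token at random, weighted by its probability."""
    dist = NEXT_TOKEN_PROBS.get(word)
    if not dist:
        return "<end>"
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights, k=1)[0]

def generate(start: str, max_len: int = 6) -> str:
    """Repeatedly append a sampled next token -- the entire generation loop."""
    out = [start]
    while len(out) < max_len:
        nxt = sample_next(out[-1])
        if nxt == "<end>":
            break
        out.append(nxt)
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat quietly"
```

Everything else in a modern LLM (attention layers, billions of weights) goes into making that probability table vastly better, not into adding a separate component where understanding would live.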

This means AI has no understanding. No consciousness. No knowledge in any real, human sense. Just pure probability-driven, engineered brilliance — nothing more, and nothing less.

So why is a real “thinking” AI likely impossible? Because it’s bodiless. It has no senses, no flesh, no nerves, no pain, no pleasure. It doesn’t hunger, desire or fear. And because there is no cognition — not a shred — there’s a fundamental gap between the data it consumes (data born out of human feelings and experience) and what it can do with them.

Philosopher David Chalmers calls the mysterious mechanism underlying the relationship between our physical body and consciousness the “hard problem of consciousness”. Eminent scientists have recently hypothesised that consciousness actually emerges from the integration of internal, mental states with sensory representations (such as changes in heart rate, sweating and much more).

Given the paramount importance of the human senses and emotion for consciousness to “happen”, there is a profound and probably irreconcilable disconnect between general AI, the machine, and consciousness, a human phenomenon.

https://archive.ph/Fapar

16 comments
[–] [email protected] 62 points 1 week ago (4 children)

Good luck. Even David Attenborough can't help but anthropomorphize. People will feel sorry for a picture of a dot separated from a cluster of other dots. The play by AI companies is that it's human nature for us to want to give just about every damn thing human qualities. I'd explain more, but as I write this my smoke alarm is beeping a low battery warning, and I need to go put the poor dear out of its misery.

[–] [email protected] 24 points 1 week ago (9 children)

Steve Gibson on his podcast, Security Now!, recently suggested that we should call it "Simulated Intelligence". I tend to agree.

[–] [email protected] -2 points 1 week ago (30 children)

Philosophers are so desperate for humans to be special. How is outputting things based on things it has learned any different to what humans do?

We observe things, we learn things and when required we do or say things based on the things we observed and learned. That's exactly what the AI is doing.

I don't think we have achieved "AGI" but I do think this argument is stupid.

[–] [email protected] 10 points 1 week ago* (last edited 1 week ago) (7 children)

Yes, the first step to determining that AI has no capability for cognition is apparently to admit that neither you nor anyone else has any real understanding of what cognition* is or how it can possibly arise from purely mechanistic computation (either with carbon or with silicon).

Given the paramount importance of the human senses and emotion for consciousness to “happen”

Given? Given by what? Fiction in which robots can't comprehend the human concept called "love"?

*Or "sentience" or whatever other term is used to describe the same concept.

[–] [email protected] 4 points 1 week ago (13 children)

Most people, evidently including you, can only ever recycle old ideas. Like modern "AI". Some of us can conceive new ideas.

[–] [email protected] 6 points 1 week ago (1 children)

Artificial Intelligence is supposed to be intelligent.

Calling LLMs intelligent is where it goes wrong.

[–] [email protected] 12 points 1 week ago* (last edited 1 week ago) (2 children)

Artificial Intelligence is supposed to be intelligent.

For the record, AI is not supposed to be intelligent.

It just has to appear intelligent. It can be all smoke and mirrors, giving the impression that it's smart enough, provided it can perform the task at hand.

That's why it's termed artificial intelligence.

The subfield of Artificial General Intelligence is another story.

[–] [email protected] -1 points 1 week ago* (last edited 1 week ago) (4 children)

Thank You! Yes!

So ... A-not-I? AD? What do we call it? LLM seems too specialised?

[–] [email protected] 5 points 1 week ago (1 children)

Autocomplete on steroids, but suffering dementia.

[–] [email protected] 4 points 1 week ago
[–] [email protected] 1 points 1 week ago* (last edited 1 week ago) (1 children)

AS - artificial stupidity

ASS - artificial super stupidity

[–] [email protected] 1 points 1 week ago

Both are good 👍

[–] [email protected] 3 points 1 week ago

I prefer the term "sophisticated text completion".

[–] [email protected] 34 points 1 week ago (10 children)

I've never been fooled by their claims of it being intelligent.

It's basically an overly complicated series of if/then statements that try to guess the next series of inputs.

[–] [email protected] 12 points 1 week ago (5 children)

GPT-2 was literally reimplemented as an Excel spreadsheet.

I guesstimate that it's effectively a supermassive autocomplete algo that uses some TOTP-like factor to help it produce "unique" output every time.
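The "unique output every time" part is usually just sampling with a temperature over the model's next-token scores (plus a random seed), rather than anything TOTP-like. A rough sketch, with token names and scores invented for illustration:

```python
import math
import random

def sample_with_temperature(logits, temperature=0.8):
    """Convert raw next-token scores into probabilities and draw one at random.

    Lower temperature sharpens the distribution (output closer to the top
    token every time); higher temperature flattens it (more varied output).
    """
    scaled = {tok: score / temperature for tok, score in logits.items()}
    top = max(scaled.values())  # subtract the max for numerical stability
    weights = {tok: math.exp(s - top) for tok, s in scaled.items()}
    total = sum(weights.values())
    tokens = list(weights)
    probs = [weights[tok] / total for tok in tokens]
    return random.choices(tokens, weights=probs, k=1)[0]

# Purely illustrative scores a model might assign to candidate next tokens.
logits = {"popcorn": 2.1, "problems": 1.7, "data": 0.4}
print(sample_with_temperature(logits, temperature=0.8))
```

Dial the temperature down towards zero and the choice collapses to the single top-scoring token, which is why the same prompt can give the same answer when sampling is effectively turned off.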

And they're running into issues due to increasingly ingesting AI-generated data.

Get your popcorn out! 🍿

[–] [email protected] 3 points 1 week ago* (last edited 1 week ago)

And they’re running into issues due to increasingly ingesting AI-generated data.

There we go. Who coulda seen that coming! While that's going to be a fun ride, at the same time companies all but mandate AS* to their employees.
