this post was submitted on 03 May 2025
1421 points (99.3% liked)

 

Alt Text: an image of Agent Smith from The Matrix with the following text superimposed, "1999 was described as being the peak of human civilization in 'The Matrix' and I laughed because that obviously wouldn't age well and then the next 25 years happened and I realized that yeah maybe the machines had a point."

[–] [email protected] 7 points 1 day ago (1 children)

There's no technical reason to think we won't in the next ~20-50 years

Other than that nobody has any idea how to go about it? The things called "AI" today are not precursors to AGI. The search for strong AI is still nowhere close to any breakthroughs.

[–] [email protected] -2 points 1 day ago (1 children)

Assuming that the path to AGI involves something akin to all the intelligence we see in nature (i.e. brains and neurons), then modern AI algorithms' ability to simulate neurons using silicon and math is inarguably and objectively a precursor.

[–] [email protected] 0 points 14 hours ago* (last edited 14 hours ago)

Machine learning, rebranded "AI" during the LLM boom, does not simulate intelligence. It incorporates feedback loops, which is something like learning, and it uses networks of nodes that look a bit like neurons if you squint from a distance. These networks have been around for decades (I built several myself in college), and at their core they're just parameterized nonlinear functions: weighted sums fed through simple activation functions. Current hardware allows very large networks and networks of networks, but they're still not similar to brains in any meaningful way.
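To illustrate the point above, here is a minimal Python sketch of the kind of network being described: each "neuron" is just a weighted sum pushed through a nonlinearity, and a "network" is just these stacked in layers. The weights here are arbitrary toy values chosen purely for illustration, not a trained model.

```python
import math

def neuron(inputs, weights, bias):
    """An artificial 'neuron': a weighted sum passed through a nonlinearity."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

def forward(inputs, layers):
    """Feed inputs through each layer; a layer is a list of (weights, bias) pairs."""
    activations = inputs
    for layer in layers:
        activations = [neuron(activations, w, b) for w, b in layer]
    return activations

# A toy network: 2 inputs -> 2 hidden neurons -> 1 output, with made-up weights.
layers = [
    [([0.5, -0.4], 0.1), ([0.3, 0.8], -0.2)],  # hidden layer
    [([1.0, -1.0], 0.0)],                      # output layer
]
out = forward([1.0, 0.0], layers)
```

That's the whole mechanism: arithmetic over parameters. Scale it up by many orders of magnitude and add a training loop that nudges the weights, and you get modern "AI" — but nothing in the math resembles biological neurons beyond the loose analogy.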

There is separate research into simulating neurons and brains, but that is separate from machine learning.

Also, we don't actually understand how our brains work at the level where we could copy them. We understand some things and have educated guesses about others, but overall it's still largely a mystery.