this post was submitted on 13 Nov 2024
665 points (95.1% liked)
Technology
No shit. This was obvious from day one. This was never AGI, and was never going to be AGI.
Institutional investors saw an opportunity to make a shit ton of money and pumped it up as if it was world changing. They'll dump it like they always do, it will crash, and they'll make billions in the process with absolutely no negative repercussions.
Turns out AI isn't real and has no fidelity.
Machine learning could be the basis of AI but is anyone even working on that when all the money is in LLMs?
I'm not an expert, but the whole basis of LLMs, not actually understanding words but just predicting which word is most likely to come next, seems like it's not going to help progress to the next level... To be an artificial general intelligence, shouldn't it know what words are?
I feel like this path is taking a brick and trying to fit it into a keyhole...
Not necessarily, but it should be smart enough to associate symbols with some form of meaning. It doesn't do that, it just associates symbols with related symbols, so if there's nothing similar that already exists, it's not going to be able to come back with anything sensible.
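That "symbols associated with related symbols" point can be illustrated with a deliberately tiny sketch (this is a toy bigram counter, nowhere near a real LLM; the corpus and function names are made up): it "predicts" the next word purely from co-occurrence counts, with no notion of meaning, and has nothing at all to say about a symbol it has never seen.

```python
from collections import Counter, defaultdict

# Toy corpus; the model only ever sees which symbol followed which.
corpus = "the cat sat on the mat the cat ate the fish".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def most_likely_next(word):
    # Return the most frequent follower, or None for an unseen word:
    # with no related symbol on record, there is no sensible answer.
    followers = counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(most_likely_next("the"))   # "cat" (follows "the" twice)
print(most_likely_next("dog"))   # None: no associations at all
```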
I think being able to create new content with partial sample data is necessary to really be considered general AI. That's what humans do, and we don't necessarily need the words to describe it.
Learning is the basis of all known intelligence. LLMs have learned something very specific; AGI would need to be built by generalising the core functionality of learning, not as an outgrowth of fully formed LLMs.
And yes, the current approach is very much using a brick to open a lock, and that's why it's ... ahem ... hit a brick wall.
Yeah, 20 something years ago when I was trying to learn PHP of all things, I really wanted to make a chat bot that could learn what words are... I barely got anywhere but I was trying to program the understanding of sentence structure and feeding it a dictionary of words... My goal was to have it output something on its own ...
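That dictionary-plus-sentence-structure idea can be sketched in a few lines (hypothetical Python stand-in for the original PHP; the word lists and the subject-verb-object template here are invented for illustration):

```python
import random

# Hypothetical sketch of the "dictionary + sentence structure" chatbot
# idea: fill a fixed grammar template from word lists.
dictionary = {
    "noun": ["robot", "server", "script"],
    "verb": ["parses", "generates", "rejects"],
}

def make_sentence():
    # Subject-verb-object template filled with random dictionary words.
    return "The {} {} the {}.".format(
        random.choice(dictionary["noun"]),
        random.choice(dictionary["verb"]),
        random.choice(dictionary["noun"]),
    )

print(make_sentence())  # e.g. "The robot parses the script."
```

It "outputs something on its own", but only recombinations of what it was fed, which is roughly where that kind of project tends to stall.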
I'd like to see these things become less resource intensive, and hopefully running somewhere other than some random server...
I found the files... It was closer to 15 years ago...
Trying to invent artificial intelligence to learn php is quite funny lol
I'm amazed I still have the files... But yeah this was before all this shit was big... If I had a better drive I would have ended up more evil than zuck .. my plan was to collect data on everyone who used the thing and be able to build profiles on everyone based on what information you gave the chat ... And that's all I can really remember... But it's probably for the best...
Also a bit sadistic to be honest. Bringing a new form of life into the world only to subject it to PHP.
Right, so AIs don’t really know what words are. All they see are tokens. The tokens could be words and letters, but they could also be image/video features, audio waveforms, or anything else.
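A minimal sketch of that token view (the vocabulary and IDs here are invented; real tokenizers such as BPE build theirs from data): the model never receives "words", only integer IDs, and anything outside the vocabulary collapses into a single unknown token.

```python
# Toy word-level tokenizer: text in, opaque integer IDs out.
# Vocabulary is made up for illustration.
vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}

def tokenize(text):
    # Map each word to its ID; unknown words all become <unk>.
    return [vocab.get(w, vocab["<unk>"]) for w in text.lower().split()]

print(tokenize("The cat sat"))   # [0, 1, 2]
print(tokenize("The dog sat"))   # [0, 3, 2], "dog" is just <unk>
```

The same ID scheme works for image patches or audio frames, which is why the model genuinely has no special relationship to words.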