this post was submitted on 07 Feb 2024
218 points (95.4% liked)
Technology
Do the LLMs have any knowledge of the effects of violence or the consequences of their decisions? Do they know that resorting to nuclear war will lead to their destruction?
I think that this shows that LLMs are not intelligent, in that they repeat what they've been fed, without any deeper understanding.
LLMs are redditors confirmed.
In fact they do not have any knowledge at all. They make clever probability calculations, but at the end of the day concepts like geopolitics and war are far more complex and nuanced than assigning each phrase a value and computing over it.
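Mechanically, "assigning each phrase a value" is roughly next-token prediction: the model scores candidate continuations and picks among them by probability. A toy sketch of that idea (the vocabulary and scores here are entirely made up, not from any real model):

```python
import math

# Made-up raw scores (logits) an LLM might assign to candidate next tokens.
logits = {"peace": 2.0, "war": 1.0, "treaty": 0.5}

# Softmax turns raw scores into a probability distribution.
total = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / total for tok, v in logits.items()}

# The "decision" is just picking a high-probability continuation;
# there is no model of consequences behind the chosen word.
best = max(probs, key=probs.get)
print(best)
```

The point of the sketch: nothing in this loop represents what "war" means or what follows from it, only how often such phrases co-occurred in training data.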
And even if we manage to create living machines, they'll still be human-made, carrying human flaws, and likely not even built by the best experts in these fields.
As in "an LLM doesn't model the domain of the conversation in any way, it just extrapolates what the hivemind says on the subject".