this post was submitted on 17 Dec 2024
574 points (92.6% liked)
memes
I gave it a math problem to illustrate this and it got it wrong.
If it can't do that, imagine adding nuance.
YMMV, I guess. I've given it many difficult calculus problems to work through with me and it went well.
Well, math is not really a language problem, so it's understandable LLMs struggle with it more.
But it means it's not "thinking" the way the public perceives AI.
Hmm, yeah, AI never really did think. I can't argue with that.
It's really strange, if I mentally zoom out a bit, that we have machines that are better at language-based reasoning than logic-based reasoning (like math or coding).
Not really true though. Computers are still better at math. They're even pretty good at coding, if you count compiling high-level code into assembly as coding.
But in this case we built a language machine to respond to language with more language. Of course it's not going to do great at other stuff.