this post was submitted on 17 Dec 2024
574 points (92.6% liked)

[–] [email protected] 8 points 4 days ago (2 children)

I gave it a math problem to illustrate this, and it got it wrong.

If it can't even do that, imagine adding nuance.

[–] [email protected] -1 points 3 days ago* (last edited 3 days ago)

YMMV, I guess. I've given it many difficult calculus problems to help me through, and it went well.

[–] [email protected] 11 points 4 days ago (1 children)

Well, math is not really a language problem, so it's understandable that LLMs struggle with it more.

[–] [email protected] 11 points 4 days ago (1 children)

But it means it's not "thinking" the way the public perceives AI.

[–] [email protected] 5 points 4 days ago (2 children)

Hmm, yeah, AI never really did think. I can't argue with that.

It's really strange, if I mentally zoom out a bit, that we have machines that are better at language-based reasoning than at logic-based tasks (like math or coding).

[–] [email protected] 1 points 3 days ago

Not really true, though. Computers are still better at math. They're even pretty good at coding, if you count compiling high-level code into assembly as coding.

But in this case we built a language machine to respond to language with more language. Of course it's not going to do great at other stuff.
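To put a point on the "computers are still better at math" claim: ordinary code computes arithmetic exactly and deterministically, while an LLM only predicts likely tokens and can slip on large numbers. A minimal sketch (the specific numbers are just an example, not from the thread):

```python
# Exact big-integer multiplication: the CPU evaluates this
# deterministically, with no approximation or "guessing".
a = 123456789
b = 987654321
product = a * b
print(product)  # an LLM asked the same question often gets digits wrong
```

Python integers are arbitrary-precision, so this stays exact at any size; a language model, by contrast, has no built-in arithmetic unit and reproduces digit patterns statistically.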