It’s like asking if you think a calculator is smarter than you.
"It's totally a lot smarter than I am, no way could I deliver (234 * 534)^21 as confidently!"
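(A sketch for the curious, assuming you have Python handy: its built-in integers are arbitrary precision, so the exact value is one line away.)

```python
# Exact value of (234 * 534)^21 -- computed digit-for-digit,
# not approximated, thanks to Python's arbitrary-precision ints.
print((234 * 534) ** 21)
```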
"Half of LLM users " beleive this. Which is not to say that people who understand how flawed LLMs are, or what their actual function is, do not use LLMs and therefore arent i cluded in this statistic?
This is kinda like saying '60% of people who pay for their daily horoscope believe it is an accurate prediction'.
You say this like this is wrong.
Think of a question that you would ask an average person, and then think of what the LLM would respond with. The vast majority of the time the LLM would be more correct than most people.
A good example is the post on here about tax brackets. Far more Republicans didn't know how tax brackets worked than Democrats. But every mainstream language model would have gotten the answer right.
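For anyone fuzzy on the concept: brackets are marginal, so each slice of income is taxed at its own rate, and moving into a higher bracket never taxes your whole income at the higher rate. A minimal sketch with made-up bracket numbers (not real tax law):

```python
# Marginal tax: each slice of income is taxed at its own bracket's rate.
# Bracket thresholds and rates below are invented for illustration.
BRACKETS = [
    (10_000, 0.10),        # first $10k taxed at 10%
    (40_000, 0.20),        # next $30k (up to $40k) taxed at 20%
    (float("inf"), 0.30),  # everything above $40k taxed at 30%
]

def tax_owed(income: float) -> float:
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income > lower:
            owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

# Earning $41k does NOT tax all $41k at 30%:
print(tax_owed(41_000))  # 10000*0.10 + 30000*0.20 + 1000*0.30 = 7300.0
```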
LLMs are smart; they are just not intelligent.
I had to tell a bunch of librarians that LLMs are literally language models made to mimic language patterns, and are not made to be factually correct. They understood it when I put it that way, but librarians are supposed to be "information professionals". If they, as a slightly better trained subset of the general public, don't know that, the general public has no hope of knowing that.
It's so weird watching the masses ignore industry experts and jump on weird media hype trains. This must be how doctors felt during Covid.
They're right
“Think of how stupid the average person is, and realize half of them are stupider than that.” ― George Carlin
While this is pretty hilarious, LLMs don't actually "know" anything in the usual sense of the word. An LLM, or Large Language Model, is basically a system that maps "words" to other "words" to allow a computer to work with language. That is, all an LLM knows is that when it sees "I love", what probably comes next is "my mom", "my dad", etc. Because of this behavior, and the fact that we can train them on the massive swath of people asking questions and getting answers on the internet, LLMs are essentially by chance mostly okay at "answering" a question. But really they are just picking the next most likely word over and over from their training, which usually ends up reasonably accurate.
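That "pick the next most likely word" loop really is the whole mechanism. A toy sketch of the idea (a bigram counter standing in for a real model; same principle at cartoon scale):

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in some training
# text, then generate by repeatedly emitting the most common successor.
training_text = "i love my mom . i love my dad . i love my dog ."
follows = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def generate(start: str, length: int = 5) -> str:
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        # No facts, no understanding -- just "what usually comes next?"
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("i"))  # e.g. "i love my mom . i"
```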
I'm 100% certain that LLMs are smarter than half of Americans. What I'm not so sure about is that the people with the insight to admit being dumber than an LLM are the ones who really are.
A daily bite of horror.
LLMs are smart in the way someone is smart who has read all the books and knows all of them but has never left the house. Basically all theory and no street smarts.
A broken clock is right twice a day, I suppose.
They're not even that smart.
Well yes, they are glorified text autocomplete, but they still have their uses, which could be considered "smart". For example, I was struggling with a programming thing today and an LLM helped me out, so in a way it is smarter than me at that specific thing. I think it's less that they are dumb and more that they have no agency whatsoever; they have to be pushed in the direction you want. Pretty annoying.
There are a lot of ignorant people out there, so yeah, technically an LLM is smarter than most people.
No one has asked, so I am going to ask:
What is Elon University and why should I trust them?
Ironic coincidence of the name aside, it appears to be a legit brick-and-mortar university in a town called Elon, North Carolina.
They're right. AI is smarter than them.
An LLM simply has remembered facts. If that is smart, then sure, no human can compete.
Now ask an LLM to build a house. Oh shit, no legs and can't walk. A human can walk without even thinking about it.
In the future, though, there will be robots that can build houses using AI models to learn from. But not for a long time.
3D-printed concrete houses are already a thing; there's no need for human-like machines to build stuff. They can be purpose-built to perform whatever portion of the house-building task they need to do. There's absolutely no barrier today to having a hive of machines, each built for a specific purpose, build houses, besides the fact that no one has yet stitched the necessary components together.
It's not at all out of the question that an AI could be trained on a dataset of engineering diagrams, house layouts, materials, and construction methods, with subordinate AIs trained on specific aspects of housing systems like insulation, roofing, plumbing, framing, electrical, etc., which are then used to drive the actual machines building the house. The principal human requirement at that point would be engineers to check the math and sign off on a design for safety purposes.
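At toy scale, the coordinator-plus-specialists shape being described might look like the sketch below. Everything here is hypothetical scaffolding: the domains, function names, and "plans" are invented, and the specialists are stubs rather than real models.

```python
from typing import Callable

# A specialist takes a house spec and returns its slice of the plan.
Specialist = Callable[[dict], str]

def plumbing_plan(spec: dict) -> str:
    return f"route supply lines for {spec['bathrooms']} bathrooms"

def framing_plan(spec: dict) -> str:
    return f"frame {spec['floors']} floor(s) over {spec['area_m2']} m^2"

class Coordinator:
    """Top-level planner that farms sub-problems out to domain specialists."""

    def __init__(self) -> None:
        self.specialists: dict[str, Specialist] = {}

    def register(self, domain: str, fn: Specialist) -> None:
        self.specialists[domain] = fn

    def plan(self, spec: dict) -> list[str]:
        # In the scenario above, each line would drive a purpose-built
        # machine; here it just produces strings.
        return [f"{domain}: {fn(spec)}" for domain, fn in self.specialists.items()]

c = Coordinator()
c.register("plumbing", plumbing_plan)
c.register("framing", framing_plan)
print("\n".join(c.plan({"bathrooms": 2, "floors": 1, "area_m2": 120})))
```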
If you trained it on all of that it wouldn't be a good builder. Actual builders would tell you it's bad and you would ignore them.
LLMs do not give you accurate results. They can simply string words together into coherent sentences, and that's the extent of their capacity. They just agree with whatever the prompter is pushing, and that makes simple people think it's smart.
AI will not be building you a house, unless you count a 3D-printed house, and we both know that's overly pedantic. If that counted, a music box from 1780 would be an AI.
The funny thing about this scenario is by simply thinking that’s true, it actually becomes true.
Because an LLM is smarter than about 50% of Americans.
*as long as your evaluation of "smart" depends on summarizing search results
Have you asked the average person to summarize... well, anything?
The equivalent would be asking the average person to write a cited paper on a subject in a month.
Maybe even more.