this post was submitted on 06 Mar 2024
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
I think some here are grossly overestimating average human capacity. There are many humans who have difficulty discerning the context of a statement based on their experiences, i.e. examples.
This isn't AGI, but in another couple years at this pace, it's coming. Not necessarily because it is some higher mind, but because the metric for AGI is whether it can perform all the tasks our minds can, at our level. Not necessarily Stephen Fry or Albert Einstein, just as well as a median asshole. Have you met us?
We aren't all that, and most of us spend most of our time on a script. Sapience must be exercised; many do, many don't, and it isn't necessary for what we will abuse these for. It would probably be kinder to restrict discussion of such topics from its memory when this matures. Even humans have great difficulty wrestling with them, to the point of depression and existential dread.
As people noted the last few AI autumns, this is a bad assumption. Winter is coming. S-curve, not exponential.
Seeing as the notion of "progress" in this space is entirely subjective and based on general vibes, it's easy to make a case for any curve shape.
I could make a passable argument that it's actually a noisy sinusoid.
When people say stuff like this it always makes me wonder "what pace, exactly?" Truthfully, I feel like hearing someone say "well, generative AI is such a fast-moving field" at this point is enough on its own to ping my BS detector.
Maybe it was forgivable to say it in May 2023, but at this point it definitely feels like progress has slowed down/leveled off. AI doesn't really seem to me to be significantly more capable than it was a year ago -- I guess OpenAI can generate videos now, but it's been almost a year since "will smith eating spaghetti," so...
I'm gonna be honest, the videos did better than I expected: still meh in the weird uncanny-valley way, but better than I expected. But I still think we have reached the end of the fast part of the progress curve, given the whole "GPT-4 is basically a couple of 3.5s chained together" thing. Which I think is a sign of people running out of ideas, same as how, in the era of multicore CPUs, single-core speed has not increased that drastically, and certainly not that noticeably (compared to the 90s etc).
What is going to be amazing, however, is the rise of 40k Mechanicus-style coders. I saw somebody go 'you don't need to know how to code; my program gave this HTTP error, I didn't know what it meant, so I asked GPT how to fix it, implemented that, and it works'. Amazing, a bunch of servitors.
holy fuck please log off and go to therapy. I’m not fucking around. if this is actually how you see yourself and others, you are robbing yourself of the depth of the human experience by not seeking help.
If this is how you see yourself and others, you might want to touch some grass and meet some more humans outside the Internet.
the person there just commented on the average human's capacity for reasoning (not all humans, just the average one), and, in all fairness, they're sort of right, I think
don't just think of your friends and family, but about all humans. think about what makes it in the news and then how many things don't make it. religious nuts stoning people for whatever reason, gang sexual assault in the street in certain areas of the world, people showing up in ERs with weird stuff up their back ends, or finding unexploded ordnance from wars past and deciding the best course of action would be to smash it with a hammer or drill into it. this is all of course in addition to the pressing issues nowadays which do also seem to come from a place of not exercising sapience.
and for the less extreme cases, I do think the original commenter here is correct in saying people do tend to follow scripts and glide through life.
“MY friends and family are of course sapient, but all of those morons I see on TV and sometimes have to go to meetings with, those are dumber than ChatGPT obviously” honey do you hear yourself
if you think about selection bias, namely that one normally chooses to surround oneself with like-minded people, and if you add the fact that people would normally not consider themselves non-sapient, it sort of makes sense though, dunnit?
family, true, you don't choose that, but I figure statistically people are more likely to have some strong feelings about their family and implications towards themselves if they admit their family is indeed non-sapient (though blood ties are a different topic, best left undisturbed in this context)
for the record I never said MY friends and family, I was instructing the other commenter to look beyond their own circle. I figured since they were so convinced that the average human was not, in fact, about as dumb as an LLM, their social circle skews their statistics a bit.
shit, find me the stupidest dog you know and i'll show you a being that is leagues beyond a fucking chatbot's capabilities. it can want things in the world, and it can act of its own volition to obtain those things. a chatbot is nothing. it's noise. fuck that. if you can't see it, it's because you don't know how to look at the world.
human beings are smart. bad things don't happen because people are stupid. this kind of thinking is dehumanising and leads to so much evil in our world. people are not LLMs. they're people like you. they have thoughts. they act for reasons. don't dehumanise them.
I would point you to Hanlon's razor for the first part there.
it's not about dehumanizing, it's merely comparing the outputs. it doesn't really matter if they act for reasons or have thoughts if the output is the same. should we be more forgiving if an LLM outputs crap because it's just a tool, or should we be more forgiving if the human outputs the exact same crap, because it's a person?
and, just for fun, to bring solipsism into this, how do we actually know that they have thoughts?
Wow, it's not like many centuries have been spent with fields of philosophy investigating what cognition and consciousness are, good thing we have a bunch of also-ran STEM dweebs to reinvent philosophy of mind from the first principle of "Idiocracy was a documentary."
how old are you
is this the post where the flaming starts then?
no i just wanted to know before calling you a hitler. maybe you can still grow and save yourself.
waaait... are you an LLM? have I been arguing with ChatGPT this whole time? good one, whoever pulled this!
I cannot upvote this enough. There are no human NPCs.
Also, there are people in my social circle who are developmentally disabled and they are also sapient, what the actual fuck.
A computer can multiply matrices and most humans can't; that doesn't mean an algorithm is more sapient than a human. A Tinkertoy computer can reliably win or tie at tic-tac-toe, and it's not more sapient than a developmentally disabled human who can't.
ummm, you're the only one here who made any assumption about the sapience of developmentally disabled people; no idea where or why that came from
I would expect the people in your social circle to be sapient according to yourself, please see my initial point about selecting the ones you surround yourself with
tic-tac-toe is a solved game, so it would be expected for a computer to always win or tie; that says more about the game itself though
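for what it's worth, here's a minimal sketch of how little machinery "perfect play" takes: a plain negamax search in Python (purely illustrative, obviously not the actual Tinkertoy machine's mechanism):

```python
# Tic-tac-toe via negamax: exhaustively searches the game tree, so it never loses.
# Board is a list of 9 cells holding 'X', 'O', or ' '.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
             (0, 3, 6), (1, 4, 7), (2, 5, 8),
             (0, 4, 8), (2, 4, 6)]

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def best_move(board, player):
    """Return (score, move) from `player`'s point of view: 1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w is not None:                        # the previous player just completed a line
        return (1 if w == player else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None                       # board full: draw
    opponent = 'O' if player == 'X' else 'X'
    best = (-2, None)
    for m in moves:
        board[m] = player
        score, _ = best_move(board, opponent)
        board[m] = ' '
        if -score > best[0]:                 # opponent's best outcome is our worst
            best = (-score, m)
    return best

if __name__ == '__main__':
    board, player = [' '] * 9, 'X'
    while winner(board) is None and ' ' in board:
        _, move = best_move(board, player)
        board[move] = player
        player = 'O' if player == 'X' else 'X'
    print(''.join(board), '->', winner(board) or 'draw')   # perfect play on both sides: always a draw
```

run it and both sides play perfectly; the game always ends in a draw, which is kind of the point.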
You defined sapience as capacity for reasoning you absolute clown.
are you saying developmentally disabled people are incapable of reasoning? that's a bit rude of you...
oh do fuck off
that didn't take long
I can't tell if I'm specifically good at provoking people to invoke the banhammer upon their own heads because
1. I get serious and forget to keep it to snark, and then they do, too, and then I'm in a Someone Is Wrong On The Internet fight with someone who otherwise would have been a drive-by dickbag troll,
2. I post under an explicitly female name, or
3. I'm more of an asshole than I think I am.
Occam's razor says all three TBH.
regardless of the mechanism of action, you are very good at extracting comedically bannable posts from terrible people in record time. it’s very funny how quick their pseudointellectual schtick fell apart when they failed to process concepts like “humans aren’t NPCs” and “having developmentally disabled friends you treat with respect”
God, no, this incentive system is terrible. I'm turning into my dad and I'm getting stickers for it.
(Seriously now I am imagining my dad with the power to provoke people into publicly cancellable behavior with the power of very cranky letters to the editor. He'd have been unstoppable.)
nah, if anything this A.I. craze has made me appreciate how incredibly smart even the supposedly dimmest of humans are. we can use language of our own volition, to create meaning. in fact we frigging invented it!!! we're just bloody amazing, to hell with misanthropy.
Pace my blog post, these last few years have shown diminishing returns on "AI":
So... the idea here is that OpenAI and friends are gonna charge you N bucks a month so you can have chat conversations with the average internet user? Spoiler alert: that service is already free.
For $150 they save you the inconvenience of finding the style of twit you wish to interact with, and will dress up to whatever twit your heart desires!
In the beginning... I can't wait to see what happens to their pricing when they believe they've locked enough people in and shift from VC subsidy to actual customer-carried charges. Bet it's gonna be real popular...
Apparently the electric power generation in the US is under strain because of all the AI server farms being feverishly built by entrepreneurs with FOMO. The bill is gonna come due some day, especially if Joe and Jill Sixpack can't afford to cool their beer because of some egghead generating pr0n.
What is your source for this? Cryptocurrency is the only single category of computation that's significant on its own at a world scale (131 TWh per year for Bitcoin alone, which is most of it); AI is expected to become significant as it gets more and more popular, but the last I saw it wasn't there yet.
My source is the Washington Post: https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/
tbf AI isn't quite there yet. If AI continues as it is and bitcoin doesn't somehow drop off, AI will be the worse of the two by 2027, IIRC.