this post was submitted on 09 Jun 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

 
top 45 comments
[–] [email protected] 0 points 3 months ago (4 children)

I feel like the current machine-learning gold rush is amazing from a technical perspective, but a lot of technophiles miss the real potential.

What we have are the first rickety engines of this technology. We're not going to build futurist masterpieces with the equivalent of a steam engine.

We have great new tools that we can use to further understand and optimize what we built, instead of just throwing more and more compute on top of our first design.

The human brain runs on about as much power as a light bulb (roughly 20 watts), so we know our current approach is massively inefficient. And recent research like Mamba shows there are still improvements to be made.
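
(Back-of-envelope, treating the exact figures as rough assumptions: the brain draws ~20 W, a single datacenter GPU is specced at several hundred watts, so a 10,000-GPU training cluster at ~700 W each burns about 7 MW, on the order of a few hundred thousand brains' worth of power.)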

And Anthropic's mechanistic interpretability work ("Mech Interp") shows there are ways to better understand neural networks and improve performance without relying solely on a "black box".

The tech has great potential, but the massive server farms being dedicated to it now are just crypto-style overhype and fear of missing out.

[–] [email protected] 0 points 3 months ago

did you even experience a single conscious thought while writing that? what fucking potential are you referring to? generating reams of scam messages and Internet spam? automating the only jobs that people actually enjoy doing? seriously, where is the thought?

[–] [email protected] 0 points 3 months ago

i implore the tech nerds to learn a modicum of biology before making very confident statements.

[–] [email protected] 0 points 3 months ago

Don't sell me potential if you're not a battery manufacturer.

[–] [email protected] 0 points 3 months ago

doesn't seem like it has great potential

[–] [email protected] 0 points 3 months ago (1 children)

cool graph what's the x axis

[–] [email protected] 0 points 3 months ago (1 children)
[–] [email protected] 0 points 3 months ago
[–] [email protected] 0 points 3 months ago

Surprised this isn't a bluecheck.

But maybe it's not visible.

[–] [email protected] 0 points 3 months ago (1 children)

Oh ez, that's only 17 orders of magnitude!

If we managed an optimistic pace of doubling every year, that'd only take... about 56 years. The last few survivors on desert world can ask it if it was worth it
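
(Working that out, with the 17-orders-of-magnitude figure taken as given: 10^17 = 2^(17 × log2 10) ≈ 2^56.5, so closing the gap needs roughly 56-57 doublings, i.e. about 56 years at one doubling per year.)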

[–] [email protected] 0 points 3 months ago

Rather amusing prediction that despite the obscene amount of resources being spent on AI compute already, it's apparently reasonable to expect to spend 1,000,000x that in the "near future".

[–] [email protected] 0 points 3 months ago (1 children)

that looks like someone used win9x mspaint to make a flag, fucked it up, and then fucked it up even more on the saving throw

[–] [email protected] 0 points 3 months ago (1 children)

Any vexillologist around to confirm this?

[–] [email protected] 0 points 3 months ago

vexologist here. This certainly is vexing.

[–] [email protected] 0 points 3 months ago (2 children)

Or: Let's not do that at a time when our energy consumption is literally killing the planet we live on.

[–] [email protected] 0 points 3 months ago

You don't understand, after we invent ~~god~~ AGI all our problems are solved. Now step into the computroniuminator, we need your atoms for more compute.

[–] [email protected] 0 points 3 months ago (1 children)

Yeah, I don't see why people are so blind to this. Computation is energy-intensive, and we have yet to optimize it for energy use. Yet, all the hopes...

[–] [email protected] 0 points 3 months ago

We do optimize; it's just that when you cut the energy per computation in half, you do twice the computations to iterate faster instead of using half the energy.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (3 children)

What these people don't realize is you're never gonna get AGI by just feeding a machine an infinite amount of raw data.

[–] [email protected] 0 points 3 months ago (2 children)

You sound very confident of that. Have you tried it?

[–] [email protected] 0 points 3 months ago

Yes, we know (there are papers about it) that for LLMs, every increase in capabilities requires exponentially more data to train. But don't worry, we've only consumed half the world's data training LLMs, still a lot of places to go ;).
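
(Roughly what the scaling-law papers find, hedging the exact exponents since they vary by study: loss falls as a power law in dataset size, L ∝ D^(-β) with β somewhere around 0.1-0.3, so each further constant-factor improvement in loss costs a constant multiple of extra data, about 10x per halving of loss at β ≈ 0.3, and those multiples compound.)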

[–] [email protected] 0 points 3 months ago

There might actually be nothing bad about the Torment Nexus, and the classic sci-fi novel “Don’t Create The Torment Nexus” was nonsense. We shouldn’t be making policy decisions based off of that.

wild

[–] [email protected] 0 points 3 months ago (3 children)

You’re right. We should move on to feeding it orphans.

[–] [email protected] 0 points 3 months ago

That reminds me, I wonder if all the mods have been updated for the new version of RimWorld yet.

[–] [email protected] 0 points 3 months ago

Check out my new startup at modestproposal.ai

[–] [email protected] 0 points 3 months ago

Oh, that’s why the orphan crushing machine exists. Completely realistic, actually.

[–] [email protected] 0 points 3 months ago (3 children)

Interesting. I recall a phenomenon by which inorganic matter was given a series of criteria and adapted to changes in that environment, eventually forming data which it then learned from over a period of millions of years.

It then used that information to build the world wide web in the lifetime of a single organism and cast doubt on others trying to emulate it.

But I see your point.

[–] [email protected] 0 points 3 months ago

there really is no limit on how bad an argument you types will leap to defend lol

[–] [email protected] 0 points 3 months ago (1 children)

Which at no point involved raw data. Layman's hubris.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (3 children)

Sorry, I don't necessarily agree with the other person, and the formation of organic compounds doesn't apply here anyway, but what would you call the sensory inputs that our brains filter and interpret?

[–] [email protected] 0 points 3 months ago

The sensory inputs are a continuous stream of environmental data.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago)

My dog does linear algebra every time he pees on a fire hydrant so that he only pees for the exact amount of time needed. Similarly, when I drain my bathtub, it acts as a linear algebra machine that calculates how long it takes for the water to drain through a small hole.

Is this a fun way to look at the world that allows us to more readily build computational devices from our environment? Definitely. Is it useful for determining what is intelligence? Not at all.

[–] [email protected] 0 points 3 months ago (1 children)

Could you not make these kinds of stupid arguments just to score debate points?

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (2 children)

No, I'm not arguing anything other than that our brains receive raw data as inputs because they do. Now since we're jumping to insults immediately, you can kindly fuck off. Toodle-doo!

[–] [email protected] 0 points 3 months ago (1 children)

Where the fuck was the insult? Wild

You’re the one making incoherent illogical driveby comments, clown

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (1 children)

Attacking me as stupid straight out of the gate was the insult, when all I said was "our brains process raw data as inputs". Falsify that if you want to argue. Now I'm very sorry you're not capable of understanding the point, but it isn't my problem. You can fuck off too, because I'm not here to instruct you in the English-comprehension equivalent of doing up your Velcro shoes, you genetic throwback.

[–] [email protected] 0 points 3 months ago

A masterful performance, an opus of ineptitude! Brava!

[–] [email protected] 0 points 3 months ago (3 children)

Yes, and it was a stupid argument, unrelated to the point being made (that evolution used this raw data to do things, thus raw data in LLMs will lead to AGI). You just wanted debate points for 'see, somewhere there is data in the process of things being alive', which is dumb gotcha logic that drags all of us down and makes it harder to have normal conversations about things. My reply was an attempt to make you see this, in the hope you would do better.

I didn't call you stupid, I called the argument stupid, but if the shoe fits.

[–] [email protected] 0 points 3 months ago

it was straight up "not even wrong"

[–] [email protected] 0 points 3 months ago

I didn't want "debate points", I wanted to know what you would call sensory inputs if not "raw data". That was completely independent of anything else, which I tried to make clear in my post, clarity you completely ignored in order to accuse me of making a stupid argument. I made a very specific effort to distance myself from the argument being made by the other poster, because I wanted to ask that one question and that question alone, so to be lumped in with it anyway is more than galling.

Example: you lot just want to lash out at internet strangers for asking an honest question because it's in the wrong context as far as you're concerned. Is that a fair characterisation of your intent? No? Same. So you can take your accusations of intellectual dishonesty, and this block, and fuck off.

[–] [email protected] 0 points 3 months ago (1 children)

No no see, since everything is information this argument totally holds up. That one would need to categorize and order it for it to be data is such a silly notion, utterly ridiculous and unnecessary! Just throw some information in the pool and stir, it’ll evolve soon enough!

[–] [email protected] 0 points 3 months ago (1 children)

The number of rocks in my garden is information. Yet, despite counting them all, I have not found AGI. So I must need more information than that.

Clearly, counting all the rocks in Wales should do it. So much counting.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago)

We must stop this man from ending the world via AGI. Ah Ah Ah.

[–] [email protected] 0 points 3 months ago (1 children)

humans are just like linear algebra when you think about it

[–] [email protected] 0 points 3 months ago

Not me dawg, I am highly non-linear (pls donate to my gofundme for spinal correction)