this post was submitted on 28 Feb 2025

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] Zetta@mander.xyz 0 points 1 month ago (5 children)

I don't think the bubble will pop as long as "AI" keeps advancing at the pace it is. LLMs, text to photo, and text to video models are getting exponentially better year over year. I really don't think the hype train is going to slow down until the rate of progress slows as well, and so far there aren't any indications that the rate of progress is going to slow.

Wild guess here that I'm sure you and others will disagree with: even when the bubble "pops," it won't really be a pop but more of a downturn that doesn't actually hurt any of the big players significantly.

Only time will tell; it will certainly be interesting to watch as an outsider no matter what.

[–] dgerard@awful.systems 0 points 1 month ago (1 children)

The Information's Dealmaker newsletter is talking today about downrounds in AI venture funding

[–] froztbyte@awful.systems 0 points 1 month ago

downrounds in AI venture funding

sickos meme image

[–] sc_griffith@awful.systems 0 points 1 month ago* (last edited 1 month ago)

LLMs, text to photo, and text to video models are getting exponentially better year over year.

it is 2025 how are you still saying this shit

picture of cat looking very tired

[–] Amoeba_Girl@awful.systems 0 points 1 month ago

LLMs, text to photo, and text to video models are getting logarithmically better year over year.

[–] unexposedhazard@discuss.tchncs.de 0 points 1 month ago (1 children)

This article is literally about the fact that progress has stagnated...

There are clearly fundamental issues with the approach they have been using for LLMs. There is no new data left; the entire internet has been scraped already. All that's left is incremental improvements in the way they process the data.

[–] Ledivin@lemmy.world 0 points 1 month ago* (last edited 1 month ago) (1 children)

This article is literally about the fact that progress has stagnated...

OpenAI is stagnating, and has been for at least a few months now. The AI industry as a whole has only continued to accelerate, especially with the new blood that is DeepSeek coming into play.

[–] Amoeba_Girl@awful.systems 0 points 1 month ago (1 children)

I hear ethereum is going to solve all of bitcoin's problems

[–] o7___o7@awful.systems 0 points 1 month ago (1 children)

Wait until we get the Strategic Ape Reserve, then you'll see!

[–] froztbyte@awful.systems 0 points 1 month ago
[–] KitB@feddit.uk 0 points 1 month ago

In my experience they've significantly tailed off over the past year; exponential growth would mean the amount they get better per unit time itself increases over time. What has gotten better is our ability to run the same level of models on cheaper hardware with less power, again just in my limited experience. (Also, increasing increments aren't the definition of exponential growth, just a property of it. Polynomial growth has the same property.)
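The parenthetical point above can be sketched numerically (a hypothetical illustration, not from the thread): both exponential and polynomial curves show year-over-year gains that keep growing, so increasing increments alone don't demonstrate exponential progress — only a constant ratio between successive increments does.

```python
# Increasing per-step gains are a property of exponential growth,
# but polynomial growth shares it; only a constant growth *ratio*
# distinguishes the exponential case.

def increments(f, n):
    """Successive differences f(t+1) - f(t) for t = 0..n-1."""
    return [f(t + 1) - f(t) for t in range(n)]

exponential = increments(lambda t: 2 ** t, 5)  # [1, 2, 4, 8, 16]
quadratic = increments(lambda t: t ** 2, 5)    # [1, 3, 5, 7, 9]

# Both sequences of increments are strictly increasing...
assert all(b > a for a, b in zip(exponential, exponential[1:]))
assert all(b > a for a, b in zip(quadratic, quadratic[1:]))

# ...but only the exponential one grows by a constant ratio.
ratios = [b / a for a, b in zip(exponential, exponential[1:])]
print(ratios)  # [2.0, 2.0, 2.0, 2.0]
```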