No, it turns out that lying to the consumer about old tech is profitable.
Hatebait. Adds nothing informative to the thread.
It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.
Price cuts and “slim” models used to be possible thanks to die shrinks. A console might have launched on 100nm, and then a process improvement comes along that lets it be made on 50nm, meaning roughly four times as many chips per wafer (area scales with the square of the feature size) and much lower power usage and heat generation. This allowed smaller and cheaper revisions (rough math below).
Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.
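To make the wafer math concrete, here's a minimal Python sketch, assuming a hypothetical 200 mm² console chip on a 300 mm wafer and an idealized full halving of the feature size; the numbers are illustrative, not taken from any real console.

```python
# Rough illustration of why die shrinks enabled "slim" console revisions.
# All figures are hypothetical and idealized; real shrinks never scale this cleanly.
import math

WAFER_DIAMETER_MM = 300          # standard wafer size
ORIGINAL_DIE_AREA_MM2 = 200.0    # hypothetical console SoC on the "100nm" node

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = WAFER_DIAMETER_MM) -> int:
    """Crude dies-per-wafer estimate, ignoring edge loss and defects."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

# Halving the linear feature size ideally quarters the die area,
# so the same design yields roughly 4x as many dies per wafer.
shrunk_die_area = ORIGINAL_DIE_AREA_MM2 / 4

print("dies per wafer, original node:", dies_per_wafer(ORIGINAL_DIE_AREA_MM2))
print("dies per wafer, after shrink: ", dies_per_wafer(shrunk_die_area))
```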
Which is itself a gimmick; they've mostly just made the gates taller (fin-style 3D structures), because electron leakage would happen otherwise.
The “nm” figure has been a marketing gimmick since Intel launched its long-running 14nm node. Actual transistor density is all over the place depending on which fab you compare.
It's now just the name of a process, not a measure of how small the transistors actually are (see the rough density figures below).
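For a sense of how far the label and the physics have drifted apart, here's a quick Python comparison using approximate, commonly cited public density estimates (millions of transistors per mm²); the exact figures vary by source and cell library, so treat them as ballpark only.

```python
# Approximate published logic-density estimates (MTr/mm^2).
# Ballpark public figures; the point is only that the marketing "nm" label
# no longer maps onto a physical feature size or a consistent density.
approx_density_mtr_per_mm2 = {
    "Intel 14nm":  37,
    "TSMC N7":     91,
    "Intel 10nm": 100,
    "TSMC N5":    138,
    "TSMC N3":    190,
}

for node, density in approx_density_mtr_per_mm2.items():
    print(f"{node:>10}: ~{density} MTr/mm^2")
```

Note how Intel's "10nm" lands denser than TSMC's "7nm": the names simply aren't comparable across fabs.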
I haven't paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market isn't exciting to me anymore; I don't even want to talk about GPUs.
Back in the late '90s and early 2000s, upgrades felt substantial and exciting; now it's all same-same with some minor power-efficiency gains.
This article doesn't factor in the new demand that is gobbling up CPU and GPU production: AI server farms. Nvidia, for example, which once made graphics cards mainly for gamers, has been trying to keep up with global demand for AI hardware. The whole market is different now; then toss tariffs and the rest on top.
I wouldn't blame the death of Moore's law; technology is still advancing, but, as usual, driven by demand.
technology is still advancing
Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing which models it compares in its benchmarks, or by advertising raw performance without power figures.
AI has nothing to do with it. Die shrinks were the reason for “slim” consoles and big price drops in the past. Die shrinks are basically a thing of the past now.
Not exactly, but smaller nodes are getting really expensive. So they could still make a "slim" version with a lower-power chip, but it would likely cost more than the original (rough numbers below).
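A back-of-the-envelope sketch of that trade-off, with purely illustrative wafer prices and die counts (not quoted foundry figures): even if the shrink doubles the dies per wafer, a steep enough wafer price increase makes each chip more expensive than before.

```python
# Hypothetical cost-per-die comparison across a node shrink.
# Wafer prices and die counts are illustrative assumptions only.
OLD_WAFER_COST = 6_000     # assumed mature-node wafer price (USD)
NEW_WAFER_COST = 17_000    # assumed leading-edge wafer price (USD)

OLD_DIES_PER_WAFER = 300
NEW_DIES_PER_WAFER = 600   # twice as many dies after the shrink

old_cost_per_die = OLD_WAFER_COST / OLD_DIES_PER_WAFER   # $20.00
new_cost_per_die = NEW_WAFER_COST / NEW_DIES_PER_WAFER   # ~$28.33

print(f"old node: ${old_cost_per_die:.2f} per die")
print(f"new node: ${new_cost_per_die:.2f} per die")
```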
Wtf, that headline is fucking backwards thinking and capitalistic. If you're not greedy and don't have unnecessarily high standards that don't actually make a game better, apparently you're the problem. Sorry not sorry, but gamers are entitled to demand, and the companies are at fault here.