this post was submitted on 03 Nov 2024
270 points (98.9% liked)

Technology

Panther Lake and Nova Lake laptops will return to traditional RAM sticks

(page 2) 24 comments
[–] [email protected] 40 points 3 days ago* (last edited 3 days ago) (3 children)

I don't think Lunar Lake was a "mistake" so much as a reaction. Intel couldn't make a competitive laptop chip to go up against Apple and Qualcomm. (There is a very weird love triangle between the three of them /s.) Intel had to go to TSMC to get a chip to market that satisfied this AI Copilot+ PC market boom (or bust). Intel doesn't have the ability to make a competitive chip in that space (yet), so they had to produce Lunar Lake as a one-off.

Intel is very used to just giving people chips and forcing them to conform their software to the available hardware. We're finally in the era where the software defines what the CPU needs to be able to do. This is probably why Intel struggles. Their old market-dominant strategy doesn't work in the CPU market anymore, and they've found themselves on the back foot. Meanwhile, new devices where the hardware and software are deeply integrated in design keep coming out, while Intel is still swinging for the "here's our chip, figure it out yourselves" crowd.

In contrast to their desktop offerings, Intel's server offerings show that they get it. They want to give you the right chips for the right job, with the right accelerators.

He's not wrong that GPUs in the desktop space are going away, because SoCs are inevitably the future. This isn't because the market has demanded it or because of some sort of conspiracy; it's that we literally can't get faster without chips getting smaller and closer together.
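A rough back-of-the-envelope sketch of that "closer together" point (my own numbers, purely illustrative): at 5 GHz a clock cycle lasts 0.2 ns, and even at the speed of light a signal only covers about 6 cm in that time; on real copper traces it's roughly half that.

t_{cycle} = \frac{1}{5\,\mathrm{GHz}} = 0.2\,\mathrm{ns}, \qquad d_{max} \approx c \cdot t_{cycle} = (3 \times 10^{8}\,\mathrm{m/s})(0.2\,\mathrm{ns}) = 6\,\mathrm{cm}

Putting memory and compute on the same package is one of the few remaining ways to shave that distance down.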

Even though I'm burnt out on Nvidia and the last two CPUs and GPUs I've bought have been all AMD, I'm excited to see what Nvidia and MediaTek do next, as this SoC future has some really interesting upsides to it. Projects like Asahi Linux's Proton work and Apple's GPTK2 have shown me the SoC future is actually right around the corner.

Turns out, the end of the x86 era is a good thing?

[–] [email protected] 3 points 3 days ago

They'd have to try to revive the idea they had with the Intel Core i7-8709G first, though.

[–] [email protected] 106 points 3 days ago (2 children)

coming up next: Intel fires 25% of their staff, CEO gets a quarterly bonus in the millions

[–] [email protected] 6 points 3 days ago (1 children)

I see the idea of Intel dropping Arc as good news for AMD. Intel was going to chip away at AMD's market share well before Nvidia's. It would be better to have more competition, though.

[–] [email protected] 17 points 3 days ago (2 children)

AMD would never close their GPU department, because they sell their APUs for the Xbox, PlayStation, and Steam Deck.

[–] [email protected] 9 points 3 days ago

Blaming the losses on SoCs? Lmfao. SoCs are better. Just stop offering a lower tier and make all SoCs 32 GB+.

… looking at you too, Apple.

[–] [email protected] 33 points 4 days ago (1 children)

And here I was thinking Arc and storage were the only semi-competitive wings of Intel... They just needed a couple of years for adoption to increase.

[–] [email protected] 16 points 3 days ago* (last edited 3 days ago) (3 children)

I've commented many times that Arc isn't competitive, at least not yet.
Although they were decent performers, they used twice the die size for similar performance compared to Nvidia and AMD, so Intel has probably sold them at very little profit.
Still, I expected them to try harder this time, because the technologies needed to develop a good GPU are strategically important in other areas too.
But maybe that's the reason Intel recently admitted they couldn't compete with Nvidia on high-end AI?

[–] [email protected] 10 points 3 days ago (8 children)

Arcs are OK, and the competition is good. Their video encode performance is absolutely otherworldly though, just incredible.

Mostly, they help bring the iGPU graphics stack and performance up to par, and keep games targeting them well. They're needed for that alone, if nothing else.

[–] [email protected] 3 points 3 days ago (3 children)

Yeah, true. Plus I bought my A770 at pretty much half price during the whole driver-issues saga, so I eventually got a 3070-performing card for like $250, which is an insane deal for me, but there's no way Intel made anything on it after all the R&D and production costs.

The main reason Intel can't compete is that CUDA is both proprietary and the industry standard. If you want to use a CUDA-based library on anything else, you have to translate it yourself, which is kind of inconvenient, and no datacentre is going to go for that.
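To make that lock-in concrete, here's a minimal sketch of my own (not anything Intel or Nvidia actually ships): even a trivial CUDA kernel is written against NVIDIA-only constructs, so running it on an Arc card means rewriting it in SYCL/oneAPI or pushing it through a translation layer.

// Minimal CUDA vector add; illustrative only. The __global__ qualifier, the
// <<< >>> launch syntax and the cuda* runtime calls are all NVIDIA-specific,
// which is exactly the portability problem described above.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // CUDA thread indexing
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    cudaMallocManaged(&a, n * sizeof(float));  // would become something like
    cudaMallocManaged(&b, n * sizeof(float));  // sycl::malloc_shared in a
    cudaMallocManaged(&c, n * sizeof(float));  // oneAPI port
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }
    vec_add<<<(n + 255) / 256, 256>>>(a, b, c, n);  // CUDA-only launch syntax
    cudaDeviceSynchronize();
    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

Every CUDA-accelerated library a datacentre depends on is built from pieces like this, which is why "just translate it" isn't an attractive pitch.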

[–] [email protected] 94 points 4 days ago (5 children)

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

No wonder Intel is in such rough shape! Gelsinger is an idiot.

Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement for fast, dedicated memory attached to a parallel-processing/matrix-multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

The board needs to fire his ass ASAP and replace him with someone who has a grip on reality, or at least some imagination of how the future could be.

[–] [email protected] 5 points 3 days ago (1 children)

Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement for fast, dedicated memory attached to a parallel-processing/matrix-multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

Maybe the idea is to put it on the CPU/NPU instead? Hence them going so hard on AI processors in the CPU, even though basically nothing uses them.

[–] [email protected] 9 points 3 days ago (1 children)

But if he wants an NPU, then why not buff the iGPU too? Memory exclusive to the iGPU on the CPU package is a good boost. Look up the Intel Core i7-8709G: they put an AMD Radeon Vega iGPU on it with 4 GB of HBM exclusive to that iGPU, and it did wonders. Now that AMD is winning in the APU sector, they could utilise the same ideas they did in the past.

[–] [email protected] 69 points 4 days ago* (last edited 4 days ago) (2 children)

Gelsinger said the market will have less demand for dedicated graphics cards in the future.

Reminds me of decades ago, when Intel didn't bother getting into graphics because they said CPUs would soon be powerful enough for high-performance graphics rendering lmao

The short-sightedness of Intel absolutely staggers me.

[–] [email protected] 47 points 3 days ago* (last edited 3 days ago) (2 children)

CPUs would be powerful enough for high-performance graphics rendering lmao

And then they continued making 4-core desktop CPUs, even after phones were at deca-core. 🤣🤣🤣

[–] [email protected] 25 points 4 days ago

It's been the same "vision" since the late 90s - the CPU is the computer and everything else is peripherals.

[–] [email protected] 9 points 4 days ago* (last edited 4 days ago) (3 children)

I'm wondering: could the same performance as the integrated RAM Intel used for Lunar Lake be achieved with the latest CAMM modules? The only real way integration pays off is doing it with HBM; anything else seems like a bad trade-off.

So either you go HBM, with real bandwidth and latency gains, or CAMM, with decent performance and upgradeable RAM modules. But on-package RAM like Intel used provides neither the HBM performance nor the CAMM modularity.
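Some rough peak-bandwidth arithmetic makes that gap concrete (my own estimates, assuming Lunar Lake's on-package LPDDR5X-8533 on a 128-bit bus and a single HBM2 stack like the one Kaby Lake-G used):

\mathrm{BW}_{peak} \approx \frac{\text{per-pin rate} \times \text{bus width}}{8}

\text{LPDDR5X-8533, 128-bit:}\quad \frac{8533\,\mathrm{Mb/s} \times 128}{8} \approx 136\,\mathrm{GB/s}

\text{HBM2, one 1024-bit stack at }1.6\,\mathrm{Gb/s/pin:}\quad \frac{1.6\,\mathrm{Gb/s} \times 1024}{8} \approx 205\,\mathrm{GB/s}

So moving LPDDR onto the package mostly buys power and board-space savings, not HBM-class bandwidth.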

[–] [email protected] 2 points 3 days ago

Look up the Intel Core i7-8709G.

[–] [email protected] 13 points 4 days ago (1 children)

I wonder why both aren't possible: build some into the chip, but also leave some DIMM slots for upgradeability at a bit lower speed.

[–] [email protected] 4 points 4 days ago* (last edited 4 days ago)

The transfer speed isn't the big issue; it's the density and reliability. Packing more heat-generating stuff onto the SoC package just makes it harder to dissipate. The data still has to travel to the same places, so the trade-off is pretty much a wash in that sense, apart from a reduction in overall power consumption.

[–] [email protected] 119 points 4 days ago (1 children)

Reverting to RAM sticks is good, but shutting down the GPU line isn't. The GPU market needs more competitors, not fewer.

[–] [email protected] 28 points 4 days ago (7 children)

Intel can't afford to keep making GPUs because it doesn't have the reliable CPU side to soak up the losses. The GPU market has established players, and Intel, besides being a big name, didn't bring much to the table to build a place for itself in the market. Outside of good Linux support (so I've heard, but haven't personally used), the Intel GPUs don't stand out for price or performance.

Intel is struggling with its very existence and doesn't have the money or time to explore new markets when their primary product is cratering their own revenue. Intel has a very deep problem with how it is run and will most likely be unable to survive as-is for much longer.

[–] [email protected] 21 points 3 days ago (5 children)

It boggles the mind that AMD realized the importance of GPUs 20 years ago when they bought ATI and in all that time Intel still doesn’t have a competitive GPU.

[–] [email protected] 25 points 4 days ago* (last edited 3 days ago) (3 children)

As a Linux user of an Intel Arc card, I can safely say that the support is outstanding. In terms of price to performance, I think it's pretty good too. I mainly enjoy having 16 GB of VRAM without spending $450-$500+ to get that amount like with Nvidia. I know AMD also has cards around the same price with that amount of VRAM too, though.
