[–] [email protected] 16 points 1 month ago (3 children)

I'm considering it, but only just. My 5800X is good enough for most gaming, which is GPU-bound anyway, and I run a dual-Xeon rig for my workstation.

Zen 2-4 took care of a lot of the demand; we all have 8-16 cores now. What else could they give us?

[–] [email protected] 3 points 1 month ago

what else could they give us?

AI!!!!!!!!

^^/s

[–] [email protected] 5 points 1 month ago (1 children)

They do still seem to be making advances in single-core performance, but whether it matters to most people is a different question. Most people aren't using software that would benefit that much from these generation-to-generation performance improvements. It's not going to be anywhere near as noticeable as when we went from 2 or 4 cores to 8, 16, 24, etc.

[–] [email protected] 5 points 1 month ago (1 children)

Single-thread is really hard. We've basically saturated the L1 working-set size; adding more doesn't help much. Trying to extend the vector length just makes physical design harder, and that reduces clock speed. The predictors are pretty good, and Apple finally kicked everyone up the ass to widen out-of-order (OOO) execution like they should have.
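
As a rough illustration of the working-set point (my own sketch, not something from the thread), here's a pointer-chasing micro-benchmark in C: per-access latency jumps once the buffer outgrows L1, and again past L2/L3. The buffer sizes, the PRNG, and the iteration counts are arbitrary choices.

```c
/*
 * Illustrative sketch only: a pointer-chasing micro-benchmark.
 * Per-access latency jumps once the working set outgrows L1,
 * and again past L2/L3. Build with: cc -O2 chase.c
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Small xorshift64 PRNG so we don't depend on RAND_MAX. */
static unsigned long long rng = 0x243F6A8885A308D3ULL;
static size_t xrand(void)
{
    rng ^= rng << 13;
    rng ^= rng >> 7;
    rng ^= rng << 17;
    return (size_t)rng;
}

/* Chase a single random cycle through n slots; each load depends on
 * the previous one, so the prefetcher can't hide the misses. */
static double ns_per_access(size_t n, size_t iters)
{
    size_t *next = malloc(n * sizeof *next);
    if (!next) { perror("malloc"); exit(1); }

    for (size_t i = 0; i < n; i++)
        next[i] = i;
    /* Sattolo's algorithm: produces one big cycle, not many small ones. */
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = xrand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    struct timespec t0, t1;
    size_t idx = 0;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < iters; i++)
        idx = next[idx];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    if (idx == (size_t)-1)        /* keep the loop from being optimized away */
        puts("unreachable");
    free(next);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    return ns / (double)iters;
}

int main(void)
{
    /* Working sets from 4 KiB (fits in L1) up to 64 MiB (well past L3). */
    for (size_t bytes = 4096; bytes <= (64u << 20); bytes *= 2) {
        size_t n = bytes / sizeof(size_t);
        printf("%8zu KiB: %6.2f ns/access\n",
               bytes / 1024, ns_per_access(n, 20 * 1000 * 1000));
    }
    return 0;
}
```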

Also, software still kind of sucks. It's better than it was, but we need to improve it; the bloat is just barely being handled by silicon gains.

Flash was the epochal change. Maybe we get some new form of hybrid storage, but that doesn't seem likely right now. Apple might do it to cut costs while preserving performance; actually, yeah, I can see them trying to have their cake and eat it too.

Otherwise I don't know. We need a better way to deal with GPUs; there's nothing else that can move the needle, except true heterogeneous core clusters, but I haven't been able to sell that to anyone so far. They all think it's a great idea that someone else should do.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago) (2 children)

Also, software still kind of sucks. It's better than it was, but we need to improve it; the bloat is just barely being handled by silicon gains.

The incentives are all wrong for this, except in FOSS. It's never going to be a priority for Microsoft, because everyone is used to the (lack of) speed of Windows, and "now a bit faster!" isn't a great marketing line. And it's not in the interest of hardware companies, which need to keep shifting new boxes, for software to stop bogging down each new generation. So we end up stuck with proprietary bloatware everywhere.

[–] [email protected] 4 points 1 month ago (1 children)

"what intel gives, microsoft takes away"

dates from the mid 90s, still relevant.

[–] [email protected] 2 points 1 month ago

Let's be fair, MS was vastly outrunning Intel for a long time; it's only slowed down recently. Now the problem isn't single-thread bloat so much as an absolute lack of multicore scaling for almost all applications except some games, and even then Windows fights as hard as it possibly can to stop you, as AMD just proved yet again.

[–] [email protected] 1 points 1 month ago

Yes, mostly the applications aren't there. If you need real CPU power (or GPU, for that matter), you're running Linux or in the cloud.

But we are reaching a point where the desktop either has to be relegated to the level of an embedded terminal (i.e. an ugly tablet, before it's dropped altogether) or make the leap to a genuine compute tool, and I fear we're going to see the former.

[–] [email protected] 5 points 1 month ago (1 children)

I have a 5900X and honestly don't see any need for an upgrade anytime soon.

A new CPU would maybe give me like 10 fps more in games, but a new GPU would do more. And I don't think the CPU will be a bottleneck in the next few years.

[–] [email protected] 4 points 1 month ago (1 children)

Even beyond that, short of something like Blender, Windows just can't handle that kind of horsepower; it's not designed for it, and the UI bogs down fairly fast.

Linux, OTOH, I find can eat as much CPU as you throw at it, though many graphics applications often start bogging down the X server for me.

So I have a Windows machine with the best GPU but a passable CPU, and a Linux box with a decent workstation GPU and insane CPU power.

[–] [email protected] 1 points 1 month ago (1 children)

What is your problem with Windows, though?

[–] [email protected] 1 points 1 month ago (1 children)

Meh, it's not nearly as configurable as Linux; some things you just can't change.

NFS knocks SMB into a cocked hat.

You start spending more time in a terminal on Linux, because you're not just dealing with your own machine; you're always connecting to other machines and using their resources to do things. Yeah, a terminal on Windows makes a difference, and I ran Cygwin for a while, but it's still not clean.

Installing software sucks: you're either downloading installers or using the little that goes through a store. Not that building from source is much better, but on Linux most stuff comes from distro repos now.

Once I got LXC containers, though (actually, once I tried FreeBSD), I lost my Windows tolerance. Being able to construct a new, effectively separate "OS" with a few keystrokes is incredible: install programs there, even graphical ones, with no trace on your main system. There's just no answer to that.

Also, Plasma is an awesome DE.

[–] [email protected] 2 points 1 month ago (1 children)

Ah, OK, I thought you were talking about Windows not being able to run the CPU at full speed. But yes, it's certainly a different OS with its ups and downs.

[–] [email protected] 1 points 1 month ago

Well, it can't run multithreaded jobs at full speed.

Exhibit A: the latest AMD patch for multicore scheduling across NUMA nodes.
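
To make that concrete, here's a minimal Linux sketch (my own, not related to the AMD patch) of the kind of placement control a NUMA-aware scheduler has to get right, done here by hand: pinning each worker thread to a fixed CPU so its working set stays in that core's caches and local memory node. NWORKERS and the dummy workload are arbitrary assumptions.

```c
/*
 * Minimal Linux sketch: pin each worker thread to a fixed CPU so the
 * kernel doesn't migrate it across cores or NUMA nodes.
 * Build with: cc -O2 -pthread pin.c
 */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

#define NWORKERS 4          /* assumes at least 4 logical CPUs */

static void *worker(void *arg)
{
    long cpu = (long)arg;

    /* Restrict this thread to exactly one CPU; the kernel won't migrate it. */
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET((int)cpu, &set);
    if (pthread_setaffinity_np(pthread_self(), sizeof set, &set) != 0) {
        fprintf(stderr, "could not pin worker to CPU %ld\n", cpu);
        return NULL;
    }

    /* Stand-in for real work; it now runs entirely on `cpu`. */
    volatile double x = 0.0;
    for (long i = 0; i < 200000000L; i++)
        x += (double)i * 1e-9;

    printf("worker on CPU %ld done (x = %.3f)\n", cpu, x);
    return NULL;
}

int main(void)
{
    pthread_t tid[NWORKERS];
    for (long i = 0; i < NWORKERS; i++)
        pthread_create(&tid[i], NULL, worker, (void *)i);
    for (int i = 0; i < NWORKERS; i++)
        pthread_join(tid[i], NULL);
    return 0;
}
```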