this post was submitted on 17 Jul 2024
689 points (99.1% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

top 50 comments
[–] [email protected] 6 points 1 month ago

Bro, just add it to the pile of rubbish over there next to the 3D movies and curved TVs

[–] [email protected] 1 points 1 month ago

Predictable outcome, common tech company L.

[–] [email protected] 6 points 1 month ago

The other 16% either don't know what AI is or are trying to sell it. A combination of both is possible. And likely.

[–] [email protected] 17 points 1 month ago

I'm willing to pay extra for software that isn't

[–] [email protected] 21 points 1 month ago (2 children)

Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to a near-baseline level of original performance? What then, eh?

[–] [email protected] 9 points 1 month ago

I already moved to Linux. Windows is basically doing this already.

[–] [email protected] 8 points 1 month ago

One word. Linux.

[–] [email protected] 21 points 1 month ago (4 children)

Who in the heck are the 16%?

[–] [email protected] 1 points 1 month ago

I would if the hardware was powerful enough to do interesting or useful things, and there was software that did interesting or useful things. Like, I'd rather run an AI model to remove backgrounds from images or upscale locally, than to send images to Adobe servers (this is just an example, I don't use Adobe products and don't know if this is what Adobe does). I'd also rather do OCR locally and quickly than send it to a server. Same with translations. There are a lot of use-cases for "AI" models.

[–] [email protected] 5 points 1 month ago

I'm interested in hardware that can better run local models. Right now the best bet is a GPU, but I'd be interested in a laptop with dedicated AI chips that would work with PyTorch. I'm a novice, but I know it takes forever on my current laptop.

Not interested in running copilot better though.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago)

Maybe people doing AI development who want the option of running local models.

But baking AI into all consumer hardware is dumb. Very few want it. SaaS AI is a thing. To the degree SaaS AI doesn't offer the privacy of local AI, networked local AI on devices you don't fully control offers even less. So it makes no sense for people who value convenience, and it offers no value for people who want privacy. It only offers value to people doing software development who need more playground options, and I can go buy a graphics card myself, thank you very much.

[–] [email protected] 16 points 1 month ago (3 children)
  • The ones who have investments in AI

  • The ones who listen to the marketing

  • The ones who are big Weird Al fans

  • The ones who didn't understand the question

[–] [email protected] 1 points 1 month ago
  • The nerds that care about privacy but want chatbots or better autocomplete
[–] [email protected] 5 points 1 month ago

Those Weird Al fans will be very disappointed

[–] [email protected] 10 points 1 month ago

I would pay for Weird-Al enhanced PC hardware.

[–] [email protected] 9 points 1 month ago

A big letdown for me is that, except in some rare cases, those extra AI features are useless outside of AI. Some NPUs are straight-up DSPs that could easily run OpenCL code; others are either designed to handle only machine-learning number formats rather than normal floating point, or are CPU extensions that are just even bigger vector multipliers for select datatypes (AMX).
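For illustration of the datatype point: bfloat16, a common machine-learning-only format, is just a float32 with the lower 16 bits truncated (same 8-bit exponent, only 7 mantissa bits). A stdlib-only Python sketch of the round-trip and its precision loss:

```python
import struct

def f32_to_bf16_bits(x: float) -> int:
    """Truncate a float32 to bfloat16 by keeping only the top 16 bits."""
    (bits,) = struct.unpack(">I", struct.pack(">f", x))
    return bits >> 16

def bf16_bits_to_f32(bits: int) -> float:
    """Widen bfloat16 bits back to float32 by zero-filling the low mantissa bits."""
    (x,) = struct.unpack(">f", struct.pack(">I", bits << 16))
    return x

x = 3.14159
y = bf16_bits_to_f32(f32_to_bf16_bits(x))
print(x, "->", y)  # the 7-bit mantissa keeps only ~2-3 decimal digits
```

(Real hardware typically rounds to nearest rather than truncating, but the storage format is the same, which is why a unit built only for bfloat16 can't double as a general floating-point unit.)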

[–] [email protected] 39 points 1 month ago (1 children)

I am generally unwilling to pay extra for features I don't need and didn't ask for.

[–] [email protected] 5 points 1 month ago

Raytracing is something I'd pay for even unasked, assuming it meaningfully impacts quality and doesn't demand outlandish prices.
And they'd need to put it in unasked and cooperate with devs, or else it won't catch on quickly enough.
Remember Nvidia Ansel?

[–] [email protected] 5 points 1 month ago (1 children)

As with any proprietary hardware on a GPU, it all comes down to third-party software support, and classically, if the market isn't there, then it's not supported.

[–] [email protected] 3 points 1 month ago

Assuming there's no catch-on after 3-4 cycles, I'd say the tech is either not mature enough, too expensive for too little result, or (as you said) there's generally no interest in it.

Maybe it needs a bit of maturing and a re-introduction at a later point.

[–] [email protected] 11 points 1 month ago

I can't tell how good any of this stuff is, because none of the language they're using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, and how does it compare to the graphics cards people use for AI now?
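One rough way to cut through the marketing numbers: the memory needed just to hold a model's weights is parameter count times bytes per parameter, so whether a chip can run a given model at all mostly comes down to how much RAM it can feed it. A quick back-of-envelope helper:

```python
def model_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """Rough memory needed just to hold the weights (ignores KV cache and activations)."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1024**3

# A 7B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{model_memory_gb(7, bits):.1f} GiB")
```

By that estimate a 7B model needs about 13 GiB at 16-bit but only about 3.3 GiB at 4-bit, before counting KV cache and activations, which is why quantization largely decides what fits on consumer hardware.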
