this post was submitted on 31 Jan 2024
501 points (97.0% liked)

Technology


AMD’s new CPU hits 132fps in Fortnite without a graphics card: also gets 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 with the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.

top 50 comments
[–] [email protected] 15 points 7 months ago* (last edited 7 months ago) (1 children)

$US330 for the top 8700G APU with 12 RDNA 3 compute units (compared to 32 RDNA 3 CUs in the Radeon RX 7600). And it only draws 88W at peak load and can be passively cooled (or overclocked).

$US230 for the 8600G with 8 RDNA 3 CUs. It falls about 10-15% short of 8700G performance in games, but shows a much bigger gap in CPU performance (per Tom's Hardware benchmarks), so I'm pretty meh on that one.

Given the higher costs of AM5 boards and DDR5 RAM, for about the same money or $100-200 more than an 8700G build you could combine a cheaper CPU with a better GPU and get way more bang for your buck. But I see the 8700G being a solid option for gamers on a budget, or for parents wanting to build younger kids their first cheap-but-effective PC.

I also see this as a lazy man's solution to building small form factor mini-ITX home theatre PCs that run silent and don't need a separate GPU to receive 4K live streams. I'm exactly in this boat right now: I really don't wanna fiddle with cramming a GPU into some tiny box, but I also don't want some piece-of-crap iGPU in case I use the HTPC for some light gaming from time to time.

[–] [email protected] 5 points 7 months ago

It'll be a great upgrade for those little NUC-like things, thin laptops, and Steam Deck competitors.

[–] [email protected] 22 points 7 months ago (1 children)

Mind you, it only gets those frame rates at low settings. While that's pretty damn impressive for an APU, it's still a very niche product at this point, and I don't see it getting all that much traction myself.

[–] [email protected] 5 points 7 months ago (2 children)

I think the opposite is true. Discrete graphics cards are on the way out; SoCs are the future. There are just too many disadvantages to having a discrete GPU and CPU, each with its own RAM. We'll see SoCs catch up and eventually overtake PCs with discrete components, especially with the growth of AI applications.

[–] [email protected] 2 points 7 months ago (1 children)

People will be building dedicated AI PCs.

[–] [email protected] 2 points 7 months ago

They may build dedicated PCs for training, but those models will be used everywhere. All computers will need to have hardware capable of fast inference on large models.

[–] [email protected] 1 points 7 months ago (1 children)

I agree, especially with the prices of graphics cards being what they are. The 8700G can also fit in a significantly smaller case.

[–] [email protected] 3 points 7 months ago

Unified memory is also huge for the performance of AI tasks, especially with more specialized accelerators being integrated into SoCs. CPU, GPU, neural engine, video encoders/decoders: they can all access the same RAM with zero overhead. You can decode a video, have the GPU preprocess the frames, then feed them to the neural engine for whatever kind of ML task, without being limited by the low bandwidth of the PCIe bus or the latency of copying data back and forth. A minimal sketch of that round trip is below.
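A minimal sketch (PyTorch, with a hypothetical 4K frame size) of the copy overhead being described here: on a discrete GPU, every stage that crosses the CPU/GPU boundary pays for a PCIe transfer, which is exactly what unified memory avoids.

```python
import time
import torch

frame = torch.empty(3, 2160, 3840)  # one "decoded" 4K RGB frame, sitting in CPU RAM

if torch.cuda.is_available():
    t0 = time.perf_counter()
    gpu_frame = frame.to("cuda")     # host -> device copy over PCIe
    gpu_frame.mul_(1.0 / 255.0)      # "preprocessing" step on the GPU
    back = gpu_frame.to("cpu")       # device -> host copy back for the next stage
    torch.cuda.synchronize()
    print(f"round trip incl. two PCIe copies: {time.perf_counter() - t0:.4f}s")

# On a unified-memory SoC (an APU with shared RAM, or Apple silicon),
# CPU, GPU, and NPU can in principle work on the same buffer, so the
# two copies above simply disappear from the pipeline.
```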

My predictions: Nvidia is going to focus more and more on the high-end AI market with dedicated AI hardware while losing interest in the consumer market. AMD already has APUs; the next logical step is to move towards full SoCs. Apple is already in that market and seems to be getting serious about their GPUs, so I expect big improvements there in the coming years. No clue what Intel is up to, though.

[–] [email protected] 29 points 7 months ago

For people like me who game once a month, and mostly play stupid little games, this is great news. I bet many people could use this; it would reduce demand for graphics cards and let those who actually want one buy them cheaper.

[–] [email protected] 30 points 7 months ago (1 children)

Oh, ok. I thought one of the new Threadrippers was so powerful that the CPU could do all those graphics in software.

[–] [email protected] 24 points 7 months ago (1 children)

It's gonna take decades to be able to render 1080p CP2077 at an acceptable frame rate with just software rendering.

[–] [email protected] 0 points 7 months ago (1 children)

It's all software, even the stuff on the graphics cards: the rasterisers, shaders and so on. In fact, graphics cards are extremely good at running these (relatively) simple programs in an absolutely staggering number of threads at the same time, which has been taken advantage of by both Bitcoin mining and neural-net algorithms like GPT and Llama.

[–] [email protected] 6 points 7 months ago

It's a shame you're being downvoted; you're not wrong. Fixed-function pipelines haven't been a thing for a long time, and shaders are software.

I still wouldn't expect a Threadripper to pull off software-rendering a modern game like Cyberpunk, though. Graphics cards have a ton of dedicated hardware for things like texture decoding or ray tracing, and a CPU would need to waste even more cycles doing those in software.
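To make the "shaders are software" point concrete, here's a toy rasteriser in plain Python/NumPy (illustrative only, not how any real driver works): filling a triangle is just an edge-function test per pixel, and a GPU's trick is running that same small program across thousands of threads at once.

```python
import numpy as np

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: positive if point p lies to the left of edge a->b
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def rasterise_triangle(img, v0, v1, v2, colour):
    # Evaluate all three edge functions for every pixel at once
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    w0 = edge(v1[0], v1[1], v2[0], v2[1], xs, ys)
    w1 = edge(v2[0], v2[1], v0[0], v0[1], xs, ys)
    w2 = edge(v0[0], v0[1], v1[0], v1[1], xs, ys)
    img[(w0 >= 0) & (w1 >= 0) & (w2 >= 0)] = colour  # pixel inside all three edges

frame = np.zeros((240, 320, 3), dtype=np.uint8)  # tiny framebuffer
rasterise_triangle(frame, (40, 200), (280, 200), (160, 20), (255, 80, 0))
```

This is the part a CPU can emulate just fine; it's the fixed-function extras (texture filtering, ray-triangle intersection) plus the sheer thread count that make doing it on a CPU so much slower.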
