this post was submitted on 08 Mar 2024
75 points (94.1% liked)

PC Gaming

all 15 comments
[–] [email protected] 16 points 8 months ago (1 children)

Ummm... ray tracing was invented on CPUs long before GPUs even existed.

[–] [email protected] 6 points 8 months ago

Before The Sims, I used to play with this program that was meant for architects designing homes, but 9-year-old me treated it like a game. It had a ray tracing button that, when pressed, would render a ray-traced view of the room you were in.

It took at least a couple of minutes on my dad's Windows 95 PC to render a single frame, and if you moved the camera at all, the view would switch back to raster and you had to re-render the scene all over again. But it was real ray tracing, performed on the CPU, in the 90s.
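
That kind of 90s renderer is conceptually tiny. As a rough sketch (not that program's actual code; every name and number here is made up for illustration), a single-sphere CPU ray tracer that writes out a PPM image fits in a few dozen lines of Python:

```python
import math

WIDTH, HEIGHT = 320, 240
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = (0.577, 0.577, -0.577)  # roughly unit length

def hit_sphere(origin, direction):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t (d is unit length).
    oc = tuple(o - sc for o, sc in zip(origin, SPHERE_CENTER))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

with open("frame.ppm", "w") as f:
    f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # One ray per pixel, from a camera at the origin through a plane at z = -1.
            u = (2 * x / WIDTH - 1) * WIDTH / HEIGHT
            v = 1 - 2 * y / HEIGHT
            norm = math.sqrt(u * u + v * v + 1)
            d = (u / norm, v / norm, -1 / norm)
            t = hit_sphere((0.0, 0.0, 0.0), d)
            if t is None:
                f.write("30 30 60\n")  # background color
            else:
                # Shade by the angle between the surface normal and the light.
                p = tuple(t * di for di in d)
                n = tuple((pi - sc) / SPHERE_RADIUS for pi, sc in zip(p, SPHERE_CENTER))
                shade = max(0.0, sum(ni * li for ni, li in zip(n, LIGHT_DIR)))
                g = int(40 + 215 * shade)
                f.write(f"{g} {g} {g}\n")
```

Scale that loop up to a whole room's worth of geometry, plus shadows and reflections, and you get exactly the minutes-per-frame behavior described above.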

[–] [email protected] 16 points 8 months ago

How is this news? They're called "general-purpose processors". You can literally run anything on them; it's even mathematically proven that they can do the job, and that's been known for nearly a century now.

Heck, most ray tracers run on the CPU...

[–] [email protected] 12 points 8 months ago (1 children)

"Step aside, AI, people can now draw with a pencil."

[–] [email protected] 9 points 8 months ago

"Pixar is so dead when we release coloring pens!"

[–] [email protected] 18 points 8 months ago (1 children)

Aaaaand we’ve gone full circle.

[–] [email protected] 10 points 8 months ago (1 children)

I can't wait until someone ports this to the GPU to speed it up!

[–] [email protected] 3 points 8 months ago (1 children)

Or, and hear me out here: maybe join TWO GPUs together for more performance!

[–] [email protected] 3 points 8 months ago (1 children)

More is always better. Why not a large rack of servers working individually on portions of the scene and individual frames of content?

[–] [email protected] 3 points 8 months ago

Sounds awesome. And maybe you could make use of a GPU or two in each of those servers when you’re not using them to mine crypto?

[–] [email protected] 34 points 8 months ago (3 children)

Nice, now we just need 80 CPUs and we won't need a GPU anymore!

[–] [email protected] 14 points 8 months ago (1 children)

Perfect! We'd have pretty low utilization on those 80 CPUs, though -- if we made them smaller, the power draw would be lower and it would be cheaper. We could then get away with adding more CPUs. It would then make sense to put the array of simple CPUs on its own card, dedicated to graphics processing... wait a minute.
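
Which is, in miniature, how CPU renderers already behave: hand scanlines out to every core and let them run independently. A sketch of that split (`trace_pixel` is a hypothetical stand-in for the real per-ray work):

```python
from multiprocessing import Pool
import math

WIDTH, HEIGHT = 320, 240

def trace_pixel(x, y):
    # Hypothetical stand-in for tracing one ray; here it just computes a pattern.
    return int(128 + 127 * math.sin(x * 0.05) * math.cos(y * 0.05))

def render_scanline(y):
    # Each worker process renders one full row of the image.
    return [trace_pixel(x, y) for x in range(WIDTH)]

if __name__ == "__main__":
    with Pool() as pool:  # defaults to one worker per CPU core
        frame = pool.map(render_scanline, range(HEIGHT))
    print(f"rendered {len(frame)} rows of {len(frame[0])} pixels")
```

Rows don't depend on each other, so the work splits cleanly across however many cores you have -- the same observation that led to thousands of simple shader cores on a dedicated card.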

[–] [email protected] 6 points 8 months ago

Maybe we could use a single huge heatsink to cool it off! We could even use 3 fans instead of 1!

[–] [email protected] 4 points 8 months ago

Not good for real-time rendering, but it still has potential for rendering 3D still scenes or the frames of a video, and a small studio might run those 80 CPUs in a render farm without worrying about supply issues for GPUs.
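
That frame-level split is also why farms scale so well: frames are independent, so each node just grabs the next one, with no coordination needed. A toy version of the dispatch (`render_frame` is a hypothetical placeholder for a full CPU render of one frame):

```python
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_number):
    # Hypothetical placeholder: a real farm node would ray trace the whole frame here.
    return f"frame_{frame_number:04d}.ppm"

if __name__ == "__main__":
    # On a real farm each worker is a separate machine; local processes
    # approximate the same frame-level split.
    with ProcessPoolExecutor() as farm:
        outputs = list(farm.map(render_frame, range(240)))  # 10 s of 24 fps video
    print(f"rendered {len(outputs)} frames")
```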