Ummm...ray tracing was invented for CPUs long before GPUs even existed.
Before The Sims, I used to play with this program that was meant for architects designing homes, but 9-year-old me treated it like a game. It had a ray tracing button that, when pressed, would render a ray-traced view of the room you were in.
It took at least a couple of minutes on my dad's Windows 95 PC to render a single frame, and if you moved the camera at all, the view would switch back to raster and you had to re-render the scene all over again. But it was real ray tracing, performed on the CPU, in the 90s.
How is this news? They're called "general-purpose processors". You can literally run anything on them. It's even mathematically proven that they can do the job; that's been known for nearly a century now.
Heck, most offline ray tracers still run on the CPU...
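For anyone who hasn't seen one: here's a minimal, purely illustrative CPU ray tracer in Python. One hard-coded sphere, one ray per character cell, ASCII shading. Nothing in it comes from any real renderer; it's just to show that ray tracing is ordinary arithmetic any general-purpose processor can do:

```python
import math

# Purely illustrative scene: one sphere at (0, 0, -3) with radius 1,
# camera at the origin, one ray per character cell, ASCII shading.
def trace(dx, dy, dz):
    # Ray-sphere intersection for a ray from the origin with unit direction d.
    # With f = origin - center = (0, 0, 3): a = 1, b = 2*(f . d) = 6*dz,
    # c = |f|^2 - r^2 = 8, so the discriminant is b^2 - 4c.
    b = 6.0 * dz
    disc = b * b - 32.0
    if disc < 0.0:
        return " "  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0  # nearest hit distance
    # "Headlight" shading: brightness from the angle between the ray and the
    # surface normal, which simplifies to -(t + 3*dz) for this scene.
    shade = max(0.0, -(t + 3.0 * dz))
    return ".:-=+*#%@"[min(8, int(shade * 9))]

W, H = 64, 32
for j in range(H):
    row = ""
    for i in range(W):
        dx = (i - W / 2) / W
        dy = (H / 2 - j) / W * 2.0  # terminal cells are ~2x taller than wide
        dz = -1.0
        n = math.sqrt(dx * dx + dy * dy + dz * dz)
        row += trace(dx / n, dy / n, dz / n)  # normalize, then fire the ray
    print(row)
```

Run it in any terminal and you get a shaded sphere, no GPU anywhere in sight.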
"Step aside, AI, people can now draw with a pencil."
"Pixar is so dead when we release coloring pens!"
Aaaaand we’ve gone full circle.
I can't wait until someone ports this to the GPU to speed it up!
Or, and hear me out here: maybe join TWO GPUs together for more performance!
More is always better. Why not a large rack of servers, each working on its own portion of the scene or its own frames of content?
Sounds awesome. And maybe you could make use of a GPU or two in each of those servers when you’re not using them to mine crypto?
Nice, now we just need 80 CPUs and we won't need a GPU anymore!
Perfect! We'd have pretty low utilization on those 80 CPUs, though -- if we made them smaller, the power draw would be lower and it would be cheaper. We could then get away with adding more CPUs. It would then make sense to put the array of simple CPUs on its own card, dedicated to graphics processing... wait a minute.
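Joking aside, the reason the rack-of-servers idea actually works is that ray tracing is embarrassingly parallel: every pixel is computed independently. A toy sketch (same illustrative one-sphere scene as above, with scanlines fanned out via Python's standard-library multiprocessing pool):

```python
from multiprocessing import Pool
import math

W, H = 64, 32

def render_row(j):
    # One scanline of the same illustrative one-sphere scene as above.
    # Every pixel (and therefore every row) is independent, which is why
    # ray tracing splits so cleanly across cores, CPUs, or whole servers.
    row = ""
    for i in range(W):
        dx, dy, dz = (i - W / 2) / W, (H / 2 - j) / W * 2.0, -1.0
        n = math.sqrt(dx * dx + dy * dy + dz * dz)
        dx, dy, dz = dx / n, dy / n, dz / n
        disc = 36.0 * dz * dz - 32.0  # simplified ray-sphere discriminant
        if disc < 0.0:
            row += " "  # ray misses the sphere
            continue
        t = (-6.0 * dz - math.sqrt(disc)) / 2.0  # nearest hit distance
        row += ".:-=+*#%@"[min(8, int(max(0.0, -(t + 3.0 * dz)) * 9))]
    return row

if __name__ == "__main__":
    # Fan the scanlines out to a pool of worker processes, then reassemble
    # the frame in order: a render farm in miniature.
    with Pool() as pool:
        print("\n".join(pool.map(render_row, range(H))))
```

Swap the process pool for machines on a network and you've reinvented the render farm; swap it for thousands of tiny in-order cores on one card and, well... wait a minute.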
Maybe we could use a single huge heatsink to cool it off! We could even use 3 fans instead of 1!
Not good for real-time rendering, but it still has potential for rendering 3D still scenes or individual frames of a video. A small studio might run those 80 CPUs as a render farm and never have to worry about GPU supply issues.