Nvidia got what it wanted from Ageia when it bought PhysX, and that was improvements to CUDA.
I've had enough of NVIDIA to the point I'm not planning on playing anything on one of their GPUs ever again.
Lol keep buying Nvidia!
Are there really any 32-bit era games that your CPU can't handle, especially if you have a $1k+ GPU? This post is honestly pretty misleading, as it implies modern versions of PhysX don't work, when they actually do.
That being said, it doesn't make all that much sense as a decision. Doubles are rare in most GPU code anyway (as they are very slow); NVIDIA is just being lazy and doesn't want to write the drivers for that.
Well, at least you aren't on Mac, where 32-bit things just don't launch at all... (I think they might be playable through Wine, but since macOS Catalina dropped 32-bit support, even x86 Macs can't natively run any 32-bit games or software, so games like Portal 2 or TF2 just stopped working even though they had a macOS version)
Mirror's Edge drops to under 10 fps when breaking glass, which generates PhysX objects... with a 9800X3D.
The current PhysX CPU implementation is artificially shit; the CPU can easily handle it nowadays, but it depends on skilled community members or Nvidia themselves to unshit it.
Hmm, I was not aware of that. I've seen (non-Nvidia-related) simulations with probably tens of thousands of rigid bodies running in real time on relatively old midrange CPUs, so it's pretty crazy that it's that slow.
nVidia doesn't really have that many successful unshits, historically speaking, do they?
The enshittification of green has begun
They laser off the vGPU feature from the chip just so you can't use it at the same time as another family member. They spend extra money to make it worse.
wdym?
Wow. I probably have played 4 or 5 on that entire list. And none of them in the past 5 or so years.
It's still a shitty thing to do for sure. Maybe there will be a new "thing" that starts getting used instead? Ray tracing has gotten way more coverage than PhysX ever did, and imo is like 3% as good or interesting.
Physics actually have gameplay interactions that matter. Ray tracing looks nice, but is so absolutely expensive computationally that (imo) it's not even CLOSE to being worth the effort of turning on, even with compatible hardware.
Give us better physics, games! My main time sink rn is Rocket League, and that game is literally nothing but physics. Mostly simple physics, but stuff behaving in a logical way makes my brain a lot happier than better lighting ever did.
I like when tall grass became an actual object that could be moved around by players, or when tossing an item on the ground actually shows it tossed down and colliding with other objects while reacting to them appropriately (as in fire starting, or weight holding something down a certain amount). That stuff is potentially game-creating, definitely feature-defining.
Has anything AT ALL been affected by "pretty lights" beyond making them pretty? If it has, I've never heard of it.
Keep games about a gameplay experience, not just a visual feast. Save that tech for movies or playable stories (i.e. Telltale-type games); focus on the gameplay experience otherwise. Toss in some ray tracing when you can, but NEVER at the expense of physics. It just doesn't make any sense.
I actually wasn't, no. Planning to ride this 30-series out for about a decade.
It's too bad the CPU path for PhysX is crappy. It would be a good use of the many cores/threads we have available to us these days.
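To be fair, the modern SDK will at least take as many worker threads as you hand it. A rough sketch against the PhysX 4/5-era C++ API (setup details from memory, not a drop-in implementation):

```cpp
// Sketch: size the PhysX CPU dispatcher to the machine's core count.
// Assumes the PhysX 4/5 SDK is installed and linked.
#include <PxPhysicsAPI.h>
#include <thread>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Leave one core for the game's main thread, give the solver the rest.
    unsigned hw = std::thread::hardware_concurrency();
    unsigned workers = hw > 1 ? hw - 1 : 1;

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(workers);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;

    PxScene* scene = physics->createScene(sceneDesc);
    // ... add actors, then per frame: scene->simulate(1.0f / 60.0f); scene->fetchResults(true);
}
```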
My understanding is 32-bit PhysX games are broken.
64-bit compiled games are fine.
No, the card is broken. Only suitable for newer games.
I'm too poor to worry about this. My wife bought eggs recently
So you had an egg in these trying times, did you?
DECEARING EGG
My wife had to start laying her own.
I'm so sorry you needed eggs
The eggs have insane physics reactions though. So I got that going for me.
Guess I'll have to dust off my old dedicated PhysX card from the mid-2000s. Shit... I think that thing is AGP, not even PCIe 🤔
PhysX cards were PCI. Motherboards only had one AGP slot and that was for the GPU.
> It only ever got deployed in a few dozen games

is the only sentence in the entire article you need to be aware of.
This is rage-bait.
This is a list of the games it affects:
- Monster Madness: Battle for Suburbia
- Tom Clancy's Ghost Recon Advanced Warfighter 2
- Crazy Machines 2
- Unreal Tournament 3
- Warmonger: Operation Downtown Destruction
- Hot Dance Party
- QQ Dance
- Hot Dance Party II
- Sacred 2: Fallen Angel
- Cryostasis: Sleep of Reason
- Mirror's Edge
- Armageddon Riders
- Darkest of Days
- Batman: Arkham Asylum
- Sacred 2: Ice & Blood
- Shattered Horizon
- Star Trek DAC
- Metro 2033
- Dark Void
- Blur
- Mafia II
- Hydrophobia: Prophecy
- Jianxia 3
- Alice: Madness Returns
- MStar
- Batman: Arkham City
- 7554
- Depth Hunter
- Deep Black
- Gas Guzzlers: Combat Carnage
- The Secret World
- Continent of the Ninth (C9)
- Borderlands 2
- Passion Leads Army
- QQ Dance 2
- Star Trek
- Mars: War Logs
- Metro: Last Light
- Rise of the Triad
- The Bureau: XCOM Declassified
- Batman: Arkham Origins
- Assassin's Creed IV: Black Flag
- Borderlands: The Pre-Sequel
I played Mirror's Edge a bit. The only part of PhysX in the game that I remember (I didn't finish it) was that there were some random curtains that would blow in the wind and weren't placed anywhere they would actually matter.
Mirror's Edge actually had a place with tons of broken glass falling down, where the framerate would drop into the single digits if it used CPU PhysX. I remember that because it shipped with an outdated PhysX library that would run on the CPU even though I had an Nvidia GPU, so I had to delete the game's bundled PhysX library to force it to use the version from the graphics driver and get playable performance. If you didn't have an Nvidia card, you would need to disable PhysX for that segment to be playable.
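For anyone curious, the trick amounted to moving the game's bundled PhysX DLLs out of the way so the loader falls back to the copies installed with the graphics driver. A minimal sketch of that, assuming a hypothetical install path; the exact DLL names vary by game and PhysX version, so treat them as guesses (renaming rather than deleting keeps it reversible):

```cpp
// Sketch: back up a game's bundled PhysX DLLs so it picks up the
// system-wide (newer) copies shipped with the Nvidia driver instead.
// The directory and DLL names below are illustrative assumptions.
#include <filesystem>
#include <iostream>

int main() {
    namespace fs = std::filesystem;
    const fs::path gameDir = R"(C:\Games\Mirror's Edge\Binaries)";  // hypothetical path

    for (const char* dll : {"PhysXCore.dll", "PhysXLoader.dll", "PhysXCooking.dll"}) {
        const fs::path original = gameDir / dll;
        if (fs::exists(original)) {
            fs::rename(original, fs::path(original.string() + ".bak"));  // reversible rename
            std::cout << "Backed up " << original << '\n';
        }
    }
}
```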
Ah, the good old days 😂 having to manually fix drivers but with limited help from the internet
I disagree; people on the internet were a lot more helpful back then. These days it's difficult to get people to care about anything, let alone compel them to help.
The only part of PhysX in that game that I remember is that it used to cause massive performance and stability issues.
That list has some incredibly popular games on it... Hardly rage bait if you'll get worse performance in the greatest AC game to have come out.
CPU-accelerated physics were severely dumbed down to make PhysX look better, and there are several high-profile games on that list that will forever have stupidified physics because of corporate BS back then that still affects them now.
I play several of those games
This is an incomplete list. As per the wiki article:
PhysX in Video Games
PhysX technology is used by game engines such as Unreal Engine (version 3 onwards), Unity, Gamebryo, Vision (version 6 onwards), Instinct Engine, Panda3D, Diesel, Torque, HeroEngine, and BigWorld.
As one of the handful of major physics engines, it is used in many games, such as The Witcher 3: Wild Hunt, Warframe, Killing Floor 2, Fallout 4, Batman: Arkham Knight, Planetside 2, and Borderlands 2. Most of these games use the CPU to process the physics simulations.
Video games with optional support for hardware-accelerated PhysX often include additional effects such as tearable cloth, dynamic smoke, or simulated particle debris.
PhysX in Other Software
Other software with PhysX support includes:
- Active Worlds (AW), a 3D virtual reality platform with its client running on Windows
- Amazon Lumberyard, a 3D game development engine developed by Amazon
- Autodesk 3ds Max, Autodesk Maya, and Autodesk Softimage, computer animation suites
- DarkBASIC Professional (with DarkPHYSICS upgrade), a programming language targeted at game development
- DX Studio, an integrated development environment for creating interactive 3D graphics
- ForgeLight, a game engine developed by the former Sony Online Entertainment
- Futuremark's 3DMark06 and Vantage benchmarking tools
- Microsoft Robotics Studio, an environment for robot control and simulation
- Nvidia's SuperSonic Sled and Raging Rapids Ride, technology demos
- OGRE (via the NxOgre wrapper), an open source rendering engine
- The Physics Abstraction Layer, a physical simulation API abstraction system (it provides COLLADA and Scythe Physics Editor support for PhysX)
- Rayfire, a plug-in for Autodesk 3ds Max that allows fracturing and other physics simulations
- The Physics Engine Evaluation Lab, a tool designed to evaluate, compare, and benchmark physics engines
- Unreal Engine game development software by Epic Games. Unreal Engine 4.26 and onwards has officially deprecated PhysX.
- Unity by Unity ApS. Unity's Data-Oriented Technology Stack does not use PhysX.
That's misleading in the other direction, though, as PhysX is really two things, a regular boring CPU-side physics library (just like Havok, Jolt and Bullet), and the GPU-accelerated physics library which only does a few things, but does them faster. Most things that use PhysX just use the CPU-side part and won't notice or care if the GPU changes. A few things use the GPU-accelerated part, but the overwhelming majority of those use it for optional extra features that only work on Nvidia cards, and instead of running the same effects on the CPU if there's no Nvidia card available, they just skip them, so it's not the end of the world to leave them disabled on the 5000-series.
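For the curious, that split is visible in the SDK itself: the GPU path is an explicit opt-in on the scene, not something that kicks in automatically. A sketch against the PhysX 4/5-era API, with the exact names from memory, so treat the details as assumptions:

```cpp
// Sketch: opt a PhysX scene into GPU rigid-body dynamics, falling back
// to the CPU path when no usable CUDA device is present.
#include <PxPhysicsAPI.h>
#include <gpu/PxGpu.h>  // PxCreateCudaContextManager lives here in PhysX 4.x

using namespace physx;

void tryEnableGpuDynamics(PxFoundation& foundation, PxSceneDesc& sceneDesc) {
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cudaMgr =
        PxCreateCudaContextManager(foundation, cudaDesc, PxGetProfilerCallback());

    if (cudaMgr && cudaMgr->contextIsValid()) {
        sceneDesc.cudaContextManager = cudaMgr;
        sceneDesc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;  // solve rigid bodies on the GPU
        sceneDesc.broadPhaseType = PxBroadPhaseType::eGPU;     // GPU broad-phase as well
    } else if (cudaMgr) {
        cudaMgr->release();  // no valid CUDA context: stay on the CPU dispatcher
    }
    // If this function does nothing, the scene simulates on the CPU exactly as
    // before, which is why most PhysX titles never notice what GPU you have.
}
```

And that silent fallback is exactly what the affected 32-bit titles hit on the 5000-series: the extra effects either drop to the slow CPU path or get skipped.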
Yeah, and a great post too - because some of your points here just point out that everyone ELSE has deprecated PhysX as well. Unity and Unreal both dropped it long ago. It's basically a moot point for 99.9% of people playing games.
Instead of using a PPU on the GPU, most people have focused on GPGPU physics calculations. The idea behind PhysX was a difficult one to launch in the first place. Given that most chip real estate is going to these VPUs, I'm not surprised at all that they ditched the PPU for a more generalized approach.
I don't think there has ever been a PPU on the GPU. It did originally run on PPU cards by Ageia, but AFAIK PhysX on GPUs used CUDA GPGPU right from the start.
PhysX has just been a CUDA application for a long time; there hasn't been a dedicated PPU on any card in a very long time.
Well, sorta. Some engines like Unreal have indeed dropped PhysX (in fact, that's the only one in there listed as having dropped it), but there are some heavy hitters in there. Unity did not drop it as far as I know, but they have a separate version without it that's not made for games.
I also happen to know that ARMA 3, which is not on the list, is a heavy PhysX user. So I don't know how accurate any of our lists actually are.
My takeaway from this list is that if Nvidia follows suit with their AX series and other pro cards, they are going to lose significant market share with the CAD and CFD crowd, because those guys have 40-year-old codebases and they are not going to be happy about having to rewrite a subsystem.
The more you buy the more you save