Rumours started flying around the web that something had gone wrong between the game's testing and its launch, after expensive AMD gear started to pack a sad whenever it ran the title. Sure enough, when someone peeked under the bonnet, every single game-related INI file contained the line “bDisablePhysXHardwareSupport=False”.
This means that the AMD cards were being forced to try to run Nvidia’s PhysX, which is a bit like giving me an iPhone and expecting me to use it.
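For anyone wanting to poke at it themselves, the implicated switch lives in the game’s engine INI files; in principle, flipping it to True should tell Unreal Engine 3 to keep physics on the CPU path rather than looking for PhysX hardware. A sketch of what that entry would look like (the [Engine.Engine] section header is an assumption on our part; only the variable itself is confirmed to ship in the game’s files):

```ini
; Hypothetical excerpt from one of the game's engine INI files.
; The section header is an assumption; only the variable below is
; confirmed to appear in the shipped configuration.
[Engine.Engine]
; False = the engine may try to offload physics to PhysX hardware,
; which AMD cards cannot accelerate. True should keep physics on the CPU.
bDisablePhysXHardwareSupport=True
```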
ExtremeTech had a look under the bonnet using Perfmon, Windows’ built-in performance monitoring tool, to see if the setting made a difference. What was weird, though, was that the CPU kept spiking.
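ExtremeTech used Perfmon for this; as a rough illustration of the same measurement, here is a minimal Python sketch (using the third-party psutil package, our choice rather than anything ExtremeTech ran) that logs per-core CPU load once a second, so spikes like the ones described would show up in the output:

```python
import time

import psutil  # third-party: pip install psutil

# Sample per-core CPU utilisation once a second for ~30 seconds,
# the same basic reading Perfmon's "% Processor Time" counters give.
for _ in range(30):
    # interval=1.0 blocks for one second while measuring, so no sleep needed
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    stamp = time.strftime("%H:%M:%S")
    cores = " ".join(f"{load:5.1f}%" for load in per_core)
    print(f"{stamp}  {cores}")
    # Sustained near-100% readings on one or more cores while the game
    # renders would match the spiking behaviour described above.
```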
It did make a difference, around 14 per cent, but that is nowhere near enough to explain the ton of suck which Gears of War players saw. However, when it is configured like that, the GTX 980 Ti’s frame rate is roughly double that of the R9 Fury X (ExtremeTech benchmarked with ambient occlusion disabled, since that mode causes continual rendering errors on the AMD platform) and the CPU use doesn’t keep spiking the way it does with the AMD cards.
Nvidia’s own website says nothing about the game using PhysX. Given the age of this version of Unreal Engine 3, it’s possible that this is a variable left over from when Ageia owned the PhysX API; Unreal Engine 3 was the first game engine to feature Ageia support for hardware physics.
ExtremeTech said that all it had was evidence that the CPU usage pattern for the AMD GPU is different from that of the Nvidia GPU. There’s no clear evidence that PhysX is causing the problem, but the game runs unacceptably on Radeon hardware and this might be a factor.