I am not certain whether this issue is easily reproducible, but I believe posting it here so others can test it themselves may shed some light.
I have noticed recently that running DirectX 11 Legacy with a resolution scale of 200% makes GPU usage skyrocket. In my testing, DirectX 11 Legacy mode ordinarily keeps GPU usage around 30-40%, but raising the resolution scale (which normally acts as a form of anti-aliasing at virtually no performance cost on a high-end machine) causes usage to climb rapidly to 80-95% until the GPU is fully saturated. Temperatures rise rapidly as well, until the computer is forced to shut down.
Again, I am not certain this is reproducible on other setups. My specs are listed below so others can test and compare:
- Processor: AMD Ryzen 7 3700X
- Graphics Card: NVIDIA GeForce RTX 3080
- Driver Version: 516.94
To reproduce and test:
- Change the Graphics API to DirectX 11 Legacy in System → Advanced
- Set Resolution Scale to 200% in System → Graphics
- Measure GPU utilization in Task Manager and/or temperatures in a separate program (I use Speccy); alternatively, see the logging sketch below this list
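If you would rather log the numbers than watch Task Manager, here is a minimal Python sketch that polls nvidia-smi once per second. It assumes an NVIDIA card with nvidia-smi on the PATH (standard with the GeForce drivers); the one-second interval and five-minute duration are arbitrary choices of mine, not part of the original test.

```python
# Minimal GPU monitoring sketch: logs utilization (%) and temperature (C)
# once per second by shelling out to nvidia-smi. Assumes an NVIDIA GPU
# with nvidia-smi available on the PATH.
import subprocess
import time

def sample_gpu():
    """Return (utilization %, temperature C) for GPU 0 via nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "--query-gpu=utilization.gpu,temperature.gpu",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    # Output looks like "37, 62"; split on the comma and parse both fields.
    util, temp = (int(v) for v in out.strip().split(", "))
    return util, temp

if __name__ == "__main__":
    # Sample for five minutes; duration is arbitrary, adjust as needed.
    for _ in range(300):
        util, temp = sample_gpu()
        print(f"GPU utilization: {util:3d}%  temperature: {temp} C")
        time.sleep(1)
```

Watching the log while toggling the resolution scale between 100% and 200% should make the jump from ~30-40% to 80-95% utilization easy to spot.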