So glad I refunded this game during the beta. It was poorly reworked; it should have been rebuilt from scratch on a completely new game engine.
I’m seeing a whole lot of buyer’s remorse on the forums.
Well, it was fun while it lasted. For some reason my GPU is now back up to 100% utilization and over 80 degrees Celsius. It was fine before, hovering around 70% utilization and 65 degrees Celsius.
The latest update must have broken something.
Limit your framerate; the RTX 2xxx and 3xxx cards have heat issues anyway, and D2R will just PUSH and PUSH if you don’t cap your FPS…
I did. It’s capped at 60 FPS.
Also, it’s interesting that D3 pushes it to about 90% utilization (sometimes as high as 98%), but the temps never go past 55 degrees Celsius. It’s cool as a cucumber.
So it’s not so much the utilization as the fact that the temps go so high. There’s something really buggy about how this game was optimized.
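If anyone else wants to check whether the update actually changed anything, here’s a minimal logging sketch using NVIDIA’s own Python bindings (assumes an NVIDIA card and `pip install nvidia-ml-py`; everything else is standard library). Run it in a second window while the game is up:

```python
# Log GPU utilization and temperature once a second via NVML.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # percent
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"GPU util: {util:3d}%   temp: {temp} C")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```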
Is it a good thing if it is using close to 100%? I’m not sure the visuals justify it, but on my end I have a very good case with great airflow, so mine sits around ~65 °C, which is normal.
My main issue now is that since my upgrade to Alder Lake, I have to disable all E-cores to prevent the game from stuttering/crashing.
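If you’d rather not flip E-cores off in the BIOS, one workaround people try is pinning the game to P-cores only. A rough sketch with psutil; the process name D2R.exe and the assumption that logical CPUs 0–15 are the P-cores are mine, so check your own core layout first:

```python
# Pin a running game to the performance cores only (may need an elevated prompt).
import psutil

P_CORES = list(range(16))  # ASSUMPTION: logical CPUs 0-15 are P-cores; adjust!

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "D2R.exe":  # ASSUMPTION: game process name
        proc.cpu_affinity(P_CORES)      # restrict scheduling to these CPUs
        print(f"Pinned PID {proc.pid} to logical CPUs {P_CORES}")
```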
When idle, it uses 100% of the GPU. I think they’re mining bitcoin to make more money off us.
Of course; after all, you’re paying for the performance. Unless you limit the framerate, a game will try to pump out as many frames as possible, limited by either the CPU (really, a single core) or the GPU; one of those will be the bottleneck. As long as you have good cooling, running either or both at 100% is fine.
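To illustrate: a frame cap just makes the loop sleep off whatever’s left of each frame’s time budget instead of immediately rendering the next one. A toy sketch, not how D2R does it internally, just the general idea:

```python
# Toy frame limiter: without the sleep, this loop runs flat out and pegs
# whichever of CPU/GPU is slower; with it, utilization drops to what's needed.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame

while True:
    start = time.perf_counter()
    # ... game update + render would happen here ...
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)  # give the leftover time back
```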
Agreed, and my concern is more about the temperatures than the utilization %. Like I said, Diablo 3 hits a similar utilization rate but runs 30 degrees cooler. How does that happen?
Because D3 isn’t as graphically demanding as D2R. It is CPU-bound, or more accurately core-bound, since it runs mainly on a single core. As a result, the GPU doesn’t run at 100% unless you have an extremely fast CPU and a slow GPU.
As far as temps go, that depends on your cooling system. My GPU runs at around 55–60 °C at full load.
Yeah, my temps were around 60 degrees too before the most recent update; after it, they shot back up to over 80 degrees, like when the game first came out. I checked all the settings again and nothing changed, so I have to assume the update broke something.
Edit: It gets even better. Since I turned off Hardware-Accelerated GPU Scheduling, the utilization percentage is 25%, but GPU temps are still over 80 °C. LOL, this is such a poorly optimized game.
I reach over 90-degree temps in this game as well, and only in this game. It’s insane how hot D2R runs.
Just wanted to provide an update. I tried turning off Hardware-Accelerated GPU Scheduling, and now the GPU utilization percentage is 25% but GPU temps are still over 80 °C.
What is going on? I have never in my life encountered such weird behaviour from a single game on any system, PC or console.
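One thing worth double-checking: the HAGS toggle only takes effect after a reboot, so the Settings page can disagree with what the driver is actually doing. Here’s a quick sketch that reads the live flag from the registry; the value name (HwSchMode, 2 = on, 1 = off) is what Windows 10 2004+ is generally reported to use, so treat it as an assumption:

```python
# Read the Hardware-Accelerated GPU Scheduling flag from the Windows registry.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    value, _ = winreg.QueryValueEx(key, "HwSchMode")  # DWORD (assumed name)

print("HAGS is", {1: "off", 2: "on"}.get(value, f"unknown ({value})"))
```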
The original game was optimized for the Glide API. If a system didn’t have a Glide-based card, like the old Voodoo cards, it fell back to DirectDraw rendering on the CPU. As a result, the CPU would run warm.
I used to run the original D2 with Sven’s Glide wrapper, and that solved a lot of the problems with the game using the CPU for DirectDraw. The wrapper translated the game’s Glide calls so the graphics card did the work. In actual gameplay it ran very cool on both the CPU and the GPU, but once in the lobby, the CPU would crank up again.
I’m going to take a guess: this is the fundamental issue with this “remaster”. Whoever was in charge of this mess completely bypassed redoing the old Glide code and thought they could simply upscale the graphics into OpenGL.
WRONG!
This is a software-architecture issue and will never be fixed. Have fun roasting your GPUs.
Have you tried disabling the Nvidia in-game overlay (GeForce Experience → Settings)?
That’s an interesting analysis. I mean, under 85 °C is still within safety limits. Obviously still not ideal, and it just bugs me that a game like this makes the GPU work so hard. I don’t expect this from a AAA company.
It fits what is actually being observed. That said, 85 °C sustained is seriously overkill on a GPU; at those temps, it won’t last long, guaranteed.
I have an air-cooled GTX 1060 and a water-cooled i7-7700. D2R is the only game that has turned my little ITX box into a room heater. The PSU is a Corsair 750 W dual-rail, rated at 80% efficiency.
No other game I play pushes my CPU over 50 °C, not one. No other game pushes my GPU over 55 °C, not one. Most of the time my CPU, while gaming, runs between 35 °C and 45 °C; same with the GPU.
Test the old D2 using Sven’s Glide wrapper and you’ll see clearly what I’m talking about.
It’s almost as if the Glide API was completely overlooked, so the game defaults to DirectDraw, and that really crappy API is upscaled into OpenGL. Seriously bad; won’t be fixed, ever.
People are still buying it, literally and figuratively.
I knew as soon as I installed the beta and played for 10 minutes that something was seriously wrong with the code.
I can only imagine what D4 will be like.
I have an update for everyone. I can’t believe I didn’t think of this before, but I just tried running D2R on my second monitor, and guess what? The GPU didn’t go crazy; in fact, the fans didn’t even come on. So it’s somehow monitor-related.
My monitors are as follows:
Edit: and with that, I solved it. I theorized that upscaling to 4K resolution was making the GPU go crazy with no effect on actual graphics quality. So I set my 4K monitor to 1920x1080 and, yep, I was right: now the GPU behaves perfectly.
Maybe it’s like that for others as well… don’t run the game in 4K resolution.
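For what it’s worth, the numbers back this up: 4K is exactly four times the pixel count of 1080p, so per-frame pixel work roughly quadruples at the same framerate. Quick arithmetic check:

```python
# Pixels per frame, and per second at a 60 FPS cap, for each resolution.
for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    px = w * h
    print(f"{name}: {px:,} px/frame, {px * 60:,} px/s at 60 FPS")

print(3840 * 2160 / (1920 * 1080))  # -> 4.0
```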
Yea and make sure your ethernet cord and power cord are correctly up to date and plugged in…
Okay, I’ll bite. The Glide API was the old video interface that D2 ran on? Was it like an old video-card driver or something?
My point is: could something be done driver-wise to solve this issue? Which would ultimately be up to Nvidia/AMD to fix, or no?
Sadly, this is a DX12 issue… D2R needs to add DX11 and Vulkan support; I don’t know why this isn’t a thing.
DX12 has a real knack for over-utilising the GPU with god-awful temps.