Diablo IV Login/Queue Issues - 6/8/2023

It is not Blizzard’s fault.

At least in my case, the issue was caused by my card’s thermal failsafe, built into Gigabyte’s BIOS, not working as it should, resulting in the card overheating.

Thankfully, my card didn’t die.

It would have died for sure at some point if I hadn’t been monitoring my GPU temps regularly and had just kept playing games without knowing my card was overheating.

It was just a matter of time until that happened.

The culprit is most probably the Gigabyte BIOS for the RTX 3000 series, at least partly, as it is known to cause fan issues in some cases. Here is a Reddit post from 3 years ago for reference:
https://www.reddit.com/r/gigabytegaming/comments/jv7bcr/gigabyte_rtx_3080_gaming_oc_fan_issues/

Gigabyte sadly hasn’t released an updated bios version that resolves the fan issues with the RTX 3000 series cards.

The solution to the problem is either applying a custom fan curve with MSI Afterburner before running a game, thus bypassing the Gigabyte bios fan curve (recommended way), or flashing a compatible BIOS for the same card from another vendor (eg ASUS) on your Gigabyte card (unorthodox way).
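For anyone wondering what a custom fan curve actually is under the hood: it’s just a piecewise-linear map from temperature to fan duty cycle, and Afterburner interpolates between the points you drag on its curve editor. A minimal sketch in Python, with hypothetical curve points (not Afterburner defaults):

```python
# A fan curve is a piecewise-linear map: temperature (°C) -> fan speed (%).
# The points below are hypothetical examples for illustration only.
CURVE = [(30, 30), (50, 40), (65, 60), (75, 80), (85, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate the fan % for a given core temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: maximum speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # midway between (65, 60) and (75, 80) -> 70.0
```

A “more aggressive” curve just means shifting those points up and to the left, so the fans ramp earlier and harder.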

If you have a Gigabyte RTX 3000 series card and you’re facing random gpu overheating, that’s the solution to your problem, at least for now.

And ALWAYS, I mean ALWAYS monitor your cpu + gpu temps with MSI Afterburner while playing games.

It can save you a lot of time, money and frustration.


Thanks for clearing things up for all those 3080Ti users, Anthony.

I’d like to mention again that I only used your video as a reference to my situation because it showed, at the 4:59 timestamp, that the resolution was locked to 4k. I had noticed just before my PC went into meltdown that I was locked to 4k resolution as well. I had all my graphical settings set to 1440p prior to the patch going live. The patch seemed to have changed all my graphical settings to 4k ones, which caused my GPU to overheat very quickly. It only took a few minutes for my GPU to burn up.

This would mean my (EVGA) Titan X SC would have had the correct BIOS, and the fan controller wouldn’t have been the culprit. This further gives me reason to believe it was the patch itself that caused my GPU to burn up.

Players shouldn’t have to go through all their graphical settings after every patch is deployed, since there are so many these days. I’m a 1440p gamer, and will always be a 1440p gamer.

Well done investigating! This post should be stickied.


Always glad to help, if I can.

I see where you’re coming from, new patches shouldn’t mess with our graphics settings, and they definitely should not be locking our resolution to something other than what we have set ourselves. And as @DTMACe said:

That’s absolutely true. I even mention this in my video, where I say that this game shouldn’t be maxing out 12GB cards and that Blizzard should definitely look into the very high VRAM usage, for example.

However, let’s not forget that technical issues like these are quite common in a beta state, and that’s why it’s called a beta: it’s the phase when the software is generally feature-complete but likely to contain several known or unknown bugs.

That’s exactly why I said that the culprit is most probably the Gigabyte BIOS for the RTX 3000 series AT LEAST PARTLY, because the issue seems to trigger as the combined result of several factors (possibly a software conflict with some other program, e.g. RGB software).

After all, the issue affected mostly Gigabyte RTX 3080 Ti cards, but NOT EXCLUSIVELY; other cards were affected as well, just not to the same extent.

And that is why I said to ALWAYS (and I mean ALWAYS) monitor your CPU + GPU temps with MSI Afterburner while playing games.

In your case, and in order to be able to play the game without issues and without worrying about possibly damaging your gpu, you should monitor your gpu vitals (temp, usage, frequency, fan speed) at all times with MSI Afterburner and set a custom fan curve before running the game.

Also, you should remove the Diablo IV Prefs file from the Diablo IV folder in the Documents location.

And yes, normally you shouldn’t have to do all this just to be able to play a game, but as I always say, if there is something on my end that I can do to save me time, money and frustration, I’ll definitely do it.

Better be safe than sorry. :wink:


I usually always monitor my temps with RivaTuner. I was monitoring them in the first open beta and everything was working fine at 1440p. It was also working fine in the second beta, up until the patch rolled out, locked me to 4k, and nuked my GPU in a matter of minutes.

I think I will wait until ladders or seasons are added to the game before attempting to play again. At that point we can assume the game will be balanced, all the bugs will be fixed, and the game will be out of beta. And, most importantly, it will be safe to play without worrying about it trashing expensive hardware.

Well, I guess it wouldn’t hurt to wait.

I hope that this whole thing comes to an end soon. I have a bad feeling that we haven’t seen the worst of it yet, and I really, really hope that I am wrong, but we do have one fact: many very expensive GPUs are getting bricked, some of them fail far more frequently than others, and someone has to take responsibility.

This won’t just go away on its own.

Sadly, a lot of people, even hardcore Diablo fans, will just stay away from D4 out of fear of damaging their hardware, which is a pity really, because it is not the game’s fault that all these GPUs are failing.

Personally I am 100% sure about that.

But until we have a definitive answer to this (as in the case of New World, where certain GPU manufacturers took the actual blame for the bricked GPUs), and until we know for sure what exactly is happening, who is primarily responsible, and to what extent, who can really blame all these people for being extra cautious and trying to protect their expensive hardware?

Until then, here is my last, updated post, which contains and explains everything we know so far regarding the issue; in fact, there is some new info towards the end of the post:

I wanted to update this.

I went ahead and installed MSI Afterburner for my MSI 3070 TI - X Gaming Trio.

Just by changing the fan curve to be a bit more aggressive, I managed to drop the hot-spot temps from 170F down to 155F over prolonged play time, with all other variables the same.

Simply by increasing the fan speeds with a more aggressive curve. I’m talking roughly 50% fan at 150F, instead of the less than 40% it defaulted to, and topping out at around 80% above 180F!
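For readers who think in Celsius, those Fahrenheit figures convert with the standard formula C = (F − 32) × 5⁄9:

```python
def f_to_c(f):
    """Convert a Fahrenheit temperature to Celsius."""
    return (f - 32) * 5 / 9

for f in (150, 155, 170, 180):
    print(f"{f}F = {f_to_c(f):.1f}C")
# 150F = 65.6C, 155F = 68.3C, 170F = 76.7C, 180F = 82.2C
```

So the hot spot dropped from about 76.7°C to about 68.3°C, both well within spec for a 3070 Ti, with the curve ramping to ~50% fan around 65.6°C.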

So yeah. A bit more aggressive fan curve (this card has 3 fans), still relatively quiet, and now the card is running cooler.


Glad to know that your card is running better and cooler by applying a custom fan curve with MSI Afterburner. It is the way to go, not just for D4 but for all games actually. Good job, brother! :wink:

It’s unfortunate, honestly. This game has been in development for so long, you’d expect it to mesh well with all the systems, specs, and hardware out there. It might be that it was developed for consoles and ported over to PC, which might be causing some incompatibility issues and inconsistencies to a certain extent.

It’s cheaper for companies to develop games this way now. The future of PC gaming might be at stake if developers keep taking this less complicated and cheaper route to save time and money.

I’m almost certain Blizzard doesn’t do internal testing anymore. I think they leave it to the public to find and post any technical and/or bug-related problems, which is honestly bad for business. Ultimately, the players have paid full price to play a beta for the coming weeks or even months.

Blizzard used to tell us “when it’s ready” when people asked about their next big title. It appears that phrase is a thing of the past, and they launch games with their fingers crossed.

I already have a definitive answer as to why my GPU melted so quickly. It was just simply locked to 4k along with all those new shiny 4k settings which my GPU just wasn’t designed for.

The EVGA Titan X SC was actually the first 4k-ready GPU on the market. When I bought it, there were only 4k monitors at 30Hz available to the public. I tested it at 4k a few times; the frame rates weren’t the greatest, so I always stuck with 1440p.

I’m still looking forward to that e-mail from Blizzard.

This has been stated otherwise by the devs themselves. I still don’t know why people have to continue to keep pushing that theory.

The correct statement would be, “I’m not sure if they test everything I think they should be testing.”

Bottom line, your post reads more like a rant than anything else.

And yet playing Diablo 4 maxes out any card’s VRAM because of a memory leak or whatever. So if this software can max out the GPU’s VRAM while keeping the core cool, the GPU fans most likely won’t speed up with the stock fan profile, and that will cause the card to overheat and burn up. That means software is perfectly capable of damaging hardware components. Prove me wrong.

Any card? All the time? Every system?

Are you sure?

You do realize you are contradicting yourself here?

It cannot keep the core cool and overheat. lol

I ran mine for about 2+ hours this afternoon, running a hardware monitor on the system in the background.

Things I noticed:

Voltage hit 1.081V
GPU usage hit a high of 98.0%
Mem usage hit 99.6%
Frame Buffer: 49.0%
Bus Interface: 7%

Temps were (°F):
138.2 - GPU
161.6 - Memory
158.7 - Hotspot
Fans were about 1850 RPM at the high end

Power draw:
GPU: 278.56W
Core PS: 457.14W
PCIe 12+: 43.95W
8Pin 0: 133.47W
8Pin 1: 110.50W

Clock speeds:
Graphics: 1950MHz
Memory: 9502MHz
Video: 1710MHz

And it ran smooth, stable, FPS at 138, no issues.

Pretty happy with this card.
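As a quick sanity check on those power figures, the per-rail readings should roughly add up to the reported board draw (the PCIe slot plus the two 8-pin connectors):

```python
# Per-rail readings quoted above, in watts
pcie_slot = 43.95   # PCIe 12V slot power
pin8_0 = 133.47     # first 8-pin connector
pin8_1 = 110.50     # second 8-pin connector

total_rails = pcie_slot + pin8_0 + pin8_1
print(f"Sum of rails: {total_rails:.2f} W")  # 287.92 W
```

That 287.92W is within a few percent of the 278.56W reported for the GPU, which is about what you’d expect from sensor tolerances, so the monitor wasn’t bugging out on those particular readings.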

I’ve tried the game on 3 different GPUs in the past 2 days.
Gigabyte 3080 Ti 12GB, which got bricked. VRAM maxed.
ASUS 3070 Ti 8GB. VRAM maxed.
MSI RTX 4080 16GB. VRAM maxed.
It pretty much looks like it maxes out the VRAM on every GPU if you use ultra textures.
I wouldn’t say I’m contradicting myself. For example, MSI Afterburner’s fan profile speed depends on the GPU core temperature. However, the core on the RTX 4080 sits at about 30% load when I’ve capped the game to 60 FPS (I always play at 60 FPS, never more), but the VRAM is maxed out at 16GB (100% load). So the VRAM would get hot as hell, but the fans won’t crank up because the core stays cool.
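That’s the failure mode in a nutshell: a fan profile keyed only to core temperature never reacts to hot memory. A hedged sketch of a controller that drives the fans off the hottest sensor instead (the thresholds and temperatures here are made up for illustration):

```python
def target_fan(core_c, mem_c):
    """Pick a fan % from the hottest sensor, not just the core."""
    worst = max(core_c, mem_c)  # the key change vs. a core-only profile
    if worst < 50:
        return 30
    if worst < 70:
        return 50
    if worst < 85:
        return 75
    return 100

# Core idling at 45C but memory cooking at 90C:
print(target_fan(45, 90))   # 100 (a core-only profile would sit at 30)
```

Whether consumer fan controllers can actually be keyed to the memory-junction sensor depends on the card and tool; the point is only that a core-only input is blind to exactly the scenario described above.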

Now, I have the high res packs installed on this machine, but I am not running Ultra settings. Despite that, however, even I see that the memory on mine is maxed out and running hotter than the rest of the card, though not hot enough to kill it yet.

I am in fact running the settings the game assigned by default at launch with the Prefs file.

I made sure there were no old files to pull and cause an issue, and let the game create one.

After that, I only capped the FPS to 150. I have not made any other adjustments to the graphical settings in game.

I should note that while I could run this on a 4K screen, I don’t. I normally game on a 24" Acer Predator monitor with G-Sync, as I like its response better. The 4K display I have is limited to 60FPS (it’s a TV).

I plan to upgrade part way to a 1440P monitor later this year. Just don’t have a need for it right now.

A GPU might not be designed for a certain resolution, but that just means the GPU will perform badly at that resolution; it won’t melt, or at least it shouldn’t. A healthy, well-functioning GPU just won’t melt. It will just suck at that res.

If a gpu is overheating, then it needs cleaning or it is starting to fail/degrade.

Software can’t brick a GPU unless it is specifically designed to have direct control of technical stuff like voltages, frequencies, and fan curves (e.g. MSI Afterburner).

D4 is not designed that way; it is a game, it does not mess directly with that kind of stuff, it just applies a load to the GPU. If a GPU is overheating, then it just needs cleaning, or this is due to bad driver support, a faulty BIOS, or defective hardware.

I agree on everything else you have said. These days most games are developed for consoles and ported hastily over to PC and that’s why most new games are so unoptimized.

Also, as you said yourself, sadly many game developers leave testing to the public.

The players pay full price for a beta and unwillingly become beta testers for the studio, free of charge. A business practice adopted by most major game studios in 2023.

Great times. :slight_smile:

Yeah, overheating on the core is not the issue here. Something that came from D4 made my RTX 3080 Ti melt, like the rest of the Gigabyte 3080s. The card was clean and 10 months old. I always play with pretty aggressive fan profiles, but the truth is that in D4 the card never had more than 50-60% load at any time (again, because of the 60 FPS cap). I had never encountered a game that would crash my PC and make the GPU fans spin at 100% after crashing. I would say the game is not tested and not well optimized, and this is where all the issues are coming from. The memory leak is the culprit.
Btw, what are you using to monitor GPU VRAM temperatures? I can’t see those in MSI Afterburner, at least for the 4080.

So I’d imagine you are one of those guys that believes everything that anyone tells you. That explains a lot.

I apologize for my words, it appears you are a sensitive person among other things.
If you’d like to police some threads, go check out the WoW forums. You will fit in perfectly there.

It’s not the best monitoring solution (as it can have errors sometimes), but for those readings I was using HWMonitor.

Most other tools don’t always show all the individual stats like that one does, but as I said, on some setups it can bug out on a sensor or two. Usually, though, it’s not too far off.
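If HWMonitor flakes on a sensor, `nvidia-smi` is a handy cross-check for the basics, though note it does not expose the memory-junction temperature on consumer cards, which is why people reach for other tools for VRAM temps. A sketch that parses its CSV output; the sample line at the bottom is made up for illustration, not a real reading:

```python
import subprocess

QUERY = "temperature.gpu,utilization.gpu,fan.speed,power.draw"

def read_gpu(sample=None):
    """Parse one line of `nvidia-smi --query-gpu` CSV output.
    Pass `sample` to parse canned text instead of shelling out."""
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"], text=True)
    temp, util, fan, power = (float(x) for x in sample.split(","))
    return {"temp_c": temp, "util_pct": util,
            "fan_pct": fan, "power_w": power}

# Made-up sample line, for illustration only:
print(read_gpu(sample="62, 98, 54, 278.56"))
```

Run it in a loop (or use `nvidia-smi -l 5` directly) and you get a crude poor-man’s Afterburner log for cross-checking what the other tools report.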

I even monitor the PSU. The Corsair AX1200i comes with a USB connection and monitoring software. I can see every single rail and voltage individually, along with amps, wattage, temps, and fan speed; I can set a fan curve for it, and it even has graphing capability.

And my system pulls over 500W on the AC side of the PSU when playing this game. But it’s an i9 12900K, full ATX board, AIO 360, DDR5, and the 3070 Ti. And that draw includes the Predator monitor too.

No, it’s not that. Just that I hear that drum a lot. Gets old after a while.

Actually I’m not. Maybe just getting tired of snark. Been a lot of that on the forum lately.

Oh hell no! There ain’t no way I’m going over there! lol

Sorry if I was a bit over the top. Just been a long couple days of seeing people have legitimate gripes and a lot more just whining over little to nothing, you know?

I guess I should be more understanding. I mean, it’s a $1000+ video card. I’m sure I would be pissed too.

Game on.


I fully dusted and cleaned my PC/GPU a few days before the first open beta, so that was not the issue, my friend. It was the fact that after the patch, I was unexpectedly locked to 4k despite having set all my graphical settings to 1440p prior to the “server slam” patch. This put a load on my GPU so great that it did in fact cause it to literally melt. I explained this many times throughout this thread.

Players shouldn’t have to reassign all their graphical settings after every single patch is deployed. Every open and closed beta I have ever participated in since the late ’90s has saved the player’s graphical settings, regardless of whether a patch was deployed or not. It’s development 101.

Technically speaking, D4 didn’t directly kill my card, but it did indirectly cause it to burn up. Blizzard is still responsible for D4 indirectly killing my GPU.

"UPDATE 2 (June 3, 2023):
After extensive testing, we have found that the issue is related to a fan controller malfunction resulting in the card overheating. Lately we’ve had this behavior occurring randomly in other games as well, which means that it has nothing to do with Diablo 4.

This is not Blizzard’s fault."

Honestly hilarious that you claim the other person is talking nonsense when that original video confirms what he’s saying. Maybe take a peek at the comments section next time? Lmao.