Slow Game Performance RTX 3090 Graphics Card [Solved!]

Aha.

Part of the problem I have with my 7th gen is that I don't actually get the performance I should out of the 3060. And I imagine that gets worse the higher up the card stack I go.

Case in point:

My laptop.

Has an i5 11th gen.
RTX 3050Ti.

And it will crap all over my 7th gen i5 desktop with its 3060. It's not even a contest. It makes my desktop look like it's literally two generations behind on the video card. In fact, I ran a 1660, a 2060, and a 3060 in this system and really saw only small improvements.

Both systems are running 32GB of memory. Both have nvme SSDs. But the laptop just straight up destroys my desktop in performance.

I built my son a 10th gen i3 this past summer for him to play games on. Installed a 970 (cheap card for a kid, but it works well).

Plays every single game nice and fast. I daresay it can play D3 about the same as my desktop. With a lesser card.

Point is.

The newer cards, I think, take better advantage of newer CPU and memory designs, whereas the older motherboards and CPUs actually create more bottlenecks.

Games like D3 show this more than others due to their older code.

But that's only a slightly educated guess. It's not the first time I've heard of this sort of thing happening when someone installs a much newer card than the computer can properly support hardware-wise.

Sure, it works, but will it work optimally?

Game on.

I can buy that argument for why it doesn’t work better.

But why does the 3090 perform worse?

For the reason I just pointed out.

Apparently your board and CPU gen is actually hurting you. Maybe not in everything, but apparently in some areas.

The other possibility is power limit. Do you actually have enough power to push a 3090 on your system?

That doesn’t make sense. CPU bottlenecking is a thing, and the game is most likely CPU bottlenecked.

But that doesn’t explain why a 3060ti outperforms a 3090 on the same system in Diablo III.

Power usage is 30% of TDP when playing Diablo III. I can get total system draw to over 700W while running other games like Crysis Remastered. Running Diablo III is 450W max. Power is not the issue.

As I said, D3 is an old engine. Even with the 64-bit upgrade, it's still a code mess. And the further the hardware gets from the age of the engine, the funkier things get.

Have you actually tried changing the CPU affinity to assign just one core to the game, to see if that changes anything? You would have to do this every time you launch it, but as a test?

Do you know how to do this?
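One way to run that one-core test is the `start /affinity` flag in a Command Prompt, which takes a hexadecimal bitmask of allowed cores. A minimal sketch of how the mask is built (the helper name is mine, and the executable name at the end is an assumption, so adjust it to your install):

```python
# Build the hex bitmask that Windows' "start /affinity" expects.
# Bit N set = the process is allowed to run on logical core N.
def affinity_mask(cores):
    """Return the affinity bitmask for a list of logical core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

print(hex(affinity_mask([0])))            # core 0 only  -> 0x1
print(hex(affinity_mask([0, 1, 2, 3])))   # cores 0 to 3 -> 0xf
```

A one-core test launch from the game folder would then look something like `start /affinity 1 "" "Diablo III64.exe"` (exe name assumed here).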

Yeah I know how to do that, doesn’t change anything.

It has to do with how the RTX 3090 is handling clocks. Generationally, I'm sure the GTX 1080 Ti outperforming these 30-series cards comes down to driver optimization I will never be able to overcome, but the 3060 Ti beating the 3090 has to be because the 3090 will not boost while playing DIII unless everything is maxed. I'm sure I will run into this with other games as well.

Kinda sucks having a high-end card when it's too much for some games and won't run them properly. Sad, really.

My issue with this is that you have to run the card like you're playing something like Cyberpunk when you're only playing something like this. Stressing it when you shouldn't have to.

Well, that's just it! You save up and get this kick *** piece of hardware only to find it performs worse in the game you play the most…

Maybe it’s a sign. I should finally stop playing diablo III and do something else. Doom Eternal at locked 144FPS is great!

Have you tried turning off the DLSS and RTX features? (Basically, turn off the newer eye-candy enhancements.) For giggles? Dumb the card down, basically? You can set a custom profile for D3, of course.

If I had one of these, I would experiment. lol

My last 80-series card was a 980 Strix, and it's still in the 2nd gen i7 box I installed it in, running Windows 7. LOL

I actually regret selling my GTX 980. I liked that card. I'm keeping my GTX 1080 Ti; I look at it as sort of the pinnacle of the 2010s, and it handles anything from that decade well.

I figured out a workaround on the boost clock speed.

Setting the NVIDIA setting to max performance for the Diablo exe makes sure the card doesn't "idle," but it still won't boost.

Using MSI Afterburner, you can go into the voltage/frequency curve editor and lock the 3090 to 1.1V and 1980 MHz on the core.

Details can be found by googling “How to force max voltage & curve overclock with msi afterburner” and finding the EVGA forum post of that title.

Also can be done using the boost lock button in EVGA’s Precision X1 software.

Just make sure not to leave the card in this state; it keeps it at 1.1V and max boost clock no matter what you're doing, which can be hard on the card, and at best wastes a bunch of electricity. Mine is liquid cooled, so I'm not worried at all about keeping FPS up in older games like this.

Still only 50% power usage, but DIII runs 200-300 FPS again! Now it only dips below 144 FPS in very crowded maps.

EDIT: Also, I just installed the PTR and it runs quite a bit better than live. So I'm going to try reinstalling DIII as well.

Not a bad idea. D3 can get muddled after a while; its files can need a cleanup after a span of time and updates.

Good that you found a way around it, but now that you are doing that, are you going to go back to locking it to 144 rather than letting it overrun? Having the game run at a higher FPS than your monitor's refresh rate is still a waste of power. It may not hurt to cap it; that would probably tax the card even less while in that state.

just a thought.

Lower the HardwareClass setting in the ini file in Documents from the default 4 to 2, or to 1 if you can handle it visually.
I personally play at 1 because I want as many frames as possible out of that old, nastily coded engine.
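For reference, that setting lives in D3Prefs.txt (typically under Documents\Diablo III; the exact path is from memory, so double-check on your machine). The relevant line would look something like:

```ini
HardwareClass "1"
```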

I know you said you don't think it is, but what is your power mode set to?

While your other games can push 700W, Diablo 3 is more CPU-intensive than GPU-intensive, and the other games might be drawing more of that from the GPU compared to Diablo 3. Perhaps Windows treats them differently in terms of your power plan.

Bunz, are you running Discord by chance? There is currently a known bug where Discord will actually lower your GPU’s clock speed by ~200 MHz.

Oh yeah, I forgot about that bug. I thought that only affected 4080 and 4090 though? Or was it both generations (30 and 40 series)?

It's effectively all NVIDIA hardware. That's why you had to go create an additional profile with some not-so-simple steps involving a (now ancient) SLI profile tool. For once the NVIDIA drivers aren't at fault, though I wish they were, since those can be cycled through to find a "good" one if need be. Can't really do that here with Discord. Hopefully a Discord update comes out ASAP to fix this.

Nope.

High performance. It’s not a power issue.

The card draws 420-450W (500W power limit on the 3090 FTW3 Ultra) in other games (like Doom Eternal and Crysis Remastered), and I can push the GPU 100 MHz higher than stock, no problem.

Without any tweaks, DIII makes the card draw 140W; setting max performance in the NVIDIA Control Panel bumps that up to 170W; and boost locking in EVGA Precision X1 (which actually gets the card to clock where it's supposed to be while gaming) brings the draw to around 220W.

With a 3060ti the GPU utilization is high enough that boost clocks are engaged. With the 3090 the utilization is just simply not high enough and the card stays in an “idle” state without doing the aforementioned tweaks.

At least with Diablo III, and probably a number of other older titles.

Boost lock works well though, and I notice all the safeties are still in place. The GPU downclocks under various conditions like it's supposed to. It will use more electricity, though.

The only thing is, I wish NVIDIA had this built into the drivers so you could set it per application instead of having to do it manually. Accidentally leaving boost lock on is crappy; the card goes from ~40W idle to ~150W idle.
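For what it's worth, recent NVIDIA drivers do expose a scriptable clock lock through the `nvidia-smi` command-line tool, though it's still global rather than per application. A rough sketch (run from an elevated prompt; the 1980 MHz figure just mirrors the Afterburner lock, and GeForce support for these flags depends on driver version):

```shell
# Pin the GPU core clock (min,max in MHz) so it can't drop to idle clocks
nvidia-smi --lock-gpu-clocks=1980,1980

# ...play...

# Release the lock so the card can idle normally again
nvidia-smi --reset-gpu-clocks
```

Same downside as boost lock applies: forgetting to reset it leaves the card burning power at idle.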

Nobody can afford the graphics cards anyway…

Many still play with graphics cards from 2017-2019, so to speak three generations behind: the GTX 10 series…

Blizz should not assume that the whole world is now buying the latest 40-series generation or an XTX from Radeon.

Both carry astronomical prices. On top of that, the former draw several times more power, and the Radeon cards have annoying coil whine and lag behind in performance and ray tracing.

Even the last generation, NVIDIA's 30 series, still carries an astronomical price. The smallest variant, the RTX 3060, is as expensive as NVIDIA's current best cards should actually cost… Then things would be back on track.

When I jumped from an intel 6th gen to a 9th gen, my FPS felt smoother and overall higher.
I agree D3 is heavily CPU bound.
Jumping to Win 11 helped, since it spreads older single-threaded games across multi-core CPUs better.
Did you try changing your ini settings?