NVIDIA DLDSR and World of Warcraft

NVIDIA recently made DLDSR available to users of GeForce RTX cards. It does the same thing as regular DSR, supersampling, or setting the in-game “resolution scale” above 100%, but it uses AI algorithms running on the onboard Tensor cores to dramatically increase the quality.

There is a lot of confusion going around about this right now, because many people have the idea that DLDSR can improve performance over regular DSR or a >100% resolution scale. In reality, at the same resolution scale / DSR factor, performance with DLDSR is no better than with regular DSR - but the image quality is better.

Because DLDSR’s quality is better than that of regular DSR or the in-game resolution scale, in many cases you can get away with using a lower DLDSR factor.

So, for example, while 2.25x DSR and 2.25x DLDSR perform exactly the same, 2.25x DLDSR will look better. Because it looks better, you may be able to drop to a lower factor: if you were previously using 2.25x DSR, 1.78x DLDSR (a lower render resolution) might give you the same quality or better. Being able to use a lower DSR/DLDSR factor - i.e., a lower render resolution - thanks to DLDSR’s increased quality is where any potential performance gain would come from.
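If the factor math is confusing: a DSR/DLDSR factor multiplies the total pixel *count*, so each axis scales by the square root of the factor. Here’s a quick illustrative Python sketch of that arithmetic (the helper name is mine, and NVIDIA rounds its advertised resolutions slightly differently):

```python
import math

def render_resolution(native_w, native_h, dsr_factor, scale_pct=100):
    # A DSR/DLDSR factor multiplies the total pixel COUNT, so each
    # axis scales by sqrt(factor); the in-game resolution scale is a
    # linear per-axis percentage applied on top of that.
    per_axis = math.sqrt(dsr_factor) * (scale_pct / 100)
    return round(native_w * per_axis), round(native_h * per_axis)

print(render_resolution(2560, 1440, 1.78))  # ~3415x1921; NVIDIA lists 3413x1920,
                                            # since 1.78x is really (4/3)^2
print(render_resolution(2560, 1440, 2.25))  # 3840x2160, i.e. 4K (1.5x per axis)
```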

My results in World of Warcraft so far have been very good. World of Warcraft has its own built-in resolution scale, which I usually run at 133% on my 1440p monitor - that renders at 3413x1920, basically the same as 1.78x DSR. I tried 1.78x DLDSR (3413x1920) instead, leaving the in-game resolution scale at 100%; performance was about the same as before, but it looked better. I then tried a DLDSR factor of 2.25x (4K), but the performance hit was too noticeable.

Then I experimented with using the game’s built-in resolution scale in conjunction with DLDSR. With DLDSR set to 2.25x (4K) and the in-game resolution scale at 67%, the game is actually rendering at my native resolution of 1440p while still going through DLDSR, and there was almost no performance hit over native 1440p while looking a whole lot better. I’m genuinely surprised that combining the two worked so well - it gives me the option of using DLDSR at my native render resolution and thereby maintaining performance.
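To see why 2.25x DLDSR plus 67% in-game scale lands back at native: 2.25x scales each axis by 1.5, and 67% is almost exactly 2/3, so the two nearly cancel out. A self-contained sketch of that arithmetic (numbers are just my 1440p setup, for illustration):

```python
import math

native_w, native_h = 2560, 1440      # 1440p monitor
per_axis = math.sqrt(2.25)           # 2.25x DLDSR -> 1.5x per axis
desk_w = native_w * per_axis         # 3840: the "4K" desktop width
desk_h = native_h * per_axis         # 2160: the "4K" desktop height
game_w = round(desk_w * 0.67)        # 67% in-game resolution scale
game_h = round(desk_h * 0.67)
print(game_w, game_h)                # 2573x1447 - essentially native 1440p
```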

The top picture link is native 1440p, no DLDSR, 100% in-game resolution scale.
The bottom picture link is 2.25x DLDSR (4K) combined with 67% in-game resolution scale (bringing the render resolution back down to 1440p).

Just look at how much easier it is to read the nameplates, and how much clearer the rock face in the distance is. GPU temps did seem to increase - I assume because the Tensor cores are actually being used.


How did you get DLDSR to work in WoW? It works for almost every other game I have, but since WoW doesn’t have an exclusive ‘fullscreen’ mode, it doesn’t work - the resolution is still stuck at 1440p.

You’d have to adjust your desktop resolution, the same way you would with any other game that lacks an exclusive fullscreen mode. DLDSR actually works fairly well for general desktop use, so the only real consideration is that you may want to enable display scaling in Windows so everything doesn’t get smaller.

I got it to work with Dota; I imagine it will work if you set your desktop resolution to the DSR super resolution from the NVIDIA Control Panel.

The new technology constantly coming out of NVIDIA and AMD excites me. I am so happy Blizzard keeps up to date with it and brings it into WoW.

Really cool.

I found that this feature works pretty well with World of Warcraft. I can get the crisp look of a 4K-resolution UI while still rendering at 1440p (because of my garbage card, a GTX 1660) and maintain the same FPS while the game looks better overall. For the coding craze that is this game, I’m glad they have these settings.

Eh, I dunno man. My 3070 runs at 4K and looks better native. I suppose it’s a good stopgap till people get a more powerful monitor.

If you’re running below your native resolution, you’re not using DLDSR. If you don’t have Tensor cores (i.e., anything without the RTX designation), you don’t even have the hardware to use DLDSR. And if you’re exclusively using an in-game scaling setting, you’re not using DLDSR. You have to explicitly enable the feature in the NVIDIA Control Panel and then manually increase your desktop resolution above native in order to use it.

With that said, check the FidelityFX (FSR) scaling option in the client. It’s not going to be as good as native, but it beats bilinear and bicubic hands-down - even at smaller scaling factors.

DLDSR, on its own, does not allow you to render below your native resolution - you are correct. But maybe you should have actually read the entire post.

The key here is that the in-game resolution scale is completely independent of NVIDIA DLDSR. Using one does not preclude, or restrict, use of the other.

In my case, with my 1440p monitor, I’m running DLDSR 2.25x (4K) combined with a 67% in-game resolution scale, so the game actually renders at my native resolution (1440p). While this might seem counter-intuitive, it results in amazing anti-aliasing of transparent textures (such as nameplates) in ways that CMAA and FXAA can’t match, and better than MSAA can manage, all while making use of the Tensor cores.

I’ve been using it for a month at this point and it works fantastically. I have an RTX 2080.


Good grief, man… I did read your post. Now read mine, including the little reply indicator in the top right. Look at who I was replying to: the guy with the GTX 1660, who is using the in-game scaler to render 1440p onto a 2160p screen.

If you go back to January 27, you’ll see I already replied to someone else explaining how to get it to work in WoW. I know this works; I never claimed that you didn’t get it working (though you never actually outlined how to do it, hence the initial question), nor did I take from your original post that you were doing anything other than using the in-game scaling facilities.