Ray tracing not running well

My game runs fine at max settings (minus ultra textures, since I'm not at 4K) at 160 FPS on my 1440p 165 Hz monitor. However, when I turn on ray tracing, even at its lowest setting, everything becomes a blurry mess. It's super noticeable on text shown on screen, like items or player names.

I would think my system is strong enough to run the game on max settings, but could I be wrong? What am I doing wrong, or what have I set wrong?

Win 11
AMD 5800X
32 GB memory running the default OC profile at 3600
RTX 4090

Thank you for the help

Since they disabled Frame Gen in the game, enabling Ray Tracing, even on a 4090, will cause you to see low FPS.

I have a 4080 Super FE, and I had to turn off Foliage and Effects RT and set the sliders to low for the first two. But I'm still using DLSS set to DLAA at 100 FPS at 1440p.

It's playable for now though. But before they took Frame Gen offline, I was hitting 100 FPS easily and had every setting maxed.

Ugh.

Nah, not for me. I decided to turn RT completely off and enjoy the game without any stutters or FPS drops at a locked 138 (Reflex on) :slight_smile:

Sounds like it's not even possible to run it properly? Lol, if a 4090 can't do it… what can? Is it CPU-bound?

I think the Ray Tracing has some issues with how it interacts with things.

If it's CPU-dependent, then an i9 processor and DDR5 aren't enough to make it work well either. (12900K here)

And with a top-tier GPU not cutting the mustard either, I'm putting the blame directly on the code work and implementation; whether it's Blizzard's or Nvidia's fault, who's to say.

I will say this:

I played for over 2 hours last night with no issues at all. Played great, smooth, no problems. The only settings I have lowered are the RT ones I mentioned above. Everything else is maxed/Ultra, and it stayed at 100 FPS at 1440p just fine.

Even when playing with multiple players in boss fights, in towns, etc. No issues.

I could literally see other players jumping around the map like they were lagging, but nothing on my end indicated I was having any issues: no lag, and latency was sub-70 ms.

It depends; in some areas like big cities you are very CPU-limited and the GPU load drops.

No current CPU can do high refresh rates with RT on in D4 without the help of Frame Gen.

Maybe a 9800X3D will barely be able to, but I doubt it.

Blizzard's ray tracing implementations so far (WoW and D4) have both been very CPU-limited and bad in general.
I'm not even using RT shadows in WoW with my 4090 because it just worsens the already severe CPU bottleneck in places with many players, or even in raids.
Also there is no FG to help.
But even then the frametimes spike all the time; it's just not a good experience.

For example, at a mass player event like the Superbloom in WoW DF, with RT shadows on, I can drop to the 30s with a 5800X3D and a 4090, with the GPU almost idle because I'm CPU-limited :smiley: It's just ridiculous.
Turning RT off in that scenario increased the FPS by 10-20 (GPU still bored to death).

The only problem I have with this issue being tied to the CPU is that there doesn't seem to be much load on the CPU either. At least in my case, I never hear it ramp up or heat up, and the usage/power consumption percentages aren't all that high when monitoring it.

So… not sure about that either. I run 5600 DDR5 memory, I have a high-speed SSD where the game is installed, etc.

Obviously something is causing a bottleneck, but I’ll be damned if I know what exactly that is. And having frame gen disabled definitely has an impact.

I personally feel it's just not coded efficiently or properly or something, or there is a translation issue between the driver layer and the game layer that isn't meshing the way it should.

We should have the ability to run this game on mid/high-tier components with most settings enabled. When you have a top-tier system with the best GPU installed and still can't run the game at its maximum settings with reasonable performance results, there is a problem.

When Cyberpunk with path tracing runs smoother than D4 with RT, then you know something is wrong :smiley:

CP2077 was designed with Nvidia as a ray tracing/path tracing showcase. I think ray tracing was tacked on later in D4's development cycle. I'm actually excited it's in the game; I'm sure they can improve the performance over time and give us some really great-looking scenes. 4K with ray tracing maxed near the Tree of Whispers looks phenomenal.

Actually, D4 already had RT in the closed beta (it didn't work properly though) and was supposed to get it "shortly" after launch (which obviously didn't happen, most likely because of other severe engine performance problems like the VRAM/texture memory leak).

Maybe it will have a bigger impact in the first expansion, but right now you literally have to compare screenshots back and forth in most places to see a (small) difference.
It's not worth the performance impact, and that's coming from a 4090 owner.
But I agree, they definitely could improve the performance of RT, at least the CPU-bound part.

I just compared it with RT on high and off, and even took screenshots.
I cannot see a difference except the shadows being a little softer.
Reflections in the water look really bad even with RT on high; they're so low-res.

I guess I feel better knowing it's just poorly implemented and it's not my PC.

Heh. The latest update still has the problem where textures are dumped into virtual memory instead of system RAM when being swapped, so the game eventually hits an "out of memory" error even though there's plenty of system RAM available. I go look at the Fenris logs and they show a virtual memory error. VM shouldn't even be touched unless system RAM is running out.

And now you know why I had to actually use a swap file after upgrading to the 4080S, whereas when I had the 3070 Ti I didn't even have a swap file active…

It's the only game I've played where this was a problem. At least in my case, I can play without issues after the adjustments I mentioned.

But considering the huge majority of games out there don't require you to monkey with the settings to get them just right, Diablo IV has some seriously finicky operational requirements.

Not to mention it's overly sensitive to internet connection reliability too.

It's no wonder so many people have problems.
