Ray tracing makes no difference and kills computer

So I thought I would check out the ray tracing and see how it is. First of all, I run a beast of a computer:
i9-13900K
ASUS TUF RTX 4090
64 GB DDR5
I can play any game I have come across at 4K Ultra and hit 100 FPS minimum (maybe one or two titles at 80-90 FPS). MW3 at Ultra runs over 200 FPS. I turned ray tracing on in this game and it can barely maintain 50-60 FPS in combat; I have seen drops as low as 37 in a dungeon by myself. What gives? On top of that, I see almost no difference in appearance in multiple side-by-side screenshots I have taken. Not impressed at all. I think some serious optimization needs to be done.

4 Likes

I agree the impact is minimal… so I just don’t care.

The baked in lighting was obviously so ‘good’ that RT isn’t really needed, at all.

2 Likes

i9-13900K
Zotac 4090
128 GB DDR5

4k DLSS on - Balanced - Frame Gen On
All options ultra/max
Ray Tracing - Max

Sat at 120 fps through a 96 vault solo.

5800X and 4090 Strix here. I was getting 40-70 FPS when I bumped all of the ambient occlusion and RT settings to max, but then I changed my resolution scale back down from 150% to 100% and get 120+ frames consistently. I only use frame generation; I don't use DLSS.
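For context on why dropping the render scale helped so much: a quick pixel-count sketch. The numbers below assume a 4K base resolution and the usual per-axis scale multiplier; nothing here is specific to D4's implementation.

```python
# Back-of-the-envelope pixel counts for a render-scale change.
# A 150% scale multiplies EACH axis by 1.5, so pixel count grows 2.25x.
BASE_W, BASE_H = 3840, 2160  # assumed 4K base resolution

def rendered_pixels(scale_pct: int) -> int:
    """Pixels the GPU must shade at a given resolution scale (percent)."""
    s = scale_pct / 100
    return int(BASE_W * s) * int(BASE_H * s)

native = rendered_pixels(100)  # 8,294,400 pixels
scaled = rendered_pixels(150)  # 18,662,400 pixels
print(scaled / native)         # 2.25
```

So at 150% scale the GPU was shading 2.25x the pixels of native 4K, which lines up with a roughly halved frame rate.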

I don't use frame gen for two very simple reasons. First, I built a high-end machine so that I wouldn't have to rely on fake frames. Second, FG induces input lag, and the "fake frames" can show something that isn't there and vice versa.

In addition, don't you think it's just a bit sad that you, with that beast of a machine, need artificially created frames to run smoothly?

Also, I get 200+ FPS in MW3 multiplayer at Ultra 4K with ZERO FG. Something is clearly wrong here.

1 Like

I don't even use the 4K monitor for playing. I have a bunch of G7 240 Hz monitors I typically use. Do I find it weird? No. It's the price of 4K.
I typically use the G8 4K monitor for work, but saw this thread and figured I'd test a bit.

lol @ input lag in Diablo. Show me where it matters; this isn't an FPS. You could delay everything by 150 ms and it wouldn't be off-putting, and you are adding less than 50 ms.
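To put rough numbers on that claim: interpolation-based frame generation typically holds back one rendered frame before presenting, so a simple model of the added delay is one base-frame interval. This is an assumed toy model, not measured data for any specific game or driver.

```python
def added_latency_ms(base_fps: float) -> float:
    # Toy model (an assumption, not a measurement): interpolation-based
    # frame generation buffers one rendered frame, so it adds roughly
    # one base-frame time of extra latency.
    return 1000.0 / base_fps

print(round(added_latency_ms(60), 1))  # 16.7 ms at a 60 fps render rate
```

Even at a 20 fps render rate the model gives 50 ms of added delay, which is where the "less than 50 ms" figure above comes from for any playable frame rate.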

Nothing wrong, it’s a completely different sort of game.
Your framerate is also dependent on your class. Play a sorc, and it will crush your FPS.

Perhaps with the 5090 you’ll get the kind of performance you desire.

I didn't see enough of a difference and turned it off myself.

2 Likes

That's what ray tracing does. It was always scammy, overhyped tech for me. The best are the RTX Remix videos on YouTube bringing 20-year-old games down to 40 FPS on high-end PCs.

1 Like

Raytracing is probably the most overrated technological evolution in years.

5 Likes

Ray tracing is a meme. Go look at how insane Horizon Forbidden West looks on PC, even without it.

2 Likes

I would have to agree for the most part. I actually can't wait to try that. The other one looked amazing, and I heard this one was great too.

So if a top spec rig can’t maintain high frames with RT turned on what hope is there for the rest of us? I can only assume Blizzard implemented it now in order to test it/gather data prior to a proper implementation in the expansion.

1 Like

Ray tracing in Cyberpunk yields some quite nice results! :slight_smile:

In D4 though? Meh…
Though, from a designer's point of view, it might help them reduce asset size, since lighting doesn't need to be precalculated or something…

3 Likes

DLSS and FSR.
These top-end rigs are not using those if they say they are struggling for frames.

1 Like

And they shouldn't have to. Frame gen is designed for machines that are not high-end, so they can cope better with newer graphics.

1 Like

Yes that was his question that I answered.

I don’t turn those on.

Sorry I was actually agreeing with you lol

1 Like

All frames are "fake" by definition. All frames are based on calculations that use various means of "cheating" to lower the processing load, and you arbitrarily complain about one way of "cheating" without presenting any technical arguments to support your case.

I'd say that D4 is in fact a showcase for FG's usefulness on high-end machines, due to it being heavily CPU-limited and FG completely removing that limitation.
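The CPU-limited argument can be sketched with a toy throughput model. All numbers are hypothetical, and real frame generation is messier than a clean 2x, but it shows why FG helps even when the GPU has headroom.

```python
def output_fps(cpu_limit_fps: float, gpu_limit_fps: float,
               frame_gen: bool = False) -> float:
    # The rendered rate is capped by the slower of the CPU and GPU.
    rendered = min(cpu_limit_fps, gpu_limit_fps)
    # Interpolating one generated frame per rendered frame roughly
    # doubles the presented rate without touching the CPU cap.
    return rendered * 2 if frame_gen else rendered

print(output_fps(60, 150))                  # 60: CPU-bound, GPU idle headroom
print(output_fps(60, 150, frame_gen=True))  # 120: FG bypasses the CPU cap
```

In a GPU-bound game the same model shows FG helping much less in practice, since the doubled presented rate comes at the cost of the interpolation work itself.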

A high-end machine, especially one running a 14900K/13900K, should NEVER have to rely on FG, and the idea of using it as a "showcase" for high-end machines is the dumbest thing I have heard. More so since the whole idea of FG is to benefit LOW-END machines which otherwise would not be able to generate the frames.

The point you’re still not getting is that even the highest-end PCs can and will be bottlenecked by the latest and greatest software. Always been like that. Probably always will be.

The fact that you think it “shouldn’t” be like that is nothing but an emotion.

2 Likes