So I thought I would check out the ray tracing and see how it is. First of all, I run a beast computer:
i9-13900K
ASUS TUF RTX 4090
64 GB DDR5
I can play any game I have come across at 4K Ultra and hit 100 FPS minimum (maybe 1 or 2 titles at 80-90 FPS). MW3 at Ultra is over 200 FPS. I turned ray tracing on in this game and it can barely maintain 50-60 FPS in combat, and I've seen drops as low as 37 in a dungeon by myself. What gives? In addition, I see almost no difference in appearance in multiple side-by-side screenshots I have taken. Not impressed at all. I think some serious optimization needs to be done.
I agree the impact is minimal… so I just don't care.
The baked-in lighting was obviously so "good" that RT isn't really needed, at all.
i9-13900K
Zotac 4090
128 GB DDR5
4k DLSS on - Balanced - Frame Gen On
All options ultra/max
Ray Tracing - Max
Sat at 120 fps through a 96 vault solo.
5800X and 4090 Strix. I was getting 40-70 FPS when I bumped all of the ambient occlusion and RT settings to max, but then I changed my resolution scale back down from 150% to 100% and get 120+ frames consistently. I only use frame generation; I don't use DLSS.
I don't use frame gen for very simple reasons: first, I built a high-end machine so that I wouldn't have to rely on fake frames. Second, FG induces input lag, and the "fake frames" can show something that isn't there and vice versa.
In addition, don't you think it's just a bit sad that you, with that beast of a machine, need artificially created frames to run smoothly?
Also, 200+ FPS in MW3 at Ultra 4K multiplayer with ZERO FG. Something is clearly wrong here.
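To make the "shows something that isn't there" point concrete: real DLSS frame generation uses motion vectors and optical flow, not a plain blend, but even a toy blend of two made-up 4x4 "frames" shows how a generated frame can contain content neither real frame had:

```python
# Toy illustration only: actual DLSS 3 frame generation uses motion vectors and
# optical flow, not a plain average. This just shows that an interpolated frame
# can contain pixel values that never appeared in either rendered frame.
import numpy as np

frame_a = np.zeros((4, 4))          # rendered frame N: dark scene
frame_a[1, 1] = 1.0                 # a bright pixel (say, a projectile) at (1, 1)

frame_b = np.zeros((4, 4))          # rendered frame N+1: the projectile has moved
frame_b[1, 3] = 1.0                 # now it's at (1, 3)

generated = 0.5 * (frame_a + frame_b)   # naive "in-between" frame

print(generated[1, 1], generated[1, 3])  # 0.5 and 0.5: two half-bright ghosts,
                                         # neither of which was ever rendered
```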
I don't even use the 4K monitor for playing. I have a bunch of G7 240 Hz monitors I typically use. Do I find it weird? No. It's the price of 4K.
I typically use the G8 4K monitor for work, but saw this and figured I'd test a bit.
lol @ input lag in Diablo. Show me where it matters. This isn't an FPS. You could delay everything by 150 ms and it wouldn't be off-putting, and you're adding less than 50 ms.
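For scale, here's the rough math, assuming interpolation-based frame gen holds back one rendered frame (the actual overhead varies by game, driver, and whether Reflex is on):

```python
# Back-of-the-envelope added latency from interpolation-based frame generation.
# Assumption: the generator holds one rendered frame so it can interpolate
# between frames N and N+1, which costs roughly one base frame time of delay.

def added_latency_ms(base_fps: float) -> float:
    """Approximate extra latency (ms) from holding back one rendered frame."""
    return 1000.0 / base_fps

for fps in (60, 80, 120):
    print(f"{fps} FPS base render -> roughly +{added_latency_ms(fps):.1f} ms latency")
# 60 FPS  -> ~ +16.7 ms
# 80 FPS  -> ~ +12.5 ms
# 120 FPS -> ~ +8.3 ms
# All well under the ~50 ms figure above.
```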
Nothing wrong, it's a completely different sort of game.
Your framerate is also dependent on your class. Play a sorc, and it will crush your FPS.
Perhaps with the 5090 you'll get the kind of performance you desire.
I didn't see enough of a difference and turned it off myself.
That's what ray tracing does. It was always a scammy, overhyped tech to me. The best are the RTX Remix videos on YouTube bringing 20-year-old games down to 40 FPS on high-end PCs.
Raytracing is probably the most overrated technological evolution in years.
Ray tracing is a meme. Go look at how insane Horizon Forbidden West looks on PC, even without it.
I would have to agree for the most part. I actually can't wait to try that. The other one looked amazing and I heard this one was great too.
So if a top-spec rig can't maintain high frames with RT turned on, what hope is there for the rest of us? I can only assume Blizzard implemented it now in order to test it/gather data prior to a proper implementation in the expansion.
Ray tracing in Cyberpunk yields some quite nice results!
In D4 though? Meh…
Though, from the designer's point of view, it might help them reduce asset size, since lighting doesn't need to be precalculated or something…
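That's roughly the trade-off: baked lightmaps have to be stored and shipped with the game, while ray-traced lighting is computed per frame instead. A made-up back-of-the-envelope (these numbers are invented, not Blizzard's actual asset sizes):

```python
# Hypothetical sketch of the trade-off, not Blizzard's actual pipeline or data.
# Baked lighting: precomputed lightmap textures ship on disk and get sampled.
# Ray-traced lighting: no baked data, but the cost moves to per-frame GPU work.

LEVELS = 150                 # invented number of zones/dungeons
LIGHTMAP_MB_PER_LEVEL = 40   # invented average size of baked lightmaps per zone

def baked_asset_cost_mb() -> int:
    # Disk/download cost: lightmaps for every zone must be stored and shipped.
    return LEVELS * LIGHTMAP_MB_PER_LEVEL

def ray_traced_asset_cost_mb() -> int:
    # No precomputed lightmaps shipped; lighting is resolved at run time.
    return 0

print(f"baked: ~{baked_asset_cost_mb()} MB of lightmap data shipped")
print(f"ray traced: ~{ray_traced_asset_cost_mb()} MB shipped, paid per frame instead")
```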
DLSS and FSR.
These top end rigs are not using those, if they say they are struggling for frames.
And they shouldn't have to. Frame gen is designed so that non-high-end machines can cope better with newer graphics.
Yes that was his question that I answered.
I don't turn those on.
Sorry I was actually agreeing with you lol
All frames are fake by definition. All frames are based on calculations using various means of "cheating" to lower the processing load, and you arbitrarily complain about one way of "cheating" without presenting any technical arguments to support your case.
I'd say that D4 is in fact a showcase for FG's usefulness on high-end machines, since the game is heavily CPU-limited and FG completely removes that limitation.
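Rough illustration with invented numbers (the real CPU cap depends on your CPU, the zone, and how many players are on screen): when the game is CPU-bound the GPU sits partly idle, and generated frames need no CPU work, so FG can roughly double the presented frame rate.

```python
# Illustrative numbers only; the real CPU cap in D4 varies by CPU and zone.
# The point: generated frames cost no CPU time, so a CPU-bound game can
# present roughly twice as many frames with frame generation enabled.

cpu_limit_fps = 90    # assumed: fastest the CPU can prepare frames in a busy zone
gpu_limit_fps = 200   # assumed: fastest the GPU could render those frames

rendered_fps = min(cpu_limit_fps, gpu_limit_fps)   # without FG: stuck at the CPU cap
presented_with_fg = rendered_fps * 2               # FG inserts one generated frame
                                                   # per rendered frame (2x upper bound)

print(f"without FG: {rendered_fps} FPS (CPU-bound, GPU partly idle)")
print(f"with FG:    up to {presented_with_fg} FPS presented")
```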
A high-end machine, especially one running a 14900K/13900K, should NEVER have to rely on FG, and the idea of using it as a "showcase" for high-end machines is the dumbest thing I have heard. More so since the whole idea of FG is to benefit LOW-END machines which otherwise would not be able to generate the frames.
The point you're still not getting is that even the highest-end PCs can and will be bottlenecked by the latest and greatest software. Always been like that. Probably always will be.
The fact that you think it "shouldn't" be like that is nothing but an emotion.