I now wonder at what point FPS just becomes a number to brag about, when a player won't notice a difference between 120 fps and 12 billion FPS because the human brain is capped at viewing 720p at 150 FPS. (Unless you jam a USB-C cable into your ear and disable Brain Clutter and Rachael Ray Traced Shadows)
Personally I noticed between 60fps and 120fps. Not as significant as 30 to 60, but definitely smoother. If I go up to 180 (the max refresh rate of my monitor) I barely notice a difference, so I've just been capping games at 120fps.
When FPS is higher than the server tick rate, it doesn't really make much difference; the benefit is drastically reduced. If the server tick rate is 60, for example, anything above 60 fps sees diminishing returns, because your client has to simulate (interpolate or predict) the data it doesn't have yet. Also remember your monitor needs a matching refresh rate or higher to see the difference at all.
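To put rough numbers on that (just a sketch with an assumed 60 Hz tick rate, not any specific game's actual netcode):

```python
# Rough sketch of the diminishing-returns point above. Assumed numbers:
# a 60 Hz server tick rate and a handful of client frame rates.
def frames_per_server_update(fps: float, tick_rate: float) -> float:
    """How many frames the client renders per authoritative server update."""
    return fps / tick_rate

TICK_RATE = 60  # assumed server tick rate in Hz

for fps in (30, 60, 120, 180, 240):
    frames = frames_per_server_update(fps, TICK_RATE)
    print(f"{fps:>3} fps @ {TICK_RATE} Hz tick -> "
          f"{frames:.1f} frames per server update "
          f"({max(frames - 1, 0):.1f} of them interpolated/predicted)")
```

Past the tick rate, every extra frame is the client smoothing between two server states rather than showing genuinely new information, which is why it looks smoother but doesn't buy much else.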
Input might be very, very slightly delayed at 120fps (not perceived as delay, since it would be so minimal, but it could be measured with software) if you respond to a frame that isn't picked up on the lower tick rate, but the game would still look smoother.
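A back-of-the-envelope way to see why that delay is measurable but tiny (a toy model with assumed numbers: input is sampled once per rendered frame and only applied on the next server tick):

```python
# Toy latency model: an input lands at a random moment, waits up to one
# frame interval to be sampled by the client, then up to one tick
# interval to be applied by the server. Only the first part shrinks as
# fps goes up; the tick rate sets a floor you can't render your way past.
def worst_case_input_delay_ms(fps: float, tick_rate: float) -> float:
    frame_ms = 1000.0 / fps
    tick_ms = 1000.0 / tick_rate
    return frame_ms + tick_ms  # upper bound under this simple model

for fps in (60, 120, 180):
    print(f"{fps:>3} fps vs a 60 Hz tick: worst case ~"
          f"{worst_case_input_delay_ms(fps, 60):.1f} ms")
```

Under this model, going from 60 to 120 fps trims the worst case from about 33 ms to about 25 ms, which software can measure but most people won't consciously feel.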
I definitely feel like 120 is the sweet spot. I run at 60 on my main screen because it's a large 4K TV and I prioritize visual detail over frames in WoW, but it would sure be nice to be able to push 120. It's quite nice on my other devices with smaller displays.
Diablo II: Resurrected got DLSS since the last time I played; I just booted it up to test and it works well. It lets me max the game out at a 1440p/120fps lock in the areas I tested (except when running around Harrogath, which I remembered being the most demanding area before). It sits around 90fps in those areas without DLSS. And that game does not have a monthly sub to keep on top of things.