[LONG RAMBLING INTRODUCTION STARTS HERE]
Like a lot of people who watched the battle(non)sense video a while back, I’ve sort of been lost on the right way to set Overwatch up these days. I’m the type who has a bad habit of changing little settings back and forth if I can’t see a clear difference right away (optometrist visits are the worst).
Still, I was sure there had to be a difference, however small, between NULL settings and reduce buffering on/off in game. But until recently it was more of a nagging feeling (or something to obsess over when I’m playing bad) and I couldn’t ever rule out placebo.
While messing around with frametime logging in RTSS, I decided it might be worth testing in Overwatch to see if I could find any quantitative difference between some of these settings to account for my placebo fever.
It turned out to give me consistent enough results between various setups that my inner armchair scientist put on its pretend lab coat and turned it into a little mini-experiment, in the hopes that somebody else might want to follow up on it, or at least find the data interesting.
My setup is pretty high-end: a wired Logitech G Pro (Hero sensor) for the mouse, an EVGA RTX 2080 Ti Ultra for graphics, and an HP Omen X 25f as my monitor. I keep background services/apps to an absolute minimum, I use a little tool called “Set Timer Resolution” that brings the OS timer down to 0.5ms, and MouseTester shows 90-95% of my mouse’s polling deltas landing in the 1-1.1ms range.
Essentially, my setup is pushing the limits of how low input delay can actually get, and it’s extremely consistent with few or no framerate stutters, so any variability should be coming directly from the differences in the settings themselves.
I initially included different fps caps as well, but quickly found that since my system only sits at ~50% GPU usage even with max settings, the results were the same no matter how I capped it.
I considered setting the render scale to 200% or higher to max my GPU, but since plenty of people (including battle(non)sense) have tested this already, and since I have zero reason to actually PLAY with such a silly render scale, it seemed impractical.
At 240Hz I also found no consistent change in frametime when setting the in-game fps cap to 237 (for GSYNC), 240, or 300. It was the same in all cases.
Curiously, setting the in-game fps cap to “display based” produced the same medians as the “custom” cap, but also resulted in fewer 0.1-0.2ms fluctuations from the median/minimum in all cases, so this is what I ended up setting it to.
Maybe something in Overwatch itself is optimized for this setting?
Well, with all that said, here’s the set of trials I ran.
[LONG RAMBLING INTRODUCTION ENDS HERE]
[METHODS]
I ran each combination of settings through a 5-minute trial: running around the training room shooting at bots with the RTSS in-game overlay rendering only frametime and fps, logging the frametime results in Afterburner, and then working out the overall impact each combination had on frametime.
I didn’t use a spreadsheet; I just read the graph and found the approximate median, which also turned out to be the minimum. Around 80% of all values sat at that median/minimum (except in one case), so it was a pretty stable test. I then found the largest peak that occurred more than a few times (to rule out unrelated outlier spikes) and used that to define the ranges.
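For anyone who wants to repeat this with less eyeballing, here’s a rough sketch of the same analysis done numerically instead of by reading the graph. It assumes you’ve already pulled the frametime column out of the Afterburner log into a plain text file with one value per line (the real .hml log has more columns, so this isn’t a drop-in parser), and the file name, function name, and 0.05ms tolerance are just placeholders I picked for the sketch.

```python
# Rough numeric version of the eyeball analysis above.
# Assumes frametimes (in ms) were already extracted into a plain text
# file with one value per line -- not Afterburner's raw .hml format.
import statistics
from collections import Counter

def summarize(path, min_repeats=3):
    with open(path) as f:
        frametimes = [float(line) for line in f if line.strip()]

    median = statistics.median(frametimes)
    floor = min(frametimes)

    # share of samples sitting at (or within 0.05ms of) the minimum
    at_floor = sum(1 for ft in frametimes if abs(ft - floor) < 0.05) / len(frametimes)

    # largest value that shows up at least `min_repeats` times,
    # so one-off outlier spikes don't define the top of the range
    counts = Counter(round(ft, 1) for ft in frametimes)
    repeated_peak = max(v for v, n in counts.items() if n >= min_repeats)

    return {"median": median, "min": floor,
            "share_at_min": at_floor, "repeated_peak": repeated_peak}

print(summarize("frametimes.txt"))
```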
[RESULTS]
- FIXED REFRESH + NVIDIA NULL SETTINGS
FIXED REFRESH
VSYNC OFF
NULL - ULTRA
median of 3.6ms
variable range of 3.6-4.8ms
FIXED REFRESH
VSYNC OFF
NULL - ON
median of 3.4ms
variable range of 3.4-4.6ms
- In both cases, turning “reduce buffering” ON in-game added an extra 0.2ms to the median/minimum frametime.
- When using fixed refresh, NULL - ON does indeed produce faster frametimes than NULL - ULTRA.
- GSYNC, VSYNC, and NULL SETTINGS
GSYNC ON
VSYNC ON
NULL - ULTRA
median 4.4ms
no variable range or any variation for the entire 5 minutes
GSYNC ON
VSYNC ON
NULL - ON
median 4.2ms
varies between 4.2-5.0ms
GSYNC ON
VSYNC OFF
NULL - ULTRA
median 4.3ms
varies between 4.3-5.1ms, with almost constant fluctuations
GSYNC ON
VSYNC OFF
NULL - ON
median 4.5ms
varies between 4.5-5.4ms
GSYNC adds at least 1ms over fixed refresh. That isn’t much, but the Omen X 25f has about the lowest-input-delay GSYNC implementation you can find, so other monitors will likely add a fair amount more. We’re getting close to the point where slightly higher input delay in exchange for reduced motion blur may finally be a reasonable trade for a competitive player to make.
- In all 4 cases, turning “reduce buffering” ON in-game added an extra 0.3ms to the median/minimum frametime (slightly more than with FIXED REFRESH).
A lot of surprising results here. Turning VSYNC ON in the NVIDIA Control Panel or enabling VSYNC in-game (it made no difference which) actually resulted in faster frametimes with NULL ON, but slower frametimes with NULL ULTRA. I can’t even begin to explain what that’s about.
- NULL ON performs slightly better on the median/minimum with GSYNC/VSYNC, but slightly worse with just GSYNC.
- NULL ULTRA with just GSYNC and no VSYNC produced a lot of micro-fluctuations, and is the one exception mentioned in the methods: roughly 30% of its samples sat off the median/minimum.
- By contrast, NULL ULTRA with GSYNC/VSYNC was 0.1ms slower, but had literally zero fluctuations. A 4.4ms flat line from start to finish.
[CONCLUSION]
Basically, the TL;DR is this:
- 240Hz hardware has almost the same frametime with GSYNC on as with fixed refresh.
- GSYNC has roughly the same frametime whether VSYNC is on or off; VSYNC doesn’t seem to add additional frametime delay.
- Using VSYNC ON from the NVCP or enabling VSYNC in-game made no difference in median, range, or fluctuations.
- Using any NULL setting besides “OFF” (ON or ULTRA) and also turning “reduce buffering” ON in-game adds a tiny amount of fixed delay in all cases. Better to pick one or the other, since stacking them offers no benefit.
- NULL “OFF” + “reduce buffering” ON in-game was equivalent to NULL ON in median frametime, range, and relative fluctuations.
- NULL ULTRA is slightly slower than NULL ON, which agrees with Battle(non)sense’s data.
- NULL ULTRA tended to stay closer to the median with fewer micro-fluctuations in every case except one, so it may be useful even if it does show slightly higher latency.
Keep in mind that while GSYNC+VSYNC may look pretty close to fixed refresh in these numbers, that doesn’t necessarily reflect the actual in-game input delay added by the extra frame of sync (at most ~4ms at 240Hz, but 3.4ms vs. 7.4ms can feel quite different). Also keep in mind that RTSS isn’t measuring input delay directly, only frametime (literally how long it takes to render a frame), which is just one component of input delay.
To find real input delay you have to measure it in the real world with a high-speed camera, since monitor delay, USB input delay, and possibly other factors like network delay get added on top of the raw frametime.
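To make that concrete, here’s a toy breakdown of how an end-to-end latency estimate might stack up. Apart from the 4.4ms frametime from my own logs, every number below is a made-up placeholder for illustration, not something I measured.

```python
# Back-of-the-napkin input-lag stack. Only the frametime value comes from
# my logs; every other number is a placeholder guess, not a measurement.
components_ms = {
    "mouse + USB polling": 1.0,                  # ~1000Hz polling (placeholder)
    "game/engine processing": 3.0,               # placeholder
    "frametime (measured)": 4.4,                 # GSYNC + VSYNC + NULL ULTRA trial
    "sync delay, up to 1 frame": 4.2,            # worst case at ~240fps (1000/240)
    "monitor processing + pixel response": 3.0,  # placeholder
}

total = sum(components_ms.values())
print(f"rough end-to-end estimate: {total:.1f} ms")
# A 0.2-0.3ms settings difference moves only one term in this sum,
# which is why raw frametime alone can't tell the whole input-delay story.
```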
I think the biggest result for me was finding that the GSYNC/VSYNC + NULL ULTRA combination was so completely steady. With ZERO fluctuations, I was sure I must have forgotten to start the logging: the graph was literally a flat line!
Nope, the log was fine; it was chugging away at 4.4ms for the whole 5 minutes. I even ran it again for another 5 minutes and sure enough, it stayed a constant 4.4ms the entire time.
Personally, I’m going to try playing like this for a while. The ~1ms difference between FIXED REFRESH and GSYNC means very little (even accounting for the theoretical +4ms from one frame of sync delay), and the perfectly consistent frametime output with zero microstutters felt incredibly solid in-game.
I’m sure if I ran it for long enough eventually the hardware would give me a hiccup, but 99.999% consistency could do wonders for the ole’ muscle memory.
Also, before you laser in on chasing only the lowest possible numbers, consider that a 0.2ms difference is physically impossible for any human to detect. I don’t care how batman you think you are. Action potentials are limited by how fast sodium channels can fire, so once you get below ~0.5ms it just isn’t happening. Your nerves don’t work that way.
Finally, this isn’t a super serious controlled experiment; it was mostly to satisfy my own curiosity, and I thought the results were interesting and consistent enough to be worth sharing.
You are more than welcome to repeat these mini-trials with your legit labcoats on.
[SUDDEN SOBERING REALIZATION]
omg i spent hours dealing w/ millisecond differences instead of actually playing the game
wtf was i thinking
[DISCLAIMER]
I am aware that this seems to disagree with the work done by the folks at Blur Busters on a few points, particularly where fps caps are concerned. I think this may be the result of only measuring frametime, the fact that I was never GPU-constrained, or the consequence of only testing on my own obsessively tuned hardware rather than the range of systems they test with.
[THE GUYS WITH THEIR OWN WEBSITE PROBABLY KNOW MORE THAN THE GUY WHO MADE A FORUM POST]
When in doubt, trust the experiment with the most data.