120 FPS vs 300 FPS

FFS… I can’t believe that with the amount of accessible information out there, there’s still so much misinformation.

TL/DR: Our eyes can technically see and process an effectively infinite number of frames (because we don’t see in FPS). So a low FPS might be “enough” for some situations and not enough for others. It depends on the brightness and the source (games, recorded movies, animation). But SOME artifacts may not disappear even at thousands of frames per second.

Overwatch certainly is better with a higher refresh rate and FPS because of the fast-paced nature of the game. But it’s pointless to raise your FPS without going for a higher refresh rate display. Overwatch does have a subframe input option (which means you can hit things between frames). Some games poll input for the next frame, and that’s why it might be “better” to raise FPS in some games even when capped at the display’s refresh rate.

But being able to “hit” at the speed of your mouse doesn’t mean that 60 Hz is the same as playing at 240 Hz, for reasons that I explain below.

I created an enormous topic that touches on this discussion: Blizzard PLEASE add an Motion Blur option on the engine

End of TL/DR.

This is half true. Human eyes DO see constant changes in light. There isn’t an “exact” FPS value, a hard cutoff beyond which our eyes can’t perceive any difference from real life. The psychophysiology of vision isn’t that simple, and different people have different thresholds.

Also, it depends on brightness and setup. For example, you can EASILY notice a blinking light in the darkness, regardless of how fast it blinks.

For situations where you’re just standing waiting to click, yes. For intensive motion, not exactly. But of course, 600 FPS is pointless without a 600 Hz display.

This is WRONG ON SO MANY LEVELS. FFS… I honestly believe he’s trolling.

This is a correct answer; unfortunately it’s very hard to research this kind of thing. To be fair, it isn’t 1000 FPS either.

What creates confusion for misinformed people is the fact that our eyes retain information. This phenomenon is called Persistence of Vision, or Flicker Fusion… which happens between 60 Hz and 90 Hz. This was easier to test back when CRTs were the main display technology. As an example, for me it happens at 85 Hz (the point where I no longer perceive flicker in a strobing backlight).

THIS DOESN’T MEAN I CAN ONLY SEE 85 FPS. This is what causes confusion. You do see an infinite number of frames, but your eyes blur the frames together.

In videography, a professional compensates for this effect by using a proper shutter speed while recording. This “blends” the exposure and creates an image that resembles what our eyes do every day.
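As a rough illustration of that blending, here’s a minimal Python sketch (my own toy example, not a real videography tool): a single bright pixel moving across a tiny 1D “screen”, with several sub-frame positions averaged together the way an open shutter accumulates light.

```python
def render_dot(position: int, width: int = 12) -> list[float]:
    """One discrete frame: a single bright pixel on a 1D 'screen'."""
    frame = [0.0] * width
    frame[position] = 1.0
    return frame

def blend(frames: list[list[float]]) -> list[float]:
    """Average sub-frames, like an open shutter accumulating light over time."""
    n = len(frames)
    return [sum(column) / n for column in zip(*frames)]

sharp = render_dot(4)                                   # one crisp frame
blurred = blend([render_dot(p) for p in (3, 4, 5, 6)])  # dot moved during the "exposure"
# 'sharp' has all its energy in one pixel; 'blurred' smears the same total
# energy across pixels 3-6, which is what photographic motion blur looks like.
```

The same total light is present in both cases; the blurred version just distributes it along the motion path, which is why it reads as natural movement instead of a strobing dot.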

This applies to everyone. Just move your mouse arrow very fast across your screen. You perceive it as multiple arrows just “standing” there. Since they’re discrete images without any motion blur, your eyes blend them as multiple images appearing and disappearing instead of a “moving arrow”. This is called the phantom array effect.

If it were an object in real life, you’d perceive it as a line of motion blur.

So here is where individual psychophysiology enters. Some people notice the “stroboscopic effect” very easily while playing games. Especially in a game that is a fast-paced mess of vibrant colors like Overwatch.

DIGITAL FOUNDRY has a spectacular video about Motion Blur that everyone who’s not biased should watch and rewatch to understand this.

To make 3D motion “believable” on a screen is an incredibly hard feat. For a slow-paced game with constant motion it might be easy. But for a fast-paced game with erratic movement, it isn’t.

Did you read this study, though? It isn’t measuring how fast our eyes perceive information; it measures how fast our brain processes a given task in a fast-blinking environment. Something that might even be easier on a CRT display at 75 Hz, as they used in the test environment.

This has nothing to do with the psychophysiology of fast-moving objects on a screen. You can watch a ball moving at a fixed speed from left to right at 30 FPS, 60 FPS, 120 FPS, 240 FPS, or 480 FPS, and I’m sure you’d say that it moved from left to right at the same speed in every case.

HOWEVER. What you’re missing is that lower FPS values create “gaps” between frames (because there are no in-between images to display). Those gaps are “blended” by your eyes and create the infamous stroboscopic effect. You notice the movement, but “distorted”. In other words… you see a movement that resembles stuttering.

This effect might be extremely distracting for some people and not so much for others. But it’s certainly measurable and can be “replicated” with cameras using a fast shutter speed.

This effect will ALWAYS be perceivable whenever on-screen motion exceeds one pixel per frame: more than 60 pixels per second at 60 Hz, 120 pixels per second at 120 Hz, 240 pixels per second at 240 Hz, and so on. That’s why motion blur was created in the first place. It wasn’t created for a more “cinematic feel” or to mask “bad performance”, as some average Joe might believe. It exists to correct an inherent artifact created by the fact that games display motion like a fast-shutter-speed recording, in some sense.
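To put numbers on that, here’s a small Python back-of-the-envelope sketch (my own example, with a hypothetical flick speed) showing how far an object jumps between consecutive frames at different refresh rates:

```python
def pixel_gap_per_frame(speed_px_per_s: float, refresh_hz: float) -> float:
    """How many pixels an object jumps between two consecutive frames."""
    return speed_px_per_s / refresh_hz

# Hypothetical example: a crosshair flicked across a 1920 px wide screen
# in 0.1 s travels at 19200 px/s.
for hz in (60, 120, 240, 1000):
    gap = pixel_gap_per_frame(19200, hz)
    print(f"{hz:>4} Hz -> {gap:.1f} px jump between frames")
```

Even at 1000 Hz the object still skips roughly 19 pixels per frame during a fast flick, which matches the point that some stroboscopic artifacts survive even at extreme refresh rates.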

This is correct. Some people just don’t understand that a 2D display with a limited refresh rate that draws images pixel by pixel has limitations that our eyes just DON’T have. We already have almost 100% audio fidelity in the digital domain. But vision is MUCH more complicated than that.

This has nothing to do with “reaction time”. As I said above, our eyes “blend” information (the flicker fusion threshold). So blending less information might give you a hard time perceiving the correct motion of a target.

This is why a test like the “Reaction Time Test” on that humanbenchmark site wouldn’t show much of a difference going from 60 Hz to 240 Hz. But tracking a fast target with erratic movement would look extremely different on the screen.

Eye-hand coordination needs this information to be precise enough. You don’t “need” to “think” to track a target with your eyes, regardless of how fast it moves. You don’t need to readjust your focus every 160 ms.

Can you track a fly moving around? Do you seriously think your eyes only move to the fly every 160ms because you’re “capped” by your “reaction time”?

I seriously could go on and on about this topic. It’s something I like a lot, and I can point to the Blur Busters website as probably the best source for this type of information.

Also, as I said already, the Digital Foundry video about Motion Blur helps a lot with understanding for those who remain in denial about this topic.

I’d say that 60 Hz might be comfortable enough for most people to play consistently at a decent level. That doesn’t mean going above it is marketing mumbo jumbo. But even with a 1000 Hz display, we can’t solve some artifacts that appear in fast-paced scenarios. Hence why I created a topic about motion blur in the first place.
