This is a very dense topic because it involves subjective perception, the physiology of vision, and display technology. The perception of these effects can vary wildly and not everyone will notice them. Some people are more sensitive to motion and others to picture resolution, for example. I already made a topic about motion blur here and explained a lot of these things there; it also covers why 240Hz or even 390Hz isn’t “enough” for all situations.
Blizzard PLEASE add an Motion Blur option on the engine - General Discussion - Overwatch Forums
But this doesn’t mean you can’t play very competitively at 60 FPS/60Hz, for example. We as human beings are very capable of adapting and getting used to some visual artifacts, to the point that they might not bother us anymore.
Displays are just fixed objects changing colors on tiny squares; what we see and perceive as motion and images is just an illusion. Screens don’t even have depth, but we can perceive “depth” in some sense when looking at 3D images on a 2D screen.
Of course our brains can perceive a lot more than 30Hz. 30Hz is so low that we can actually perceive the flickering at this frequency. Movies are shot at 24 FPS, and back in the old days each frame was flashed twice by the projector (a 48Hz strobing light) because the flicker at 24Hz is atrociously bad.
They chose 24 FPS because it’s the bare minimum necessary for our eyes to perceive a sequence of static images as “moving objects”. This works because our eyes have a natural “motion blur” of sorts, which is called persistence of vision.
However, capturing real images with a camera and playing games are VERY different things. With cameras, you can match the shutter speed to “simulate” the natural motion blur our eyes would perceive and blend the frames together in a more believable way. Games are discrete images and are displayed like a movie recorded with a VERY fast shutter speed. You can easily see examples of this on YouTube; search for high shutter speed vs low shutter speed videos.
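A rough back-of-the-envelope sketch of that difference, assuming the common “180-degree shutter” film convention (each frame exposed for half of the frame interval) and an illustrative panning speed I made up:

```python
def blur_width_px(speed_px_per_s: float, exposure_s: float) -> float:
    """How far (in pixels) an object smears during one exposure."""
    return speed_px_per_s * exposure_s

fps = 24
film_exposure = 1 / (2 * fps)   # 180-degree shutter: ~1/48 s of exposure per frame
game_exposure = 0.0             # a rendered frame is effectively an instant

speed = 2000  # assumed panning speed in pixels per second (illustrative)

print(blur_width_px(speed, film_exposure))  # ~41.7 px of natural-looking smear
print(blur_width_px(speed, game_exposure))  # 0 px: razor-sharp, "strobed" frames
```

The smear in the film frame is what blends consecutive images together; the perfectly sharp game frames are what make the gaps described below visible.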
Because our eyes integrate these discrete images over a short window of time (related to the flicker fusion threshold) and “blur” them together, this creates what is called the “stroboscopic effect”. We see multiple discrete images at once, with “gaps” between them.
This is also the reason why, when you move your mouse cursor fast on your desktop, it gives the “illusion” of multiple cursors with gaps between them. This is called the phantom array effect.
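A tiny sketch of why the cursor seems to “skip”: at a fixed refresh rate the cursor is only drawn at discrete positions, so fast motion leaves visible gaps between them. The flick speed below is an assumed example value.

```python
def gap_between_images_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance between two consecutive drawn positions of a moving object."""
    return speed_px_per_s / refresh_hz

cursor_speed = 4000  # assumed flick speed in pixels per second

for hz in (60, 144, 240, 390):
    print(hz, "Hz ->", round(gap_between_images_px(cursor_speed, hz), 1), "px gap")
# 60 Hz -> ~66.7 px gap, 240 Hz -> ~16.7 px gap: higher rates shrink the gaps
# but do not eliminate them.
```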
You can “simulate” this effect by taking a picture of your screen with a slow shutter speed while moving the in-game camera quite fast.
This is one of the “gaming artifacts” that we don’t have a really good solution for. The “ideal” way is to have a framerate, display rate, and polling rate SO high that you just can’t physically move your camera fast enough to create any gap between pixels. But of course, this is not attainable in any way.
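To put a number on why that ideal isn’t attainable: using the same gap formula as above, the gap only drops below one pixel when the refresh rate (with a matching framerate) equals the motion speed in pixels per second. The speeds here are assumptions, not measurements.

```python
def refresh_needed_for_1px_gap(speed_px_per_s: float) -> float:
    # gap = speed / hz, so for a 1 px gap you need hz = speed
    return speed_px_per_s / 1.0

for speed in (1000, 4000, 10000):  # slow pan, fast flick, extreme flick (assumed)
    print(speed, "px/s needs ~", int(refresh_needed_for_1px_gap(speed)), "Hz")
# Even a moderate flick would demand thousands of hertz end to end.
```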
Another solution might be to interpolate frames and “amplify” the framerate to the point where you have thousands of simulated frames between real frames. However, this would only work with a stupidly high display rate to actually show those frames.
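A minimal sketch of the “amplify the framerate” idea, generating in-between frames from two real ones. Real interpolators use motion vectors or optical flow; this naive cross-fade and the tiny toy frames are only meant to show the concept.

```python
import numpy as np

def interpolate_frames(frame_a: np.ndarray, frame_b: np.ndarray, steps: int):
    """Yield `steps` synthetic frames blended between frame_a and frame_b."""
    for i in range(1, steps + 1):
        t = i / (steps + 1)
        yield (1.0 - t) * frame_a + t * frame_b

frame_a = np.zeros((4, 4, 3), dtype=np.float32)  # tiny toy "frames"
frame_b = np.ones((4, 4, 3), dtype=np.float32)

# 60 real FPS with 15 synthetic frames in every gap ~= 960 displayed FPS,
# which is pointless unless the display can actually show ~960 Hz.
synthetic = list(interpolate_frames(frame_a, frame_b, steps=15))
print(len(synthetic), "in-between frames generated")
```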
And finally, the reason why I created the topic above: a good per-object motion blur implementation in the engine would be a good solution.
While it isn’t perfect in any way, it’s attainable and has little to no impact on performance. A good motion blur implementation needs a high enough framerate to have enough samples to create a believable effect with fewer artifacts. So the most attainable solution would be the highest framerate and display rate you can get AND a good per-object motion blur implementation.
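A very rough sketch of how per-object motion blur typically works in engines: each pixel gets a motion vector (how far it moved since the last frame) and the shader averages several color samples along that vector. This is a CPU/numpy toy under those general assumptions, not how Overwatch’s engine is actually implemented.

```python
import numpy as np

def motion_blur(color: np.ndarray, motion: np.ndarray, samples: int = 8):
    """color: HxWx3 float image, motion: HxWx2 per-pixel motion in pixels."""
    h, w, _ = color.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros_like(color)
    for i in range(samples):
        t = i / max(samples - 1, 1)          # 0..1 along the motion vector
        sx = np.clip((xs - motion[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip((ys - motion[..., 1] * t).astype(int), 0, h - 1)
        out += color[sy, sx]                 # accumulate samples behind the pixel
    return out / samples

h, w = 4, 4
color = np.random.rand(h, w, 3).astype(np.float32)
motion = np.zeros((h, w, 2), dtype=np.float32)
motion[..., 0] = 3.0   # everything moved 3 px to the right since last frame
print(motion_blur(color, motion).shape)
```

More real frames per second means shorter motion vectors per frame and fewer samples needed for a clean result, which is why the high framerate and the blur work together.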
A lot of people dislike motion blur in games for a lot of reasons, reasons that are very well explained in a Digital Foundry video on this topic. As with a lot of things on the internet, many people dislike it because they don’t know how and when to use it, and because of bad implementations in some games. It’s the same thing as people avoiding VRR as a solution for tearing because of “input lag” reasons.
Some people also get motion sickness from motion blur, which I always find very strange since every movie has motion blur. The curious thing is that some people misinterpret the stroboscopic effect AS motion blur, when it is exactly an artifact created by the lack of motion blur. Unfortunately, it isn’t easy to show these effects in videos, for the same reason you can’t really understand 240Hz gaming without actually USING a 240Hz screen at 240 FPS.
So when you go to a higher refresh rate and framerate, you’re actually pushing the “ceiling” for the stroboscopic effect higher, you’re lowering your overall input lag, and you’re lowering the amount of motion blur caused by sample-and-hold persistence on LCDs (this blur is ALWAYS bad: it doesn’t actually help mitigate the stroboscopic effect and it also decreases the sharpness of moving objects). All of those things make motion, and gaming in general, more “believable” to our brains and should create a better experience overall. Especially in a very fast-paced game like Overwatch, since this type of game is where those things are MUCH more noticeable.
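A sketch of that sample-and-hold blur: on a typical LCD each frame stays lit for the whole refresh interval, so while your eye tracks a moving object the image smears across roughly speed × hold time pixels. The tracking speed is an assumed example value.

```python
def persistence_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    hold_time = 1.0 / refresh_hz   # full persistence: frame is held until the next one
    return speed_px_per_s * hold_time

speed = 2000  # assumed eye-tracking speed in pixels per second

for hz in (60, 144, 240):
    print(hz, "Hz ->", round(persistence_blur_px(speed, hz), 1), "px of smear")
# 60 Hz -> ~33 px, 240 Hz -> ~8 px: more Hz (at matching FPS) means less of this
# blur, on top of the lower input lag and the higher stroboscopic ceiling.
```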
Competitive gaming isn’t really a good place for reasoning though, hence why these discussions don’t pop up much here.