Genuine question I would like an answer to:
If most people have all these problems with this… why is your post the first I’ve ever heard of it? Possibly because I don’t hang out in tech support or search for graphics problems with OW (I don’t feel there are any, but I also don’t get motion sick).
There was a time when I used to play Quake 3 Arena online on my old PC at 10-15 frames per second… oddly enough I got used to it and was able to get some snipes with the railgun.
That’s debatable. A per-object motion blur implementation (assuming it only affects the heroes and projectiles and not the camera movement) would make those A-D-strafing Lucios, Tracers and McCrees easier to track, in my opinion.
The characters in this game have a lot of acceleration and can change direction extremely fast, so even if your camera is stationary and you’re looking at someone spamming A and D, you’ll see a bunch of gaps in their movement, to the point that you can barely tell which direction they’re going. Motion blur can mitigate this, and per-object motion blur will not affect the camera motion.
It just gives an illusion of a higher frame rate for the objects (not the environment). And since Overwatch already has sub-frame input (high-precision input), it might make them easier to track and shoot.
I can’t post images and videos to illustrate what I mean, but I hope you can understand… This is explained in the Digital Foundry video I posted, with video examples.
Good question. I’d say it’s just misconception and ignorance about this subject.
Some people DO see this problem, but (ironically) misinterpret it as motion blur: Can anything be done about double vision / blur effect? - General Discussion - Overwatch Forums (blizzard.com)
Some people argue that high refresh rate displays don’t make any difference.
Some people don’t see aliasing as a problem.
Some people got motion sickness using lower FOV options.
Some people got motion sickness using higher FOV options.
Some people barely perceive tearing, while others can’t stand a single tear on the screen.
Some people are extremely sensitive to flickering, others can look at a 60Hz display the entire day without a problem.
Some people can’t stand low resolution games. Others can barely see a difference going from 1080p to 4k.
So, the answer is… people are just different. We’re not genetically equal, and most people see or feel these artifacts but don’t know what they are or what to look for.
I’ve been playing games since ~1998. I remember what it was like to play Counter-Strike 1.5 and 1.6 at 85Hz and 100Hz refresh rates with very low persistence blur on CRTs. As soon as I “upgraded” to a 60Hz LCD screen, I could barely see a thing on the display; going from 1ms or 2ms persistence straight to 16ms is a big deal. Most people who didn’t experience this transition have no idea what this is. It even changed how most people play FPS games nowadays: most people keep their eyes fixed on the crosshair and use only peripheral vision on the enemies before moving the crosshair (fixed-gaze tactics). Back when we had CRT motion clarity, people used other tactics.
Nowadays we have displays that “simulate” the persistence of CRTs: ULMB, LightBoost, BenQ Blur Reduction, DyAC, etc. See the “Example of Certain Competitive Advantages of ULMB” post by the Chief Blur Buster on the Blur Busters forum.
If you do a Google search, you’ll see a bunch of posts from people having visual problems in Overwatch that they just can’t explain. I already posted one above, but see:
Disable Frame Blending in OW? - Technical Support - Overwatch Forums (blizzard.com)
Visual “stutter”? - Technical Support - Overwatch Forums (blizzard.com)
There are a lot of posts out there about this problem, but people just don’t know HOW to explain it.
If you were playing on a CRT, the motion clarity would be very good even at those low frame rates.
A lot of “gaps” between frames, though. But since CRTs have very low persistence, it’s like watching a video shot with a very fast shutter speed. It seems unnatural, but very crisp.
If the point is to make them more visible when they move, that can be done with VFX. In fact, Lucio and Tracer already have traces (pun intended) behind them.
Motion blur will just make everything more… well, blurry. You will be hitting the blur and not the actual hero.
oh so you’re one of THOSE people that are “oh you don’t agree with me? you’re clueless, ignorant, and a troll and your opinion doesn’t matter”
Computer says no…
You would not be hitting the “blur”. The motion blur just helps you understand the movement of a character. Without it, fast-moving objects just appear and disappear on the screen. Just shake your mouse cursor quickly across the screen and tell me if you can say whether the cursor is moving right or left at a given moment. You can’t, because from your POV the cursor just appears and disappears multiple times with visible gaps in between.
And you can already hit someone who isn’t shown in a frame; this game has sub-frame input, so you can flick to a position between frames.
Every graphical artifact can increase your reaction time. You can use FreeSync/G-Sync for a tear-free experience at a very tiny cost in input lag. The question is… is this tiny input lag cost offset by the lack of tearing?
The same can be said about aliasing, the same can be said about LCD persistence blur, and the same can be said about the stroboscopic effect we get without motion blur. That’s why I said it’s debatable.
If you’d be hitting a “blur” or “trail” of the character, then without a motion blur option you’d be hitting a gap instead. Which do you prefer? In both cases you would miss, but at least with blur you can easily see the direction the character is moving, instead of the stepping effect you get with the gaps.
I am fine with motion blur as long as I have options to remove it.
Making sure this is a visual here.
I have Astigmatism and motion blur makes life worse for me. So I always have it OFF.
You might be able to force multi frame antialiasing through drivers.
Also, has this ever been implemented in online multiplayer games? I don’t think it’s actually possible at realistic latency where positions are effectively probabilistic.
Turn on motion blur: how to make me motion sick in one easy step.
The OW team may implement it in OW2, but I kinda doubt they’d push it to OW1. I’d be fine with it if it could be turned off.
If I create a topic and the VERY FIRST answer is “just put vaseline on your screen”, then yes, I’ll be “that guy”. I don’t have the patience to answer stupid people, but I do have the patience to answer polite people.
There’s a lot of people here who don’t act like this.
You don’t need to agree with me. Motion blur was not “created” by me; it’s a big topic in cinematography and gaming.
Per-object motion blur DOESN’T affect camera movement, so it shouldn’t lead to motion sickness.
It only creates an “illusion” that characters and objects are rendered in a higher framerate.
Sorry, you don’t get to decide what causes my motion sickness. I wish you could; it would be sweet to not have to shut it off every time I play a fast-paced game.
Am I understanding this wrong? Are you saying 240hz with gsync isn’t good enough? Frame changes on a 240hz monitor are visible and actionable by you? To the degree it’s ruining the game?
Personally I don’t care if they add blur, so long as there’s still an option to disable it. But I’m not understanding what’s needed beyond G-Sync to fix this for OP.
More options are almost always great.
But the real question is: How much effort would it be to implement it and how many people would actually use it?
I personally think the long term solution to the problem is in fact higher framerates and faster monitors.
Sure, you will still have gaps if the movement covers more pixels/s than you have fps, but the closer your fps gets to the pixels/s, the smaller the impact of the gaps, because they get much shorter.
If you have a movement with 1000 pixels/s at 60 fps, then you will have a ~17 pixel gap between images.
At 340 fps it is already down to ~3 and therefore barely (if at all) noticeable.
At least for PC, the trend is strongly toward 300-340Hz becoming the new standard within the next few years, so from Blizzard’s perspective it probably makes much more sense to put resources into optimizing the efficiency of their game engine instead of trying to implement motion blur in a way that makes sense and is used by a huge part of the player base.
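The gap arithmetic in the post above is simple enough to sketch; this is a minimal illustration (the function name is mine, not from any engine), assuming a sample-and-hold display with no motion blur, where the gap is just the distance the object travels between two consecutive frames:

```python
# Sketch: per-frame pixel gap for on-screen movement on a sample-and-hold
# display with no motion blur. The gap is simply the distance the object
# travels between two consecutive frames.

def pixel_gap(pixels_per_second: float, fps: float) -> float:
    """Distance (in pixels) an object jumps between consecutive frames."""
    return pixels_per_second / fps

# The numbers from the post above:
print(round(pixel_gap(1000, 60)))   # ~17 px gap at 60 fps
print(round(pixel_gap(1000, 340)))  # ~3 px gap at 340 fps
```

This also shows why the gaps only vanish entirely when fps matches the movement speed in pixels/s: the gap shrinks to a single pixel only at 1000 fps for a 1000 pixels/s movement.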
my guy…everyone can clearly see that what that person said was obviously a joke. who the hell would put vaseline on their monitor for motion blur??
I’m not “deciding” for you, I’m just explaining.
You’re thinking about CAMERA motion blur and I’m talking about OBJECT motion blur. You can have blur applied to objects on the screen but not to the camera movement, this means when you turn your camera the environment will look exactly the same.
I’m trying to demystify this subject because the vast majority of people don’t know the various types of motion blur implementations that exist in games. And if you watch the Digital Foundry video about motion blur, you’ll understand what I’m talking about and why you won’t feel motion sick with per-object motion blur. If you don’t want to watch the entire video, just skip to the 18-minute mark.
You can have separate options, and you can adjust the strength of the effect.
Every time something moves faster than 240 pixels/sec on a 240Hz display, you’ll have gaps between frames. You can easily understand this concept by moving your cursor left and right very fast and observing: you’ll see multiple cursors and gaps, and you can’t say which direction the cursor is moving. From your point of view, the cursor is just appearing and disappearing side by side. This is an exaggeration of what happens with fast movement in a game.
It really doesn’t matter how many frames or what display frequency you have if things are moving fast enough. This is not only an “Overwatch thing”, but in Overwatch this strobing effect is especially annoying because characters have very fast acceleration when changing directions, and the outline on characters just makes it even more annoying.
Of course I can play. But the motion clarity can be vastly improved with a proper per-object motion blur.
No one asked for high precision input (sub frame input), but we get it anyway. In the beginning of this game, we had half the tick rate.
So maybe the only way to know is to implement it. With a proper implementation, most people would actually start using it and would notice an improvement.
And I didn’t understand your math. A movement of 1000 pixels/sec would need exactly 1000 fps on a 1000Hz display to show without gaps. This is exactly one of the reasons why motion blur is needed: frame rate and display frequency will never be high enough. But a higher frame rate and frequency would give better samples for a better “blend” in the motion blur effect. Also, a higher frequency means less blur from the screen itself (the sample-and-hold effect, the BAD LCD blur).
Of course higher refresh rates and frame rates help, but they’re expensive and will probably never be high enough to eliminate the stroboscopic effect.
And the next reply was from another ignorant person who clearly was not “joking”. As I said, I don’t have the patience for these types of people. So I don’t care if you think I’m the “ignorant” one here. This forum has enough stupid threads and posts already.
Alright so I looked into games that I know made me sick vs games that didn’t.
Warframe uses camera - terrible sickness
Spiderman uses object - no motion sickness
Doom 2016 uses object - definitely felt sick
So I guess it depends on how they do it. I did say that I’d be cool with blur in OW as long as it had a toggle.
if they do that’s cool but DON’T put it on by default… please lol
For those who still don’t know the difference, go to the Digital Foundry video about motion blur. Go to 18:20 and see the Prey example; 10 seconds of that video can explain the difference. Or see this pic i.postimg.cc/QdRNjQmk/prey.png and notice how the blur is applied to the moving objects (the knife and arms of the character) and not the environment/background.
This is why talking about motion blur is almost impossible, most people don’t understand exactly what it is, when it is applied and how.
Exclusive per-object motion blur is not a “thing” in most games, even those that have a GOOD implementation of it, like Doom 2016. In Doom 2016 you have both CAMERA motion blur and PER-OBJECT motion blur; you can lower the amount of camera motion blur by choosing the “low” option, but you can’t remove it entirely. And if you remove it entirely, you get rid of per-object motion blur as well. That is not what I’m advocating here.
But you can do EXACTLY what I’m talking about in Prey, for example, and ONLY have blur applied to moving objects and not the camera motion, hence the reason WHY I’m talking about a per-object motion blur implementation and not a “general” motion blur solution. If we do have camera motion blur, it must be a COMPLETELY SEPARATE OPTION FROM PER-OBJECT MOTION BLUR.
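The per-object-only idea described above can be sketched as a toy example. This is not how any real engine does it (real implementations use per-pixel velocity buffers); it is just a minimal 1D illustration, with made-up names and numbers, of blending sub-frame samples of only the moving object while leaving the background untouched:

```python
# Toy sketch of per-object motion blur on a 1D "screen" (a list of pixel
# intensities). Only the moving object is sampled at several sub-frame
# positions and averaged; the static background is drawn once, untouched.

WIDTH = 12
SUBSAMPLES = 4  # sub-frame samples blended into one output frame

def render_frame(obj_start: float, obj_end: float) -> list[float]:
    frame = [0.0] * WIDTH            # background stays perfectly sharp
    for s in range(SUBSAMPLES):      # sample the object along its path
        t = s / (SUBSAMPLES - 1)
        x = round(obj_start + t * (obj_end - obj_start))
        if 0 <= x < WIDTH:
            frame[x] += 1.0 / SUBSAMPLES
    return frame

# Object moves from pixel 2 to pixel 5 within one frame: instead of a
# single bright pixel plus a gap, you get a faint streak that shows the
# direction of motion.
print(render_frame(2, 5))
```

The point of the sketch: the streak carries directional information that the discrete “appear/disappear” frames lose, and camera motion never enters the calculation, so turning your view leaves the environment unblurred.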
It doesn’t matter much if it can be turned off but I agree.
Frankly I don’t care one way or the other so long as I can turn it off (like you said) and it wouldn’t provide any unfair advantages
Plus at least for PvE it might look nice