Hello guys, I have a good PC, let's say overkill for Overwatch: RTX 2080 Ti, 240 Hz monitor, etc.
I want to ask about render scale. By default it's set to 141%, and the game runs fine with no input lag or FPS drops.
My question is: should I set it to 100%? Why do many people (including pros) set it to 100%?
Does it lower input lag? Will it give me more FPS?
Increasing render scale can drastically lower performance depending on your hardware. Some people (and a lot of professional players) use normal or lower render scale because graphics don't really matter to them.
Not sure whether it causes more input lag; as far as I know it has no effect on that.
More FPS means less input lag, it's been tested. But you're not going to notice it past around 150 FPS. So keep it at 141. I'm assuming you're at 1080p and getting over 100 FPS.
I have a 1080 Ti, which is currently overkill for my 1080p 144 Hz monitor, and I keep it at 150% render scale and it's perfect with epic settings.
Edit: wait, you have 240 Hz. If you aren't at 240 FPS, then put it at 100% to reach 240. Might as well make use of it.
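For the "more FPS = less input lag" part, here's a rough back-of-the-envelope sketch in Python. It only looks at the frame-time slice of the latency chain, so treat it as an estimate, not a full latency model:

```python
# Rough frame-time math: the rendering portion of input lag is on the order
# of one frame, so halving frame time roughly halves that slice of the delay.
# This ignores mouse polling, game tick, GPU queue and monitor scanout.
for fps in (60, 144, 150, 240, 250, 300):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {frame_time_ms:5.2f} ms per frame")
```

Going from 250 to 300 FPS only shaves off about 0.7 ms of frame time, which is why nobody really notices the difference up there.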
Graphics are set to low to help you see what's going on in game; there is a lot of visual clutter on default settings.
As far as render scale goes, run whatever you can without frame drops.
I am at 240 FPS already with 141%. I mean, I sometimes drop from 300 to 250 FPS… with 100% render scale will I only drop to, let's say, 270? Isn't that better?
To put it simply, there is no reason to ever use a render scale other than 100%. Any other value will lower performance, and can also lower image quality, even when you set it higher than 100%.
Best performance will always come from setting your resolution and refresh rate to match your monitor's native values, at 100% render scale.
I play at 60 FPS and it dips to 58 at times, which can be annoying when sniping. Just keep your graphics cleaner; you don't need more FPS if you're already over 200.
Okay, that's good. I can't say for sure what's causing your drops, as it's highly variable, but you could try turning down shadows, local reflections and ambient occlusion and see how consistent it is after. Those settings hurt frame rate the most.
Always 100%, in every game. And I'm running a 1080 Ti.
Thank you everyone, I will set it to 100%.
You don’t have to in every game.
Competitive multiplayer games, sure, but if you care about visuals and anti-aliasing, increasing render scale improves them. It's called supersampling.
It depends on your rig. I play at 165 Hz at 1440p, where maxed-out settings already look great and are quite demanding. Increasing the render scale does very little visually for the human eye at those settings, and only serves to put more stress on the hardware, resulting in less FPS.
It can have an effect, but the render cost is massive compared to the visual difference (which is usually indistinguishable from anti-aliasing).
Render scale is mostly provided as a future-proofing option, in case GPU downscaling tech becomes drastically superior to anti-aliasing at some point.
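To put a rough number on that render cost, here's a quick Python sketch. It assumes the percentage applies to each axis (which is how the slider seems to behave), so the pixel count, and roughly the shading cost, grows with the square of the scale:

```python
# Estimate the rendered resolution and relative pixel cost for a given render
# scale. Assumes the scale applies per axis, so total pixels grow with scale^2.
# The cost figure is only the raw pixel ratio; real frame times won't track it
# exactly, since not all GPU work scales with resolution.
def render_cost(width: int, height: int, scale_pct: float):
    scale = scale_pct / 100.0
    render_w, render_h = round(width * scale), round(height * scale)
    relative_cost = scale ** 2  # pixel count relative to 100%
    return render_w, render_h, relative_cost

for pct in (75, 100, 141, 150, 200):
    w, h, cost = render_cost(1920, 1080, pct)
    print(f"{pct:>3}% -> {w}x{h} (~{cost:.2f}x the pixels of native 1080p)")
```

So 141% at 1080p is roughly double the pixels of native, and 200% is four times, while the visual gain over plain AA stays pretty subtle.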
Higher render scale can lower FPS, but if you’re having no trouble with it then it’s fine.
Lower render scale also increases the size of player outlines, but it makes things look a lot blurrier. Pros keep it at 75% or 100% because they're solely focused on having the highest FPS in all situations.
At 1440p that makes sense; you don't need to increase it.
If you're stuck with 1080p, supersampling is great, since aliasing is worse at lower resolutions and the performance hit isn't bad if you have an overkill card.
I noticed the red outlines… the thing is, people say it makes it easier to see targets and aim.
It can, but it’s personal preference. I find 100% to be better than 75%. Sure, outlines are slightly bigger on 75%, but at 100% the lines are still clearly visible, I don’t experience FPS drops, and everything is clear as day. If you really want the thicker outlines, then drop it to 100% or 75%. Anything below 75% gets super blurry, so you should probably avoid it.
I set it to 100%, I think it's good enough.
To be honest, I never understood why OW defaults to awkward resolution scales like that. Anything that's not an integer multiple of your resolution will result in filtering artifacts from scaling (e.g. 141% means every physical pixel covers 1.41 rendered texels along each axis, which can't map cleanly onto the display).
Generally anything >100% is used as a form of anti-aliasing called supersampling, though in practice I usually only see it at multiples of 2. That's usually absurdly expensive, so most games skip it in favor of more common AA techniques such as MSAA or FXAA. I guess you can probably get some decent results with only a fractional increase, but that seems odd to me…
Using lower than 100% is generally only valuable for performance reasons, since you’ll simply start losing image quality.
Personally, I'd probably recommend 100% scaling and some form of AA to take care of jagged edges. As long as you never drop below 240 FPS it won't be a problem. Sure, you can shoot for a stable 300 FPS, but at that point you're hitting diminishing returns on input latency. I doubt anyone would ever be able to tell the difference.
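To illustrate the fractional-scale point with a tiny sketch (nearest-neighbour sampling only, using a hypothetical 141% buffer for simplicity): at a non-integer ratio the step between sampled texels isn't constant, which is where the scaling artifacts come from.

```python
# Why a non-integer scale can't map cleanly onto the display: with a 141%
# render buffer, each output pixel's nearest source texel advances by an
# uneven mix of 1 and 2 texels. Real scalers use bilinear/bicubic filtering,
# which turns this unevenness into slight blur rather than visible stepping.
scale = 1.41
sampled = [round(x * scale) for x in range(8)]          # nearest texel per output pixel
steps = [b - a for a, b in zip(sampled, sampled[1:])]   # distance between consecutive samples
print("sampled texels:", sampled)   # [0, 1, 3, 4, 6, 7, 8, 10]
print("steps:", steps)              # [1, 2, 1, 2, 1, 1, 2] - not a constant step
```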
See, that's what my little monkey brain tells me, but why is there an option to increase it above 100%… does it force full rendering of objects in the distance so they don't go from crap textures to full textures as you get closer?