GUIDE: Maximizing System Performance for Overwatch (PC)

More people should be able to see this.

Post this on the Russian forums too.

Very well written.
I would just like to add something to the third-party software section.

  1. Razer Synapse
    This program is known to cause issues and crashes in Overwatch, so I would recommend not using it. Luckily I’m not a fan of Razer products and don’t use any Razer hardware, so I have no need for it. But if it’s working for you in Overwatch without any kind of issues, there’s no need to remove it.

  2. Rivatuner Statistics
    I use this software to track all my GPU/CPU temperatures and usage in games and other programs.
    EXCEPT in Overwatch. I had almost never had any crashes, but when I used this in Overwatch I got my first ones: “Rendering device lost”, along with other errors. When I turned it off, all the crashes stopped.
    Do NOT use this in Overwatch. Use the built-in “Display Performance Stats” option in the Overwatch video options instead. Unfortunately there’s still no option to see CPU usage or temperature, but you can still see your framerate, GPU temperature, network performance and VRAM usage.

  3. MSI Afterburner
    Reaching high temperatures on your GPU?
    In this program you can set a custom fan curve to keep it cooler.
    There are a lot of guides out there that you can google.
    Most modern graphics card fans don’t even start spinning until 40-60 °C.
    This is also (in my opinion) the best software for overclocking your GPU, though it’s not something I would recommend for beginners. If you’re planning to overclock your GPU, start by reading a lot of information and guides about it. Every card is unique: some users with the same GPU as you can overclock it a lot higher with lower temperatures. It’s just a matter of luck (or cooling method).
    I’m running the game with an overclocked MSI GeForce GTX 1080 Ti Gaming X and it’s working well for me, but I have seen a lot of Overwatch players have issues with overclocked GPUs and the game. So I would not recommend doing this if you’re not already familiar with overclocking.
    This software is downloadable from msi.com/page/afterburner

  4. HWiNFO
    This is a great program to monitor all of your temperatures and usage.
    If you’re experiencing crashes or an unstable game, this is a way to see if a component in your computer is reaching too high a temperature or if you’re having any kind of hardware issue. It’s downloadable from hwinfo.com

1 Like

Thank you for the awesome guide. Changed a few settings in Overwatch and jumped from 60 FPS to 160. Heck of a different experience.

I am having FPS issues in Overwatch. I just recently bought a new PC with an NVIDIA GeForce 2080 Turbo graphics card and an Intel Core i7-8700 CPU. When I run the game on max settings I get 150-ish FPS; when I run the game on all low settings, including render scale, I only get 170-ish FPS.

I am playing on a 144 Hz monitor and I have Overwatch set to high priority in Task Manager. I play in fullscreen mode at 1920x1080 (144). Any tips from anyone would be appreciated, thank you.

2 Likes

Wait, so if I were to exit the program and only keep it on when I want to adjust lighting settings, would that help? I didn’t realize it was ever a problem and have been playing for quite a while (a year or two) with it running. I’ve not really had crashes, but if there are issues it’s causing that I’m not aware of, I’d love to know what they are.

Intel(R) Core™ i5-7400 CPU @ 3.00GHz
Installed RAM 24.0 GB
64-bit operating system

Nvidia GeForce GTX 1060 3GB

https://www.google.com/search?q=xf270h&ie=&oe=

Acer xf270h

I cannot get 144 FPS. I dip down to 90 FPS in heavy team fights.

I’ve tried everything in this guide. Do I just not have a powerful enough computer?

Thank you.

My bet would be that your 1060 is the bottleneck.

4 Likes

How do I fix my low FPS? When I’m in custom games and the Arcade my FPS goes up to 200-300. When I’m playing real matches like Competitive and Quick Play it’s 80-90 FPS and it’s really laggy. During skirmish it’s also 200-300 FPS. Help! I use a GTX 1060 on all low settings, no shadows, no nothing, 50% render scale, etc…

Is Overwatch a GPU-heavy or a CPU-heavy game?
Will I get a lot better FPS if I upgrade from an i3-9100F to an i5-9600K?

What happens when a massive yellow bar appears? How can I troubleshoot this?

I have a problem with my game: I can’t play. When I press Play, the game window closes. Please fix this.

Nice guide but I think turning most settings to low is a bit overkill.

Overwatch has a smart renderscale that is applied only to the 3D scene; the HUD and menus are unaffected. This is very clever because scaling results in blurring that is much less noticeable on 3D scenes (which are expensive to render) than on vector graphics (text, menus, crosshair, health bars: things that are cheap to render even at higher resolution). This means it’s better to use the native resolution of the monitor with a lower renderscale than to lower the resolution.

E.g.: in the case of a 1440p monitor, it’s better to set the resolution to the native 2560x1440 and then use a renderscale of 75% to render the most expensive part (the 3D objects) at 1440p x 75% = 1080p. The scaling-sensitive, cheap-to-draw 2D vector graphics (menus, text, crosshair) will still be rendered at sharp native 1440p, without blurring/scaling and without much impact on performance.
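If anyone wants to sanity-check what a given renderscale does to the 3D resolution, here is a tiny sketch (Python, just my own arithmetic based on the resolution-multiplier behaviour described above; the function name and example values are only for illustration):

```python
# Effective 3D render resolution when the renderscale multiplies the resolution
# (the behaviour Overwatch appears to use); the HUD/menus stay at native resolution.
def effective_3d_resolution(width: int, height: int, renderscale: float) -> tuple[int, int]:
    return round(width * renderscale), round(height * renderscale)

print(effective_3d_resolution(2560, 1440, 0.75))  # (1920, 1080): the 3D scene is 1080p
print(effective_3d_resolution(1920, 1080, 0.75))  # (1440, 810)
```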

Even in the case of scaled 3D, some software scaling algorithms have better quality than the scaling built into the hardware.

In Overwatch the renderscale seems to be a multiplier on the resolution (e.g. 1440 * 75% = 1080p), while in some other games it’s a multiplier on the number of pixels. These aren’t the same scale, because doubling the resolution in both the X and Y directions results in 4 times as many pixels to render. I think scaling the resolution (the approach used by Overwatch) is more intuitive.
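To make the difference between the two conventions concrete (again just my own arithmetic, not anything taken from the games themselves):

```python
# The same "75%" means a different amount of work in the two conventions.
width, height, scale = 2560, 1440, 0.75
native_pixels = width * height

# Resolution multiplier (what Overwatch appears to use): both axes are scaled,
# so the rendered pixel count drops to scale**2 of native.
resolution_scaled_pixels = (width * scale) * (height * scale)

# Pixel-count multiplier (some other games): the scale applies to the pixel count directly.
pixel_scaled_pixels = native_pixels * scale

print(resolution_scaled_pixels / native_pixels)  # 0.5625: 75% resolution scale renders ~56% of the pixels
print(pixel_scaled_pixels / native_pixels)       # 0.75
```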

I’d also recommend using G-Sync or FreeSync if the monitor+GPU pair supports it. It adds minimal input lag but completely eliminates tearing. In scenarios where rendering a frame takes only a little bit longer than one hardware refresh cycle of the monitor, G-Sync/FreeSync can reduce the lag significantly compared to software v-sync (if someone uses software v-sync due to an allergy to tearing).

Last year I had only an i5-8600K with a GTX 1060, but I could play at a stable 144 Hz with 60-80% GPU load at 1080p on high settings with the fanciest features (reflections, high-quality shadows, very high AA) turned off. Turning down some other settings (texture/model quality) has much less effect on performance, so they aren’t worth lowering in my opinion.

I was actually using a 1440p monitor at its native 2560x1440 resolution with a 75% renderscale, which has approximately the same impact on GPU load as 1080p with 100% renderscale, but without scaling/blurring the text/menus/crosshair/HUD/etc.

I’d recommend aiming for a stable, fixed FPS (it helps with aiming) and then tweaking the video settings until this FPS can be maintained with an average GPU load of 60-80% while walking around and looking at a map outside of a huge battle. (That fluctuation comes from rendering a different number of objects between frames, and the amount of fluctuation might be different for games other than Overwatch because it depends on how the game and its content are optimised.) With 60-80% GPU load there is still headroom for extreme situations (huge fights with lots of effects) that cause GPU load peaks, while still maintaining a stable/fixed FPS. Besides this, not running the GPU at max has other benefits: lower GPU temperatures, less noisy GPU fans, etc…

If someone aims for a fixed FPS and less than 100% GPU load (per my recommendation), then the renderscale must be set to a fixed percentage instead of Auto.
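As a rough back-of-the-envelope for why a 60-80% average load leaves room for big fights (this assumes GPU load grows roughly in proportion to the per-frame rendering work, which is a simplification on my part, not something I measured in Overwatch):

```python
# How much heavier a frame can get before the GPU is maxed out and the fixed FPS drops,
# assuming load scales roughly linearly with the per-frame rendering work (a simplification).
def spike_headroom(avg_gpu_load: float) -> float:
    return 1.0 / avg_gpu_load

print(spike_headroom(0.70))  # ~1.43: a huge fight can be ~43% heavier per frame
print(spike_headroom(0.95))  # ~1.05: almost no headroom before the FPS becomes unstable
```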

I usually tweak video settings by starting up a custom game with 11 easy bots while playing D.Va. This way it’s easy to check GPU load in various situations, including a heavy fight with 11 heroes and lots of effects in front of you, because easy bots can rarely kill each other (or you, thanks to your high HP as D.Va).

You can also use the Practice Range for tweaking because it might be quicker to start up, but it has somewhat lower CPU and GPU requirements than a normal map.

Ignore the GPU usage of the hero selection screen at the beginning of a map, because it has a much higher GPU load than any other part of the game without affecting gameplay. That screen might not be optimised: it might draw lots of things in the background, perhaps with overdraw or redraw…

I can confirm that my hexa-core i5-8600K is utilised quite well. Overwatch seems to spread the work across cores quite evenly, and I’m never CPU-bound with a 144 Hz setup. (In contrast, some games like StarCraft 2 seem to put most of the load on 1-2 cores.)

I always played with “auto select” without any issues or frame drops, with a steady 143-144 FPS.

Last year I played OW with an i5-8600K and a GTX 1060.

The GTX 1060 can easily output 144 Hz with high settings (1080p, 100% renderscale, with fancy features like reflections, high-quality shadows and very high AA turned off) while using only 60-80% of the GPU.

Your i5 is an older generation with 4 cores and a 3.0 GHz base clock. My i5 is a 6-core with a 3.6 GHz base clock (and could be overclocked to close to 5 GHz because it’s an unlocked “K” variant). Your CPU is likely to be a limiting factor, but your memory clock speed (and potentially the lack of a dual-channel setup) can also be a serious issue.

There is often a linear correlation between FPS and CPU usage (given that a game can utilise all cores, like Overwatch does). Measure your CPU load at a lower but stable FPS (e.g. 60) that your system can output, and then you can estimate the CPU requirements for your desired FPS with this formula: CPU usage at low FPS * desired FPS / low FPS. However, CPU and GPU aren’t the only possible bottlenecks.
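As a quick sketch of that estimate (the numbers are hypothetical, and it only works if the linear scaling actually holds on your system):

```python
# Estimate CPU usage at a desired FPS from a measurement taken at a lower, stable FPS,
# assuming CPU usage scales roughly linearly with FPS.
def estimated_cpu_usage(cpu_usage_at_low_fps: float, low_fps: float, desired_fps: float) -> float:
    return cpu_usage_at_low_fps * desired_fps / low_fps

# Example: 35% CPU load measured at a capped 60 FPS suggests roughly 84% at 144 FPS,
# i.e. the CPU would already be close to its limit.
print(estimated_cpu_usage(0.35, 60, 144))  # ~0.84
```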

3 Likes

90 FPS is laggy? What the hell, lol. Poor guy, his eyes can’t keep up with 90 FPS! Don’t get a console, most are locked at 30 or 60 FPS, you’ll go blind, man! Your poor MLG eyes.

Any chance this guide could get updated? I can help if need be. There are just too many things in here that aren’t explained quite correctly, and some very essential performance tweaks that may not be common sense are missing. I tried to get in contact with you, Wyoming, but I had trouble figuring out any way to.

Things such as this

SIM rate is the simulation or the amount of time your game client took to process a tick.

is simply incorrect. SIM is solely your frametime, which is tied directly to your FPS. While the tickrate is indeed tied to framerate, that is only a concern under 64 FPS. All the SIM value tells you is your current frametime (how long it takes to draw a frame): 300 FPS being 3.3 ms, 250 FPS being 4 ms, etc. The goal of optimization is not to have the lowest frametime (though that is an auxiliary goal) but rather to make spikes and stutter in that frametime disappear. I have run entire QP benchmark matches at 4.0 ms frametime with no variance when I was fully optimized. Truly heaven.
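For reference, those frametime numbers are just 1000 divided by the FPS; here is a quick sketch to check them (my own arithmetic, not anything from the game client):

```python
# Frametime in milliseconds for a given FPS (what the SIM value shows, per the explanation above).
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (300, 250, 144, 64):
    print(fps, frametime_ms(fps))
# 300 -> ~3.33 ms, 250 -> 4.0 ms, 144 -> ~6.94 ms, 64 -> 15.625 ms
```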

There are a lot of people lately trying to shill tweaks for cash (more so in Fortnite, but OW is a victim of it as well), and I think having a good, succinct guide with all the basic tweaks that are proven at this point beyond a doubt (WITH SOME EXCEPTIONS, BUT VERY FEW), easily available on this forum, would probably curb some of that BS. A big update to this thread would draw attention to it too, and maybe help some people who still haven’t gotten the memo that HPET or GFE or Razer software might be causing their stutter more than it’s helping them.

However, I can understand the hesitation in recommending timer resolution changes or going deeper into RAM tuning. You didn’t even mention enabling XMP for your RAM. That’s really important, and a new user won’t know to do it by default. They might buy gaming-tier 4400 MHz RAM and then end up running it at 3000 or 2400 or 2133 or whatever it defaults to, and they’ve paid for nothing if they play like that without realizing it. I was a victim of this, so I know how amazing it was to go from default RAM settings to XMP at 3200 MHz (and then I overclocked further manually). It basically made my OW experience go from stuttery and laggy to smooth and clean overnight.

If you are willing to chat about this and maybe a revision of this guide for the current year, hit me up on Discord at Felicity#0001, or you can message me on Twitter at @felicityful, but I will respond much faster on Discord. I respect the effort and time you put into your work, but we (myself and other intent optimizers) have gone further and deeper and found exactly what does and doesn’t make OW run poorly. I am very happy to share this information, or at the very least talk over things that could be improved here.

Hopefully you’ll take me more seriously than the guy a few posts up who thinks being CPU-bound is bad ^^^

6 Likes

What a fascinating post!

I’m sure many people here would love to see you make a more up-to-date guide about optimizing specifically for Overwatch.

God knows, many can benefit from it.

Cheers!

So much misleading, naive information.

3 Likes

Wow, I didn’t read the whole thing, but I hope it contains a note to optimize your memory, because that would make it the holy grail of bad advice.

Here’s how to optimize Overwatch:

Get a good CPU.
Get a good video card.
Get a good network connection.
Play the game.

3 Likes

Is there any such thing as “frame of view” or was the original intent to write “field of view”?

Question: if I meet the max requirements of OW, will OW2’s be similar? Just asking.