Overwatch 2 is unoptimized for PC

If you practiced reading comprehension, you'd know that was on a 1650. Now, with upgraded components, I just run it at the vsync cap of 144 or 165. If I take the cap off, it's usually around 200fps. On the 1650, it didn't matter whether it was locked to 80 or not. It NEVER went below that, and if I took the cap off, it would run in the 120-150 range.

But keep grasping at straws…

Or you're just overreacting because you're embarrassed about being wrong and unable to cope with cognitive dissonance… Stop watching clickbait YouTube videos made by people that have no actual clue what they're talking about…

No, the highest I’ll use is 144/165 because there’s no point beyond that and you’re wasting your money. Buuuuutttttttt… Since you want to bring it up… Let me tell you a cautionary tale of idiots that waste money chasing ultra high FPS…

I've talked about this some, a long time ago, but I've actually performed multiple controlled A/B tests on gamers to see what the cutoff point was in their ability to perceive frame rates. We're talking even LEM-GE CS:GO players… Last I remember keeping count, it was a sample size of around 50. Spoiler alert: beyond a stable, locked 120 or so FPS, they could no longer discern any difference (50% accuracy, meaning they were coin-toss guessing).

It was a controlled test that didn't allow them to use any overlay statistics, monitor OSD stats, keyboard displays, etc. (people that did it remotely would stream a feed of their screens). Basically, no cheating. If they weren't in person, where I had a setup going, we'd first verify their monitor settings to make sure they could use higher refresh rates (at the time, 165 was about as high as people commonly had). The game was a basic template-style UE4 FPS that was verified to run at no less than a certain cap even on fairly weak hardware (it would run in the 200+ range even on i3s and 1050s). That frame rate had to hold stable while looking at a specific spot with some complex shaders and CPU math going on (a mini benchmark), to verify that even under worst-case frame drops it wouldn't go below the threshold and potentially contaminate the results.

The test used the most common FPS points like 30/60/90/120/144/165. The game would pick an FPS to lock to and then they’d be given 30 seconds to do whatever. Jump around, shoot stuff, blow stuff up, watch basic AI NPCs run around at varying speeds, etc.

After that, it would prompt them with two options of what FPS they thought they were getting. It would start off fairly easy, like 30 vs 120, but as time went on it would be 100 vs 144 and so on. Usually, right around the 100 to 120 mark, they would start answering correctly only 50% (+/- 5%) of the time. The whole test took about 30 minutes, so it was more than enough samples per person to obtain a fairly confident result.
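For what it's worth, the "coin-toss" cutoff is easy to formalize. Here's a minimal sketch (pure Python, hypothetical trial counts, not the actual test code) of how you'd check whether someone's correct-answer rate at a given FPS pairing is distinguishable from 50/50 guessing:

```python
from math import comb

def binomial_p_value(correct: int, trials: int, p: float = 0.5) -> float:
    """Two-sided exact binomial test: probability of a result at least this
    far from the expected count under pure guessing."""
    expected = trials * p
    observed_dev = abs(correct - expected)
    return sum(
        comb(trials, k) * p**k * (1 - p)**(trials - k)
        for k in range(trials + 1)
        if abs(k - expected) >= observed_dev
    )

if __name__ == "__main__":
    # Hypothetical: 30 rounds of "100 vs 144" for one person, 16 answered correctly.
    print(binomial_p_value(16, 30))  # ~0.86 -> indistinguishable from coin-toss guessing
    # Hypothetical: 30 rounds of "30 vs 120", 28 answered correctly.
    print(binomial_p_value(28, 30))  # ~1e-6 -> clearly not guessing
```

A p-value near 1 means the hit rate is exactly what guessing would produce; a tiny p-value means the person can genuinely tell the two frame rates apart.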

Moral of the story: Stop falling for placebo, you’re lying to yourself. Without an FPS monitor running, you’d likely have no clue what you were actually at beyond 100fps (assuming no hard stuttering where it chokes for multiple frames in a row). But I’m sure you can definitely taste the difference between 10,000 dollar and 100 dollar wine. Surely the one with the bigger number is better and you can tell the difference…


My, that's a lot of text to say absolutely nothing.

Also, you should look up cognitive dissonance.

Saying that the game is specifically made for console? The console port's optimization is even worse; at least as a PC player you can figure out a way to improve it. As a console player, I can't. We're talking next gen. I play on a PS5, and since the one-punch update, which was about 6 months ago, the game is unplayable, for competitive at least: framerate issues, audio stutter, double-input glitches, weird animations, aliasing, slow rendering, frequent de-syncs, and the screen shake that gives you motion sickness. I still play quickplay or mystery heroes, but the competitive integrity is long gone. If you want to compete above diamond, especially as a hitscan player, you're going to suffer. I don't want to compete in a game where the matchmaker is rigged and it's a coinflip over who has fewer bugs and performance issues than the other…

You want to watch GPU memory junction temp and hot spot temp; these should be fine up to 90C. (If you double-click them you'll get a graph… play Overwatch, and when it happens, look at the graph.)

(Not all GPUs have the same sensors so you might not have memory junction temp)

The GPU thermal limit is on the main temp, and you don't hit 83C on that.
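If you'd rather log it than eyeball the HWiNFO graph, here's a minimal sketch (Python, NVIDIA only, assuming nvidia-smi is on your PATH) that writes the core GPU temp to a CSV once a second while you play, so you can line the drops up with timestamps afterwards. Note that nvidia-smi only exposes the main GPU temp; hot spot and memory junction still need HWiNFO or similar:

```python
import subprocess
import time
from datetime import datetime

def gpu_core_temp_c() -> int:
    """Read the main GPU temperature (C) of the first GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    # Append timestamped samples to a CSV you can graph later.
    with open("gpu_temp_log.csv", "a") as log:
        while True:
            log.write(f"{datetime.now().isoformat()},{gpu_core_temp_c()}\n")
            log.flush()
            time.sleep(1)  # one sample per second while playing
```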


Tbh, I'm disappointed. Quoting something out of context is so meh. If I wanted to make a trolling argument, I'd go down the route of bias; while it's flawed, at least it has the impact of being somewhat true and logical.

Either way, they explained the testing and how it was done, and no doubt it was done in the training range under ideal conditions. The issue is, in the context of OW1 vs OW2, they have zero reason to claim OW2 has lower latency, since what they're trying to show is the difference between cards.

Anyway, this can be debunked easily: get a 60fps camera and measure the latency. It's not accurate, but it's enough to disprove their results… (you won't), or you can get the test setup they used.

On release, input latency for OW1 was around 70ms+.
Later it got as low as 22ms.
In OW2 it got as low as 8ms.
And the same cards all had improved latency (on the tests done).


This doesn't prove there aren't issues on lower-end systems like the 970, etc.; there could be cases where this might not happen… Personally, I don't really care; my point was to disprove this…

I’ll tell you what. Feel free to disprove me…

Just explain why fps and system latency are both improved on a lot of systems in OW2.

Or go on and smoke that "Anyone with decent reaction time who had played OW1 vs OW2 should be able to tell the difference in responsiveness between the two games" copium.

Sensors have a better reaction time than you and they directly disprove what you are saying <3


You brought up the "the eyes can only see X FPS" meme… I shut you down.

I'm quite aware of the definition of it… Here's a quick wiki excerpt on it:

In the field of psychology, cognitive dissonance is the perception of contradictory information and the mental toll of it. Relevant items of information include a person’s actions, feelings, ideas, beliefs, values, and things in the environment. Cognitive dissonance is typically experienced as psychological stress when persons participate in an action that goes against one or more of those things. According to this theory, when two actions or ideas are not psychologically consistent with each other, people do all in their power to change them until they become consistent. The discomfort is triggered by the person’s belief clashing with new information perceived, wherein the individual tries to find a way to resolve the contradiction to reduce their discomfort.

You're freaking out about what you subjectively claim/expect not matching up with the objectively observed reality (as objectively as reality can be observed, but this isn't a metaphysics debate). You're looking for any way possible to rationalize it: "The game is bad, input delay, lag, bad servers, etc. etc." rather than accepting the fact that the problem is probably on your end and you're just looking for a scapegoat to blame it on. You're trying to bend things around to avoid facing the hard truth that the fault likely lies in you. So simmer down and practice self-reflection, like any normal grown adult would do.

LMFAO wtf dude savage af! xD


If by "unoptimised" you meant sub-optimal, then I think you'll find it's not a platform-specific phenomenon.

It’s the best Blizzard can manage sadly. Broke.


I’m gonna give you the long response you don’t deserve and then I’m done with ya.

One: I’m not freaking out about anything. Two: The only one denying subjective and objective facts is you (and what’s his face). I was able to go from one game to the other and notice it easily. There’s no, “BuT muh data sez,” or, “But I did a shoddy control group where people couldn’t always tell what fps they’re at! This proves you wrong about game response time.” Yeah, Dauntless. Good one.

Again, you keep trying to push this narrative that I'm attempting to make up for some perceived shortcoming. I'm not; it is now free to climb in OW2, easier than it ever was in OW1 (minus 5-stacking late at night back in season 1 or 2). I adapted to Overwatch 2 as it is, which is a lesser state than OW1. Yes, you can run at higher fps, but what good does that do you when you're shaving off maybe 5 milliseconds while the OVERALL responsiveness of the game is worse? Answer: it doesn't matter at all. Netcode is worse as well. But I don't have hard data on that, so you're free to plug your ears and cover your eyes and pretend it isn't there, champ.

That’s just you this whole thread.

Excellent advice. You should take it.

Uh, what? How did you shut me down? What you did do is miss the point entirely. The meme is that there’s no point in having a higher refresh rate or higher fps because “the eyes can’t see past 30fps anyway” – which is idiotic. You took that to mean, “I should do a controlled study and see if people can really tell when they have high fps.” No one can. WHAT YOU CAN TELL IS WHEN A GAME IS MORE RESPONSIVE OR NOT. Like 60 to 120 hz refresh rate with matching fps. Big noticeable. 120 to 240 hz. ALSO noticeable. 240 to 360, less noticeable but still noticeable.

Again, you missed this point entirely. Why? Because you don't know what you're talking about. Case in point: you actually suggested having VSYNC ON and lowering your FPS a handful below your refresh rate as a fix for an old vsync bug. All that's gonna do is lead to a bunch of screen tearing (unless we're talking about having G-Sync enabled, but that's a whole other topic). Again, any novice who's built a PC could tell you that much, but somehow you missed it. Anyway, throughout this thread you drop these little crumbs that indicate you're kind of lost in the sauce, dude, just like that one.

Yes, but the beautiful irony is that you don’t see that this pertains to yourself. That’s why it’s so funny and I was content to just let you keep doing it to yourself.

My guy. How about you just give me some links to these references? Ya keep telling me about them, I even googled for them, and you know what? I found some Nvidia Reflex latency numbers. That’s about it. Not what we’re talking about here.

But yeah let me go rig together a whole test and capture setup to capture system responsiveness as it translates to in-game performance to show the forum guy how he’s wrong. At least I’d have included links if I did. Oh, also the game I would need to show you you’re wrong doesn’t exist anymore. Fantastic plan, champ.

I don't have everything I've read bookmarked, sorry. The Nvidia numbers are more than enough proof that system latency has improved in OW2, which was my point.

Or maybe just use a phone camera to confirm the system latency is what Nvidia is saying, for your own education. While a 60fps camera can only pin system latency down to about 33ms at worst, with enough testing you should be able to capture a click and a shot in the same frame to prove sub-16ms latency. Or don't. I'm not fussed.
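If anyone actually tries the phone-camera test, the math is just frame counting. Here's a minimal sketch (Python, hypothetical frame gaps) of how the gap between the camera frame showing the click and the frame showing the muzzle flash bounds the system latency:

```python
CAMERA_FPS = 60
FRAME_TIME_MS = 1000 / CAMERA_FPS  # ~16.7ms per camera frame

def latency_bounds_ms(frame_gap: int) -> tuple[float, float]:
    """(min, max) possible click-to-photon latency for a given camera frame gap.

    The click and the flash each land somewhere inside their frame, so the
    true latency is only known to within about one frame on either side.
    """
    low = max(0.0, (frame_gap - 1) * FRAME_TIME_MS)
    high = (frame_gap + 1) * FRAME_TIME_MS
    return low, high

if __name__ == "__main__":
    print(latency_bounds_ms(0))  # same frame -> under ~16.7ms
    print(latency_bounds_ms(1))  # one frame apart -> up to ~33ms (the worst-case above)
    print(latency_bounds_ms(4))  # four frames apart -> roughly 50-83ms
```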

Either way, I'll concede I wasted my time when I could have waited for you to disprove yourself…

My point was I said you were wrong for saying this…

later you said this…

So I guess we’re done here.


I will also highlight this cause it made me chuckle.

Just wanna say: if you have Vsync on and a lower fps than your refresh rate, you won't get screen tearing. Vsync double/triple buffers frames. I guess you're not…

Thanks for the answer. GPU memory junction temp does not exist on my GPU. However, the peak temperatures I mentioned are what I observed from the graphs when playing some unranked matches. The FPS drops are not as much of a problem anymore but still happen sometimes. It was, however, a big issue in earlier BIOS versions, so I'm pretty sure it's a CPU issue and not that the GPU runs hotter due to me upgrading to a faster CPU or something similar.

Yea, if you can't see the drops when the temps peak in HWiNFO, that is most likely the case (which is nice).

As far as the CPU goes, the 5600 can handle 400-500 fps (if you have the GPU), so it's not really going to be stressed by OW2. That said, if something in your system is messing with things, it can cause things to stall. The virtual memory / reinstall driver / reset Windows method does target that. You can try setting Overwatch to high performance too (I've not seen this work, but others have). Might be worth checking that Resizable BAR is enabled in the BIOS too (long shot).


Retail Overwatch.exe is set to high performance for the GPU, but I've not tested Resizable BAR. I'll check it out later if not now, especially if I have trouble with FPS in the game again. I updated the chipset drivers to a newer version some days ago, and for now the FPS has been great, if I remember right.


Ahh cool ^^. Well done.


Go try it, Rez. I’ll wait. I don’t mean tearing, I mean stuttering. The point is that in no world do you want to run vsync and cap your fps below your refresh rate. It’s trash. Also, that isn’t how buffering works.

If your only point was to say that the OW2 engine is better in some ways than OW1's engine, I guess that's valid. However, when we're talking about how the game plays in general, the engine is worse. That was MY point, which you know full well is what I was saying there.

As for your numbers, you haven't provided any that are useful. You're just throwing some out there and saying Nvidia told you so. Also, when you say system latency, what are you talking about? SIM? What metric are you using that shows the OW2 engine itself is lower latency than the OW1 engine? Just give me a link, man; surely you can easily find this readily available data.

Why would I try it? I was pointing out how you insulted yourself and thought it was funny. I really don't care; also, I didn't say how buffering works.

My point was that you're lying to make a point; the burden of proof was on me for saying the OW2 engine is better in some ways, which you now agree on.

As for the engine being worse or better:
Interpolation delay is about 2-6ms higher.
System latency is about 10-20ms lower.
Overall ping is about the same.
I'd say hit detection is as good as it's ever been.
On these things I'd say it's on par with OW1 (rough net math below).
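To put a rough number on the combined effect of those two deltas, a back-of-the-envelope sketch using only the ranges quoted above (nothing I've measured myself):

```python
# Combined change across the two delays quoted above, OW1 -> OW2.
interp_delay_change_ms = (2, 6)        # interpolation delay: higher in OW2
system_latency_change_ms = (-20, -10)  # system latency: lower in OW2

best = interp_delay_change_ms[0] + system_latency_change_ms[0]   # -18 ms
worst = interp_delay_change_ms[1] + system_latency_change_ms[1]  # -4 ms
print(f"Combined delay change: {best} to {worst} ms (negative = less total delay)")
```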

Then, when I consider that it's less CPU-intensive and better for fps (for me), I'd consider the engine better.

The thing is, some people won't like the OW2 engine because they're too used to OW1, and there are differences. I can see the argument that they never needed to change the engine now that PvE is dead, but then we're not really talking about how good an engine is anymore.

To me, your whole argument sounds like you're used to getting oranges and now you're getting apples, and you hate apples, so you're telling me apples are worse… I like apples, but the reality is they're both good.

You already said you found the Nvidia stuff. They have plenty of articles on how Reflex works and how they did their testing. As for system latency, you can google it, but it's basically input lag, or click-to-photon.

Now you say engine latency… You know, engine latency could be defined by how fast it renders one frame, and since it can hit 600fps, that's about 1.7ms. It's a dumb term, since an engine doesn't really have a latency that relates to user experience.

3 things that matter imo for user experience:
System latency
Interpolation delay
And netcode (shooter-first things)

If you wanted to fit engine latency into that context, it would fall under system latency.

While this is interesting, I'm just trying to point out that you seem very confused if you're asking about things like system latency and asking what engine latency is.

I'm not asking about any of those; I'm asking what metrics you're trying to use, because you never state it clearly. You just say lower system latency… and throw in some Nvidia numbers. Also, what I can find are some Reflex promos and benchmarks that show lower latency with Nvidia Reflex vs not using Reflex. In all cases, the latency there is more or less the same in OW1 as it is in OW2. Let me say it again, because you seem to have a hard time differentiating: nothing you've said is what I've found online, despite your claim it's there. You just keep ducking me when I ask you to provide a link. Which I figured would be the case anyway.

But that’s not the issue. Despite being able to get a higher framerate in SOME scenarios, the responsiveness of the game is worse than it was in OW1. You can say it’s my preference, but that’s not reality. Hit detection IS worse. It was also noticeable from OW1 to OW2 but at the time it was, “Oh it’s just a beta,” and then, “Oh it’s just launch issues,” and on and on.

Anyway, if your whole argument was a higher framerate, mission accomplished. Although the only reason OW2 has a higher FPS is because the cap in OW1 was set at 300 and you can go up to 600 in OW2. Blizzard could've set the cap to 600 in OW1 as well (though it wouldn't have been realistic at all at the time, and a monitor with a high enough refresh rate didn't even exist to take advantage of it). So I think you're getting hung up on the notion that technically higher FPS is possible in OW2, but that's not because of improved performance from the OW1 engine to the OW2 engine. It's solely because Blizzard changed the upper limit to 600.

That wasn't really what was meant by the OW1 engine performing better in game than OW2's. That may have been your lone sticking point, but it wasn't the conversation. So well done there, I guess. In almost every scenario, OW2 requires higher system specs than OW1 to get the same framerate. That's to be expected, but you seem lost about it.

As for confusion, I can’t tell if you’re actually confused and misreading these things or trolling. Either way, there’s not much else to talk about is there?

Your PC is old enough for a museum, that's why.

Overwatch runs great on my 5 year old potato PC

Actually NO.
I play at 176 FPS constant and 130 during fights.
My PC is fine and I play in high ranks.
But I'm sure your 5-year-old PC does great at your 60 fps and your gold rank.

Thnx for dropping in tho