360+ FPS 1440p OVERWATCH 2 - powered by GeForce RTX 4090

Sim is only a tiny fraction of total system latency. Sim measures how long the game takes to process a server tick and correlates directly with framerate; the minimum sim at the 400 fps cap is 2.5 ms.
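For the math, here's a minimal sketch of that relationship (assuming sim is floored by frame time, i.e. sim ≈ 1000/fps, which is where the 2.5 ms figure at the 400 fps cap comes from):

```python
# Minimal sketch: sim can't drop below the frame time at a given fps,
# assuming sim ~= 1000 / fps (matches the 2.5 ms floor at the 400 fps cap).
def min_sim_ms(fps: float) -> float:
    return 1000.0 / fps

print(min_sim_ms(400))  # 2.5
print(min_sim_ms(250))  # 4.0
```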

Total system latency starts the moment your mouse registers an input and will be closer to 50+ ms. It can only be measured on a monitor with the integrated Nvidia Reflex Analyzer, so very few players know their actual system latency. Reflex itself is an amazing feature, but it doesn’t impact sim; its benefits occur before tick processing, so sadly you can’t see its latency reductions without the analyzer.

At 2:00 in this vid, you can see KarQ testing OW1 on a 360 Hz monitor. His average system latency is around 40ms, though his sim will be closer to 4ms based on his framerates.

If Nvidia is truly delivering sub 10ms system latencies, it’s kind of mind-blowing. I’m skeptical tbh.

Been waiting for these 1440p 360hz monitors for months now. Hope this means they’re coming soon :slight_smile:

I feel like my 3080 will run the game just fine honestly lol.

That is a pretty serious upgrade…


Lots of 1080 Tis that people are hoping will last another 5 years, too

F Nvidia

I’ll keep my 3080 for now, the game runs fine at 144hz / fps.

This video is kinda scammy as hell.

360+ FPS is much more dependent on CPU and is possible right now without a 40 series card.

The latency numbers: sub-10 ms is amazing and, compared to …

…is a massive improvement… but this is more likely due to the upgraded Overwatch engine, because when you look at other games’ latency numbers the improvements aren’t that big…

Apex
3080: 19 ms
4080: 12 ms

Fortnite
3080: 19 ms
4080: 14 ms

Source: Nvidia Reflex numbers…

Honestly, the way I see this video… AMD and Intel make the CPUs that make these framerates possible, and Nvidia is taking the credit.
The OW2 engine has been improved and latency is reduced… Nvidia is taking all the credit.

Honestly, I wouldn’t be surprised if a 3080 or a 3070 could give similar numbers in the same setup, with a latency of 12-15 ms.

So, you really think they shrank it from 50 ms to 5-6 ms? I highly doubt it, although I think about 8-10 ms is possible under certain circumstances. Also, at 400 fps you reach about 2.47-2.53 ms; at 394-396 fps you get 2.6 ms, at 412 about 2.3, and at 423 about 1.8. I measured with Radeon Adrenalin on a second screen; the game didn’t “handle” the extra frames for whatever reason, but Adrenalin counted 423 frames while the game registered 1.8 ms on SIM, and 2.47-2.53 ms at 400 fps as well. I reached that on an RX 6900 XT LC (OEM) with fast-timing memory. For reference, this model is a slightly higher-clocked version along the lines of the RX 6950 XT, with the 18 Gbps memory and stuff like that.

Weirdly enough, Nvidia Reflex affects the SIM value on a GTX 1080. Maybe it doesn’t on newer cards, but there it does. When enabled, it makes the GTX 1080’s SIM values behave oddly, with huge spikes and huge drops, and it also cuts them roughly 30-40% from their original values for whatever reason.

His framerate on the 360 Hz monitor was like 260-280 fps. He would be decreasing visual latency on the monitor, not exactly on the PC itself. If you look at the HUD you will notice:
FPS: 273
Render present latency: 1.2 ms
Render latency: 0.8 ms
Mouse latency: 2.6 ms
PC + display latency: 22.9 ms
System latency: 25.5 ms

In-game SIM would be about 4.6 ms at 273 fps. Just over 400 fps it’s 2.3, a bit under 400 it’s about 2.5, and at 423 fps about 1.8 ms. So SIM = both renders (1.2 + 0.8) + CPU (2.6 ms). His measurement was: SIM + MOUSE + DISPLAY.

Also, when he toggled Reflex from off to on at 71 fps, he went from 21 ms render latency to 15 ms render latency, which would effectively affect the render latency and his SIM.

SIM is often roughly the PC latency in Reflex terms, which is part of the calculation. In KarQ’s case that would be 2.6 + 1.2 + 0.8 at 273 fps, meaning 4.6 ms of PC latency (CPU + GPU). Add the display’s ~18.3 ms on top, then the mouse’s ~2.6 ms, and you reach the final 25.5 ms value, or 22.9 ms without the mouse.
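Here’s that arithmetic as a quick script (the component labels are my reading of the HUD, not Nvidia’s official breakdown; the 18.3 ms display figure is inferred from 22.9 - 4.6):

```python
# Sanity check of my reading of KarQ's HUD numbers.
# Labels are my interpretation, not Nvidia's official breakdown.
render_present = 1.2  # ms, from the HUD
render = 0.8          # ms, from the HUD
cpu = 2.6             # ms, the value I'm treating as CPU time
mouse = 2.6           # ms, from the HUD
display = 18.3        # ms, inferred: 22.9 (PC + display) - 4.6 (PC)

pc = render_present + render + cpu  # ~= SIM
print(f"PC latency (SIM): {pc:.1f} ms")                    # 4.6
print(f"PC + display:     {pc + display:.1f} ms")          # 22.9
print(f"System latency:   {pc + display + mouse:.1f} ms")  # 25.5
```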

If their RTX 4090 recording reached 550 fps, that would mean about 1.2-1.4 ms-ish plus the mouse, which would put their mouse at about 3.1-3.3 ms, in line with most mice. They showed 5.5-9.6 ms, oscillating a ton during the gameplay. My guess is they simply picked SIM + mouse (2.6 ms), which could be near the measurement I had: I got 423 fps reaching 1.8 ms in game, while at 438 fps they got 7.2 ms and at 366 fps they got 9.6 ms.

Their video:
at 506 fps it shows 5.8 ms:
https://www.youtube.com/watch?v=lbcYFgQJOLM&t=8s
at 485 fps it shows 5.6 ms:
https://youtu.be/lbcYFgQJOLM?t=9

My guess is that the discrepancy comes from the mouse being wireless and the PC having more “delay” on the CPU side because of it.

5.5 - 1.6 ms (SIM, at roughly 505 fps) = 3.9 ms for the mouse; then the display at about 3.2 ms takes it back up to ~8.7 ms of total latency.
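Back-solving that in the same style (these are my readings and estimates, not published figures):

```python
# Back-solving the wireless-mouse case from my readings above.
reading = 5.5   # ms, the top-left overlay figure, read as SIM + mouse
sim = 1.6       # ms, my SIM estimate at roughly 505 fps
display = 3.2   # ms, my display estimate

mouse = reading - sim           # 3.9 ms inferred mouse latency
system = sim + mouse + display  # 8.7 ms total under my model
print(f"mouse: {mouse:.1f} ms, system: {system:.1f} ms")
```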

So the mouse would most likely be this one:
https://www.rtings.com/mouse/reviews/roccat/kone-pro-air
based on:
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/ada/news/play-competitive-games-with-rtx-40-series-and-reflex/nvidia-reflex-new-latency-analyzer-mice-september-2022.jpg

They are simply using “mouse + SIM” in the top left:
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/ada/news/rtx-40-series-graphics-cards-announcements/geforce-rtx-40-series-nvidia-reflex-sub-10ms-system-latency-latest.jpg

I found the mouse from the pic:
https://www.rtings.com/mouse/reviews/logitech/g-pro-x-superlight
3.1 ms (mouse)
At 400 fps: 8.6 ms - 3.1 = 5.5 ms, of which about 2.3 ms would be SIM (PC latency), leaving 3.2 ms for the display:
https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/news/reflex-low-latency-platform/nvidia-reflex-end-to-end-system-latency-terminology.png

Their claim of about 8.6 ms system latency can hold its ground if you consider SIM as the PC latency and add the mouse and monitor latency.

basically:
3.1 ms from the mouse
2.3 ms from the CPU + GPU
3.2 ms from the display

= 8.6 ms system latency claim

So they improved the CPU + GPU time with more fps, and improved the display side thanks to consistently delivering 360 Hz+.

My bet:
PC latency = SIM
System latency = Display + SIM + Mouse

which would reflect their claimed 8.6 ms of system latency. In game at 366 fps they got 9.6 ms: 9.6 - 3.1 = 6.5 ms (SIM would be about 3.3 ms), leaving the display at 3.2 again.
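Plugging my bet into a quick check (all three components are my estimates from the links above, not an official breakdown):

```python
# My guessed model: system latency = mouse + SIM (PC) + display.
# All component values are my estimates, not Nvidia's published numbers.
def system_latency(mouse_ms, sim_ms, display_ms):
    return mouse_ms + sim_ms + display_ms

# Nvidia's 8.6 ms claim at ~400 fps:
print(f"{system_latency(3.1, 2.3, 3.2):.1f} ms")  # 8.6

# Their 9.6 ms in-game reading at 366 fps fits the same model
# if SIM rises to ~3.3 ms while mouse and display stay put:
print(f"{system_latency(3.1, 3.3, 3.2):.1f} ms")  # 9.6
```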

The margin of error would most likely be about 0.2-0.3 ms, falling on SIM/display with a wired mouse, and on the CPU side with the wireless one due to the “receiver”.

That’s my guess on the situation


I think that’s what they’re claiming but I agree that it’s hard to swallow. This is no doubt a highly optimized machine that’s been tweaked specifically for minimum latency. The display itself is also probably faster than anything on the market now.

All your numbers look reasonable though, so their claims could be legit! I have to wonder if the real star of the show in their chain isn’t the GPU but rather the display itself. :thinking:

Wow that’s interesting about the 1080. I actually had one and never noticed this.

That’s awesome! Did you edit the config file to bypass the 400 FPS cap or are you using some Radeon feature (I know nothing about the current AMD cards :laughing:)? I’ve never gotten lower than 2.5 sim (rounded) because of the cap.

I’m in the process of tweaking my system now for latency reduction. I bought a 360 Hz monitor without reflex analyzer because it was on sale, and now I’m regretting it. :sob:


At 1080p, definitely. But 1440p in teamfights? I’m not so sure. That’s a lot of pixels.

I’d love to see this tested, but it would definitely have to be 1080p for comparable performance results.

It really looks like Nvidia is blatantly false-advertising this generation.

For example: let’s make a 4070-class card, change the 7 to an 8, and put the price up 400 dollars. Don’t advertise the shader count on the box, for maximum effect.
Also, don’t forget these are Founders Editions with a fixed MSRP, which will probably disappear in minutes. So add 150-200 for third-party versions.

I play with a 6700 XT at 1440p, capped at 240 just because I don’t like fan noise when gaming. That thing can easily go 300+.
(5800X, 32 GB.)

Also, DLSS 3 is only available on the 40 series, for 25 games? While FSR 2.0 works with every game and gaming card on the market.

Just wait for AMD’s presentation and hope they haven’t become a new Nvidia and will give us some price competition.

They claim the 4090 is anywhere from 2-4x the performance of a 3090 Ti (they need to back this up)… so the pricing would be sort of valid. The real market price will probably be beyond 2k due to mining, though.

I want to see how they work for rendering and other work stuff. I’m on a 1080Ti currently.

Ethereum is dead nowadays.

New-gen prices should just be new-gen prices, not based on the previous generation’s performance.

If a flagship card was always priced around X, the next version should be around that price (maybe a bit more expensive due to inflation, etc.), not the outrageous pricing Nvidia (and AMD last gen) are forcing consumers to accept.

I’m glad everyone will get to see that when OW2 launches on October 4th with their still-unreleased, $1,599 RTX 4090, for sure. The majority of us will absolutely have that gaming experience, definitely.

This is such a low bar, it’s quite sad.

I can get a packet from software running on one machine, through a switch, into another machine, and into the software in about 2 µs.

It takes 5000x longer to get mouse input to the screen??

I can get 400 fps occasionally at 1440p, on low graphics of course. :wink:

DLSS 3.0 interpolates between two frames to generate a synthetic frame, causing one frame of lag and some artifacts. I don’t think it’s worth it for a competitive game.
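Rough math on what holding one frame costs, assuming (my simplification) the penalty is exactly one source-frame interval:

```python
# Hypothetical cost of buffering one frame for interpolation,
# assuming the penalty is exactly one source-frame interval.
def interpolation_lag_ms(source_fps: float) -> float:
    return 1000.0 / source_fps

print(interpolation_lag_ms(120))  # ~8.3 ms added at a 120 fps base
print(interpolation_lag_ms(60))   # ~16.7 ms at a 60 fps base
```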

The prices for the 40 series are insulting.

Nvidia is planning to eventually exit the PC gaming business, and they plan to make as much money as they can on the way out. They see PC gaming moving to cloud-based services (which they happen to sell), and they want to focus on AI chips, since the world will transition to AI-in-everything before long. That’s a much better market for them, and PC gaming is now just an annoying distraction. I would not be surprised if the 5000 series is the final wave of consumer-oriented cards from Nvidia.

I got a desktop with a 3080 and a laptop with a 3070.

I couldn’t care less about the performance of the 40 series.