Yea they are.
Huge surplus of inventory means amazing sales and discounts soon. They gonna huuuurt!
I keep it capped at 165 but my client in OW2 was going 240+ in the beta with a 1660 Super… so much overkill honestly
Fun fact: it actually was on the GeForce Now beta, and got removed by Blizzard halfway through it
Yeah I was devastated when they removed it
This stuff always cracks me up. Feels like the people who buy these cards for actual new, graphics-intensive games are super niche. Most people buy these cards to get max fps in 10+ year old games.
Worked with a guy who bought a 3090 and only plays WoW + TF2.
It actually helped me a lot when I had a potato PC
Got a proper PC 1 month before they removed it from the service
Dude, I'm literally in the same boat. Got a 1070 Ti back in 2018 and play OW at 175Hz on my AW3423DW at 1440p. I'll probably get a 3080 Ti or 3090 Ti sometime towards the end of next year.
I have a 3080 Ti with a Ryzen 6-core and it runs Overwatch at 4K 120FPS no problem. If OW2 is more demanding I'd likely just run it at 1440p, which looks just fine scaled down on my 4K screen.
I have a feeling it's not going to be an issue though.
The 4000 series cards run hot and power hungry… I don't need a space heater.
I expected more, tbh. Also, this ms latency is a bit high.
Just confirms both that OW2 isn't optimized and that the RTX 4090 wasn't really meant for gaming in general. Their Omniverse and AI work does look great, though.
4x ray tracing performance and 2x more rasterization gives like 2-3 fps in Cyberpunk without DLSS on, compared to the RTX 3090 Ti. DLSS 3.0 is an improvement, but I don't see much raw performance in the card itself aside from the DLSS gimmick.
Feels like they put in a ton of stuff but something isn't in sync, or their latency increased a ton because of it. Either way, it doesn't appear that appealing for gaming. For work and productivity it feels great though.
I'm using a 27" MG278Q and honestly the 1070 Ti holds up pretty well… but I'm looking at things like Star Citizen, The Isle, Red Dead 2 on max in 2K and streaming… but I'm gonna need a new CPU too. Everything else in my PC is good to go.
I play in 1440p and was planning on getting a 4090, so this is a feelsgoodman.
I also have a 3080 but I'm tempted by a 4080… I'll just wait to see the prices.
I had the chance to pay only 900 dollars for my 3080, but I know it's gonna be pricier to get the 4080 in 2022…
At least US citizens get a good price. I recently ordered the latest iPhone and it's 400 dollars more expensive in my country than in the USA…
3080s in Canada are $1000 now.
Regarding iPhones, my company gave me an 11 for my 10-year anniversary, so there's a buck saved <.<
Once you hit 144fps, anything extra doesn't make a significant enough difference to upgrade unless you have superhuman vision or something
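The frame-time arithmetic backs that up (just a back-of-the-envelope sketch in Python, nothing measured, and the fps steps are arbitrary examples):

```python
# Rough frame-time arithmetic: the absolute time saved per frame shrinks fast
# as the refresh rate climbs, which is why jumps past ~144 fps feel small.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 144, 240, 360):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")

# 60 -> 144 fps saves ~9.7 ms per frame; 144 -> 240 saves only ~2.8 ms.
```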
Wasn't Overwatch always easy to run? I mean, I already get 300 FPS in OW with a 2016 GPU.
It's like bragging about your CS:GO framerate.
I forgot to say this in the previous post, but:
We had AMD FidelityFX™ Super Resolution (FSR) 1.0 in the OW2 beta, so I don't see them implementing DLSS anytime soon. Maybe later down the road, but I wouldn't expect anything in less than 3 months.
Although maybe they'll use FSR 2.0 at launch; DLSS is still too "early" and there's no confirmation about compatibility with the 20 and 30 series either.
Hmm, I don't think I'll get a 4000 series graphics card since my 2080 Ti is fine. Maybe around the 6000 series lol… or whatever the future cards will be called. Maybe by then 4K will be the norm and 8K the new goalpost.
Still, they have to find a way for graphics cards to get more powerful without consuming so much energy. That new card looks like a monster and it guzzles up so much energy. Wonder how big they can get…
LET'S Fn GOOOOOOOOOO
According to the Nvidia site, this isn't just render latency; it's total system latency.
Which is still huge compared to alternatives at half the FPS, even if it is "total system latency". The game already has its own SIM metric, which accounts for most of it. The Radeon GPU measurement in AMD's own software at 400fps is roughly 1.8-2.5ms, which oddly matches the first two SIM metrics, sitting roughly 0.3-0.5ms below the third.
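As a sanity check on those numbers (my own arithmetic, not anything published by AMD or Nvidia):

```python
# At a steady 400 fps the GPU gets at most 1000/400 = 2.5 ms per frame, so a
# 1.8-2.5 ms GPU-side reading is basically "one frame or less" of render delay.
fps = 400
frame_time_ms = 1000.0 / fps
print(f"frame time at {fps} fps: {frame_time_ms:.2f} ms")

# Total system latency stacks several stages on top of that: input polling,
# game simulation, render queue, GPU render time, and display scanout.
```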
Nvidia Reflex already drops to reasonable SIM values in OW, even if it's unstable as heck. Any latency numbers they show are expected to be compared against those SIM values.
The SIM values in Overwatch are strongly linked to framerate, but they also include the GPU delay in the sum; SIM usually shows three values accounting for those delays. You can have higher fps on Nvidia cards and still see higher SIM values than on AMD counterparts if you keep Reflex off. With Reflex on you get better lows, but the average is often a bit worse than AMD's, which shows up as more erratic/jittery behavior because of the variance itself. Nvidia's API has way more overhead than AMD's due to the nature of its architecture; that's one of the things that made them slowly adopt DX12 and shift their software to work slightly better with it, but in latency-sensitive scenarios they still often have a bit more latency overall, even if their lows dip below AMD's.
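To make the "better lows but slightly worse, jumpier average" point concrete, here is a toy comparison with invented latency samples (illustrative only, not measurements from either vendor):

```python
# Two latency traces with similar averages can feel very different if one has
# more variance (jitter). These values are made up purely to show the idea.
from statistics import mean, pstdev

steady  = [2.2, 2.3, 2.2, 2.4, 2.3, 2.2]  # ms: low variance, consistent feel
jittery = [1.6, 3.1, 1.7, 3.0, 1.8, 2.9]  # ms: better lows, but more spread

for name, trace in (("steady", steady), ("jittery", jittery)):
    print(f"{name:>7}: avg {mean(trace):.2f} ms, "
          f"min {min(trace):.1f} ms, stdev {pstdev(trace):.2f} ms")
```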
Also, considering most of their RTX 4090 videos: without DLSS 3.0 the GPU performs roughly the same as the RTX 3090 Ti. That suggests the "doubling" of certain units may also have added input lag/delay in their own pipeline, which could explain why DLSS 3.0 lines up with the bigger OFA and Tensor core area allocated on these GPUs, making DLSS 3.0 a bit of a "must have" to reach those appealing framerates. Maybe that's why the performance outside of DLSS 3.0 wasn't as impressive as they "tried to show". I mean, 8-15 fps more than the RTX 3090 Ti in Flight Simulator without DLSS enabled isn't that "huge" a deal considering they "doubled" most of their processing units. Maybe the drivers aren't that optimized yet, or maybe they designed the GPU to work with DLSS in most scenarios.
It also aligns with their goals for Omniverse, RTX Remix, and the popularization of ray tracing features, where they have some "upper hand", while also motivating devs to work towards DLSS 3.0 and neural AI.
Their comparison values are equivalent to an RX 6900 XT LC (OEM) at 8K-16K max settings, 180-200fps, which would be roughly a bit better than a stock RX 6950 XT due to its clocks and memory bandwidth. Also, the GTX 1080 has better input lag at 1080p (not at those settings, though), at roughly 350-400fps.
At similar framerates, the RX 6900 XT (LC OEM) at 1440p gets roughly 0.2-0.3ms more on the lows and about 0.8-1.2ms fewer spikes, while the GTX 1080 has its SIM all over the place, both at 400fps for example.
If your framerate isn't steady, that reflects in a poor experience, which their video clearly had. Their neural AI and DLSS 3.0 are impressive, but their promised rasterization and ray tracing performance isn't as impressive as they're quoting. Paired with DLSS 3.0 it sure is great, but aside from that, not so much. At least not right now, with their videos and settings.
They doubled and quadrupled several things on this GPU and also increased its power consumption by roughly 30%, while at the same time moving from 8nm to 4N. So most of their density "gains" in fact went into higher consumption from the increased OFA and Tensor core counts, which will play a major role in Omniverse, RTX Remix, and DLSS 3.0.
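A rough perf-per-watt way to read that (the numbers below are assumptions made up to show the arithmetic, not actual 3090 Ti / 4090 measurements):

```python
# Illustrative only: if raw throughput roughly doubles while board power rises
# ~30%, perf-per-watt improves ~1.5x, not 2x. All values here are assumed.
old_perf, old_power_w = 1.0, 450.0   # normalized perf and power of the old card
new_perf, new_power_w = 2.0, 585.0   # "2x" the units, +30% power draw

gain = (new_perf / new_power_w) / (old_perf / old_power_w)
print(f"perf/W gain: {gain:.2f}x")   # ~1.54x
```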
Their ray tracing and rasterization performance wasn't impressive in any of their sources, at least not yet. The OW2 video reached good fps values, but those values aren't that hard to achieve on the current gen: in the beta with an RX 6900 XT (LC OEM) I got roughly 320-350fps at 1440p, max settings, 150% render scale (the video didn't say what render scale they used), while in OW1 I got roughly 235-285fps at 200% render scale at 1440p, plus 180-200fps at 4K, 140-160fps at 8K, and 100-120fps at 16K. Sadly I didn't test OW2 on the GTX 1080, but I'd expect slightly better performance than OW1.
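For context on why the render scale matters so much when comparing these fps figures (assuming the slider scales each axis, which is how I understand Overwatch applies it; the resolutions below follow from that assumption):

```python
# Render scale applied per axis means the effective pixel count grows with the
# square of the scale, so "1440p at 150%" is much heavier than native 1440p.
def effective_pixels(width: int, height: int, render_scale: float) -> int:
    return int(width * render_scale) * int(height * render_scale)

base_w, base_h = 2560, 1440
for scale in (1.0, 1.5, 2.0):
    mpix = effective_pixels(base_w, base_h, scale) / 1e6
    print(f"1440p @ {int(scale * 100)}% render scale -> {mpix:.1f} MPix")

# 150% of 1440p lands on the same pixel count as native 4K (~8.3 MPix),
# and 200% is roughly 5K (~14.7 MPix).
```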
DLSS 3.0 is impressive for the reduced strain on the CPU, but it also increases latency by using something similar to triple buffering. How much it will increase we'll only know when it's released, and only if the game supports it.
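A rough model of why interpolation-style frame generation adds latency (a generic sketch of the idea with an assumed render rate, not Nvidia's actual DLSS 3 pipeline):

```python
# Sketch: if generated frames are interpolated between two real frames, the
# newest real frame has to be held back until the in-between frame is shown,
# so it reaches the screen roughly one rendered-frame-time later than it
# would without frame generation. Generic model, not DLSS 3 internals.
rendered_fps = 120                        # assumed native render rate
displayed_fps = rendered_fps * 2          # one generated frame per real frame
hold_back_ms = 1000.0 / rendered_fps      # ~8.3 ms of extra queueing

print(f"displayed: ~{displayed_fps} fps, extra latency: ~{hold_back_ms:.1f} ms")
```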
Cool tech for several things, not that impressive for gaming, at least not right now. I sincerely hope I'm wrong and they're simply holding their cards close, but so far their 2-4x marketing is misleading at best, much like Apple's.
They make good stuff, but their marketing is often misleading. I expected improvements in ray tracing tech and rasterization performance, but so far I'm more impressed by their neural AI advancements, which are pretty cool, even more so with DLSS 3.0 speeding up a ton of it. Aside from that, the card itself doesn't look that impressive.
Later, check Flight Simulator performance without DLSS and compare it with the RTX 3090 Ti; you'll be surprised how little the performance improved. Same goes for Cyberpunk 2077. Overwatch 2 showed about a 100-150 fps improvement in the 300-400 fps range, but with both OW2 and the RTX 4090 unavailable for further testing, it doesn't sound that impressive at all.
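Put in relative terms, that OW2 uplift looks like this (just the arithmetic on the ranges quoted above):

```python
# +100 to +150 fps on a 300-400 fps base works out to roughly 25-50%,
# which is well short of the 2-4x marketing numbers.
gains = [extra / base for extra in (100, 150) for base in (300, 400)]
print(f"uplift: {min(gains):.0%} to {max(gains):.0%}")   # 25% to 50%
```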
It's an improvement over the 30 series; how much of it will benefit games, only time will tell. So far, not that much. Nvidia makes good stuff, but they also "oversell" their products, similarly to Apple in that regard.
Maybe I've grown skeptical of Nvidia since the RTX 20 and 30 series, but I'm not impressed for gaming, although for dev/modding/Omniverse it'll be amazing for sure.