I managed to grab an RTX 5080… it’s seriously amazing what DLSS 4 can do.
I’m using a 4K 240 Hz OLED monitor, and with ray tracing on ultra and maxed settings in Cyberpunk the gameplay is buttery smooth, with FPS stable at around 240 in most cases. It’s usually between 210 and 270 FPS depending on the area of the game I’m in.
On WoW I wish I could say the same thing lol. In most cases my FPS is 100-200. Very rarely above 200…
Is DLSS 4 not possible in this game? I’m just curious, since it would be an amazing in-game feature to have and would boost FPS like crazy.
Anything is possible, but even with all the graphical improvements the game has received over the years, the engine is heavily CPU-bound rather than GPU-bound, so a GPU-side feature like DLSS wouldn’t gain you much.
Just a question: how much did you pay for the card, OP?
I bought it from the MSI store. It was like $1470 after tax… with free shipping.
People were lucky enough to buy it for like $999 plus tax on day 1, but the cost of the cards went up, of course. So the going price is now like $1,299-1,399 in most cases.
I know that’s expensive, but consider that the 4090 has been selling for years at $2,000+… and I’m getting better FPS than my brother, who has a 4090, in games with DLSS 4 like Cyberpunk. My 5080 runs better than his 4090, and at 4K.
That’s good. If you’d said any higher, I would’ve posted the Bender laughing gif lol.
IDK, I would be posting the gif at MSRP, let alone 50% over. That said, DLSS works by rendering at a lower resolution and then upscaling, but I guarantee that if you lowered your in-game resolution in WoW right now, you would see almost no performance gain, because the game is CPU-bound.
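To make that concrete, here’s a rough back-of-the-envelope model (Python, with completely made-up per-frame costs, not a benchmark) of why rendering fewer pixels doesn’t help when the CPU is the wall:

```python
# Back-of-the-envelope frame-time model (made-up numbers, not a benchmark).
# A frame can't finish until BOTH the CPU and GPU work for it is done, so the
# frame time is roughly max(cpu_ms, gpu_ms), whichever is the bottleneck.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when frame time is set by the slower of CPU and GPU."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0  # hypothetical CPU cost per frame in a busy WoW zone

for scale, label in [(1.0, "native 4K"), (0.5, "lower render resolution")]:
    gpu_ms = 5.0 * scale  # hypothetical GPU cost; scales with pixel count
    print(f"{label}: ~{fps(cpu_ms, gpu_ms):.0f} FPS")

# Both cases print ~125 FPS: the CPU is the wall, so rendering fewer pixels
# (which is all DLSS upscaling saves you) doesn't move the needle.
```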
I have a 1080 Ti and am saving for a 4090 atm. I’ll wait till the end of the year and see if I can get a 50 series. Rebuilding for the new Battlefield game lol.
I went from a 1080 Ti that I bought used for $500 to a 3080 for $700 when EVGA finally got them in stock.
For years I had a 1050 in the first computer I built, then 4-5 years ago I got the 1080 Ti lol.
And the 10 series is losing driver updates down the line; I don’t want to take that chance lol.
Just keep in mind that all forms of DLSS frame generation add latency. While they’ve taken steps to reduce that added latency, it still adds a relevant amount compared to not using frame generation. I’m not sure that’s actually worth it in any situation where you’re already getting 100+ FPS, especially as the developers have doubled down on making the primary challenge of high-end PvE content a reaction-time battle against one-shot mechanics.
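For a rough sense of the numbers, here’s a simplified model of interpolation-style frame generation (this is not NVIDIA’s actual pipeline; the “one held frame” cost is my own simplification, and Reflex and frame pacing complicate the real picture):

```python
# Simplified model: to display a generated frame BETWEEN real frames N and
# N+1, the GPU must already have N+1 finished, so real frame N is held back
# roughly one base frame interval before it reaches the screen.

def with_frame_gen(base_fps: float, gen_factor: int):
    base_ms = 1000.0 / base_fps            # time to render one real frame
    displayed_fps = base_fps * gen_factor  # what the FPS counter shows
    added_latency_ms = base_ms             # frame N held until N+1 exists
    return displayed_fps, added_latency_ms

for base in (100, 60, 30):
    shown, extra = with_frame_gen(base, 2)
    print(f"{base:>3} FPS base -> {shown:.0f} FPS shown, ~{extra:.1f} ms added latency")
```

The takeaway: the lower your real framerate, the more latency frame generation layers on top, which is exactly backwards from the situations where you’d most want the help.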
And good luck with that OLED. I actually got part of my WoW UI burned into my older VA panel, a monitor tech that isn’t even really supposed to be susceptible to burn-in, unlike OLED. I knew then that I could never go OLED. It’s also funny every time I upgrade someone’s computer with an OLED monitor from Windows 10 to Windows 11 and then point out how you can still faintly see the old Windows 10 icon in the bottom left. “Now you know what burn-in looks like.”
I initially had a GTX 570 that died, and when I RMA’d it to EVGA they sent me back a 670. Then when the 700 series came out, I bought a used 670 and ran them in SLI until getting the 1080 Ti. IDK, anything over $800 for a GPU feels like a scam unless you’re gaming in 4K.
That was a clickbait rumor based on changes in CUDA developer support. Those changes make sense because non-RTX cards have different capabilities as far as CUDA goes, but it actually had nothing to do with driver support. That connection was pure theory. Also, cards still tend to get multiple years of “security driver updates” after they are pushed off the main driver. Cards like the 1060 are still among the most popular. I wouldn’t be too worried in that regard.
Both the 4090 and 5090 run Cyberpunk with RT at about 30 FPS @ 4K without DLSS… and framegen doesn’t eliminate latency.
Not sure about your claims. Definitely not sure about “buttery smooth” when your underlying latency is that of 30 FPS.
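Quick math, assuming input latency tracks the internal render rate rather than the frame-gen output rate:

```python
# Frames per second -> milliseconds per frame. The latency floor follows the
# REAL render rate, not what the frame-gen-inflated FPS counter shows.
for fps in (240, 80, 30):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")
# 240 FPS -> 4.2 ms, 80 FPS -> 12.5 ms, 30 FPS -> 33.3 ms
```

A 240 on the FPS counter sitting on top of a ~30-80 FPS internal render still responds like the slower number.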