Actually I think it’s very impressive that they started out with a 32-bit game client running on DirectX 7, and that’s now evolved into a 64-bit game client running DirectX 12 with Ray Tracing, along with Reflex+Boost, etc. They easily could have just stuck with DirectX 9 or 10/11, but they chose to improve the game.
Diablo 4 isn’t exactly cutting-edge in terms of graphics or game engine design, yet they somehow managed to shoehorn DLSS in there. I’m still hopeful that DLSS will work its way into WoW eventually.
This right here is why I’m rolling around on the floor laughing at all the Facebook posts of people claiming the 5070 will have comparable performance to the 4090.
An upscaled image along with AI fake frame generation is in no way comparable to the native 4K performance the 4090 delivers.
Don’t you know Facebook doesn’t fact-check anymore? I’ll hold off judgement on the RTX 5070 until it’s independently tested, but I do share your trepidation about its performance.
Oh it 100% isn’t delivering 4090 levels of performance. There’s no need to wait - you just need to take note of the “fine print” (although in this case it was spoken).
Using neural rendering and DLSS 4 we can reach performance levels that were only possible with an RTX 4090.
In games like Cyberpunk 2077 or Alan Wake 2, you can match performance numbers using DLSS 4 compared to an RTX 4090 using only DLSS 3.5 Frame Generation.
So when three of every four frames are artificial, instead of only two of every four, the two can match. Per their own charts, depending on the title, this is about a 70% boost on its own, which puts the 5070’s raw performance at around 60% that of a 4090. This is also without any indication of which level of DLSS is in use for the comparison, as several of the frames shown during the launch were noted as using Performance mode (half resolution) instead of Quality (2/3 resolution).
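That back-of-the-envelope math can be sketched out like this (the ~1.7x MFG uplift is from Nvidia’s own charts; the frame multipliers and the assumption that displayed FPS match exactly are simplifications for illustration):

```python
# Rough sketch: if a 5070 with 4x MFG (1 rendered + 3 generated frames)
# matches the displayed FPS of a 4090 with 2x frame gen (1 rendered + 1
# generated), what does that imply about raw (rendered) performance?

# Nvidia's charts put Multi Frame Generation at roughly +70% over
# regular 2x frame generation on its own.
MFG_UPLIFT_OVER_FG = 1.7

# Matching displayed FPS means: raw_5070 * 2 * 1.7 == raw_4090 * 2,
# so the raw ratio is just the inverse of the MFG uplift.
raw_5070_vs_4090 = 1 / MFG_UPLIFT_OVER_FG

print(f"5070 raw performance ~= {raw_5070_vs_4090:.0%} of a 4090")  # ~59%
```

Which lines up with the "around 60% of a 4090" figure above.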
Honestly, reading the fine print makes the whole series look disappointing, particularly the 5090. They only reach double performance on any of them when using the new frame gen (+70% on its own), and even their Flux comparison requires using FP4 on newer cards against FP8 on older ones. When comparing the two like-for-like results shown, the difference between them is sub-40% - and those are the hand-picked results which are supposed to show them in the best light.
I disagree, and the reason is a comparison to other systems and how they have been updated. In general, WoW has always been behind the times on optimization after its very first year. The fact that Diablo 4, a brand-new game, is described by you as “isn’t exactly cutting-edge” shows what I mean. They skimp on the system they build everything on and it shows.
I’m not really convinced that DLSS will help WoW that much.
DLSS is great when the bottleneck is waiting for the GPU to render, but not so great when the bottleneck is on the CPU side, and WoW is very often CPU-bound.
I fully agree that the CPU is the most important component for WoW, but there are still situations where extra FPS via DLSS could be put to use.
Resolution and refresh rate are huge variables. 4K 240Hz monitors are becoming more common and almost affordable. The amount of GPU power needed to achieve 240Hz at 4K would be vastly greater than what would be needed to achieve something like 144Hz at 1440P. Even WoW would become GPU limited at that point. I know that if I shelled out the cash for a 240Hz monitor, I’d want to actually achieve 200+ fps as often as possible.
My current monitor has a relatively high resolution of 3840x1600 at 144Hz, and I’m currently using an RTX 4080. I recently upgraded from a 5800X3D to a 9800X3D. With my 5800X3D I was seeing 60-65% GPU utilization in many cases, indicating that I was clearly still CPU limited. Now with my 9800X3D I’m seeing GPU usage peak above 90% more often than I’d like. That’s getting uncomfortably close to being GPU limited. And that’s with Ray Tracing disabled.
But I think that the biggest benefit would perhaps come from lower-end cards, like the RTX 4060 or future RTX 5060, where DLSS could make a big difference even at lower resolutions and refresh rates. The RTX 4060 has very similar performance to my old RTX 2080, and I was absolutely GPU limited with my RTX 2080, even back in Shadowlands.
True, but if you crank up the settings, even at 1080p, you’re going to be CPU-bound before 240 FPS. It doesn’t matter if the 5090 can render the 4K frame in 4 ms; if the CPU can only tell it to render a frame every 10 ms, you’d still be CPU-bound at 100 FPS.
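To put numbers on that, here’s a toy sketch using the frame times from the example above (the 4 ms / 10 ms figures are just illustrative):

```python
# Whichever side takes longer per frame sets the framerate: the GPU can't
# render faster than the CPU can feed it draw calls, and vice versa.
def fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """FPS is limited by the slower of the CPU and GPU frame times."""
    return 1000 / max(cpu_frame_ms, gpu_frame_ms)

print(fps(cpu_frame_ms=10, gpu_frame_ms=4))  # 100.0 -> CPU-bound at 100 FPS
print(fps(cpu_frame_ms=3, gpu_frame_ms=4))   # 250.0 -> GPU-bound at 250 FPS
```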
But the thing with the 4060s of the world is that they are likely not paired with a 9800X3D, 7800X3D, or similar. They’re more likely to be paired with a 7600, 12600, Core Ultra 245, or a similar lower-price-class CPU. So the gains from DLSS on the pure upscaling front will (I suspect) mostly come into play with laptops or other iGPU setups.
With DLSS Multi Frame Generation you get up to 3 additional “free” frames for every traditionally rendered frame. The CPU isn’t doing any extra work for those additional free frames, they are generated completely by the GPU.
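In other words, even a CPU-limited framerate gets multiplied on screen (a sketch assuming a clean 4x multiplier, per Nvidia’s claim; real-world scaling will be lower):

```python
# If the CPU caps traditionally rendered frames at some rate, MFG still
# multiplies what's shown on screen, since the generated frames come
# entirely from the GPU with no extra CPU work.
cpu_limited_fps = 100     # rendered frames/sec the CPU can feed (example value)
frames_per_rendered = 4   # 1 rendered + 3 generated with DLSS 4 MFG

displayed_fps = cpu_limited_fps * frames_per_rendered
print(displayed_fps)  # 400
```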
When Frame Generation was first introduced with the RTX 4000 series, the main downside was increased latency. But Nvidia has now created “Reflex 2” specifically to counter this extra latency. While this detail often gets lost with the overall DLSS hype, I think it’s huge in terms of making Frame Generation actually useful and not just a novelty/gimmick.
Well, as mentioned above, there is definitely potential to increase performance even in CPU-limited situations by using frame generation.
It’s also worth remembering that the fastest gaming CPU for WoW (9800X3D) only costs $500, whereas the fastest GPU, the RTX 5090, is going to cost $2000+. There will be plenty of people opting for the 9800X3D and pairing it with something like the RTX 5070 for budget reasons. Nvidia actually claims that the 5070 can put out more FPS than the 4090 when using DLSS Multi Frame Generation, but it will be interesting to see if that’s supported by 3rd party reviews.
Yes, there is frame-smoothing tech, but these new AI frames aren’t based on new user input, and (assuming it still follows the same base timeline as DLSS 3) it has to render two frames and then interpolate the three between them, so your computer is essentially showing you one rendered frame behind where it otherwise would be, resulting in increased input latency versus no frame gen.
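Roughly, holding back one rendered frame for interpolation adds about one rendered-frame time of latency (a simplification that ignores other pipeline delays; 60 FPS here is just an example):

```python
# Interpolation-style frame gen can't show a rendered frame until the
# next one exists, so the added input latency is roughly one
# rendered-frame time on top of the normal pipeline.
rendered_fps = 60
added_latency_ms = 1000 / rendered_fps

print(f"~{added_latency_ms:.1f} ms extra latency at {rendered_fps} rendered FPS")
```

So the lower your rendered framerate, the worse the latency penalty, which is exactly the situation where people reach for frame gen in the first place.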
It’s not going to be when compared like for like. Like for like, I’d be impressed if it’s noticeably better than the 4070 Ti.
Also, where is the demand for it from the public? The fact is this “hardware” forum is basically four or five people, even fewer when Sal’s let his sub run out. Retail WoW is not a difficult game to get playable performance in. Even the RTX 3060 will run it at playable framerates on the highest settings (no RT) at 1440p. Yes, to push FPS you will need better hardware, but the game scales extremely well, and these forums prove that by the lack of hardware performance issues compared to even ten years ago.
I would assume the majority of people play with hardware similar to the AMD 3600 CPU and Nvidia 3060 GPU. WoW has a good amount of people still on laptops, and I knew of several people who were still playing on the Intel 7700K CPU and were looking at GPU upgrades instead of CPU upgrades. A lot of people simply ride with the CPU they bought in their pre-built system.
100%
Even Nvidia’s recently released marketing slides showed the 5070 nowhere near the 4090 in native rasterization, although that was only for one or two games. We will have to wait for professional 3rd party reviews to see how the card(s) really perform.
Anyone have thoughts on how the 5070 will compare in WoW to cards like the 4070 or 7800 XT? I’m assuming the DLSS hype isn’t even going to be relevant to our game at all right, so it’ll just come down to raw hardware?
Per NVidia’s official stance on the matter, the new MFG stuff doesn’t pace between two actually rendered frames - it’s worse than that. From what they’re saying (and I guess we’ll find out when the embargo is lifted as to whether this is true or not) it generates the additional frames (3 for now, though up to 16 sounds like the goal) completely open-ended, probably based on what’s been happening previously. This negates the additional input latency that regular FG would incur.
Given this requires software-level implementation, rather than driver-level, chances are it’s going to have at least some kind of feedback from the game. As such it should be able to respond to input during the artificial frames, and it may even be able to respond to actual game events that occur in that time. But they [NVidia] have already demonstrated that their FG handles large amounts of change poorly even when doing interpolation, so I can only imagine this is going to make for a smeared mess that ultimately makes the experience worse the more that’s happening - which is exactly when you want the higher framerate to begin with.
Yeah, the frame gen tech is very hit and miss. I don’t really notice it in Veilguard or Cyberpunk, but it’s very noticeable and distracting in Hogwarts Legacy and Jedi Survivor (though, we’ll see if the new transformer improves that vs the CNN it’s been using).
Most people assume the RTX 5070 will offer similar native rasterization to the RTX 4070 Ti, but without 3rd party reviews no one can say for certain. Since WoW in its present state does not support DLSS, that aspect of the card would not pertain to the game. That said, if it’s ballpark RTX 4070 Ti performance, then it is more than enough to run WoW on your 1440p monitor maxed out, and it would go well with your 7800X3D CPU.
You have the best memory in Azeroth! Chromie wishes they could remember things as well as you hahah…
I pored over a lot of articles and YouTube vids and ultimately decided on the Gigabyte 7800 XT Gaming OC 16GB. I know AMD GPUs aren’t as popular with folks here, but for under $500 I feel this particular card is the most competitive one available right now for non-DLSS games.
I’m not really expecting a huge drop in prices over the next few months, maybe 5 to 10%, if that much; but I could be wrong. It wouldn’t be the first time (or the thousandth). Prices could also be unpredictable with the tariff talk that’s going on, which is part of the reason I went ahead and pulled the trigger early and ordered an upgrade.
I’m interested in the 5X series coming out, and it does seem like an amazing card, even if overhyped. Nvid’s marketing approach always seems to annoy people, including me. This time around, they’re touting the new gen’s base card performance with the previous gen’s flagship, which is a bit of a stretch and has a lot of people eyerolling. I get what Nvid is trying to say, but the way they go about saying it is almost sort of shady. It’s always like that with Nvid. Everything they say always makes me feel like I need to start researching and youtubing to try and figure out what the real deal is.
My upgrade isn’t due to arrive until next week, and I’ll have 30 days to decide if I want to keep it.
It’s an excellent card and will serve you well. AMD will release their new cards in March. Nvidia RTX 5090 reviews just came out today, but we still need to wait for the lower-tier cards to arrive. Once those cards are out, the price of the 7800 XT will adjust relative to the performance and pricing of those new cards, though people always overstate what they believe the discounts will be.