Any FPS over your monitor's refresh rate gets dropped anyway.
So it won't matter if DLSS or whatever can make 600 FPS if you don't have a 600Hz monitor.
So if your monitor is 60Hz, and DLSS or whatever can get you to 60 FPS, you're golden.
120Hz monitor? Then you only need 120 FPS.
Excessive FPS just overheats the card and can damage it.
So yeah, whatever tech gets your FPS to match your refresh rate is fine.
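The arithmetic in the post above can be sketched in a couple of lines. This is a toy illustration, not any real driver API; the function name is made up:

```python
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """With vsync on, the panel can only show as many frames per second
    as its refresh rate; anything rendered beyond that is dropped."""
    return min(rendered_fps, refresh_hz)

# 600 rendered FPS on a 60Hz panel still displays only 60 frames per
# second, while 120 FPS on a 240Hz panel displays all 120 of them.
print(displayed_fps(600, 60))   # 60
print(displayed_fps(120, 240))  # 120
```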
I’m playing on an ASUS ProArt 329CV. It’s a 4K 60Hz monitor.
My RTX 3060 12GB runs fine over DisplayPort 1.2 in WoW and every other game I play.
Most of the new GPUs aren’t game-centric. They are mostly focused on AI stuff.
That $5,000 5090 might get you a 30% increase in game performance over the 4090 (measured in FPS only), or it might not.
Gotcha, thank you. Indeed, I am planning on a 4K monitor with at least a 240Hz refresh rate. There is an influx of 4K 240Hz monitors, and even 5K2K 27-inch monitors.
Too many choices (graphics cards, monitors, etc.). Not enough understanding. Hence the post.
You’re not going to damage your card by running it at 100%. Yes, you are correct that frames over the refresh rate are lost (unless rendering them smooths out the lows), but let’s stop spreading the nonsense that using our hardware in the way it is rated for is somehow going to “damage” it.
While yes, the number of images displayed per second is limited by the refresh rate, suggesting that extra frames make no impact at all is wrong. Rendering a more up-to-date picture increases responsiveness and smoothness of motion by giving the display fresher, more stable frames to pick from.
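One way to see why extra frames still help responsiveness, under a simplified model (frames finish at a steady rate and the display grabs the newest finished frame at each refresh; the function name and numbers are illustrative assumptions):

```python
def avg_frame_age_ms(render_fps: float) -> float:
    """If frames finish every 1000/render_fps ms and a refresh lands at
    a random point in that interval, the frame shown is on average half
    a frame-time old, regardless of the panel's refresh rate."""
    return round(0.5 * 1000.0 / render_fps, 1)

# Same 60Hz panel, but the picture is fresher at higher render rates:
print(avg_frame_age_ms(60))   # 8.3 ms stale on average
print(avg_frame_age_ms(240))  # 2.1 ms stale on average
```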
Locking your frame rate is a good idea, but not for the reasons you stated. Instead, if you have a 60Hz monitor, try locking to 80 or 100 FPS. MUCH better this way.
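The cap suggestion above boils down to a frame-time budget; a minimal sketch (the function is hypothetical, and the numbers follow directly from the cap):

```python
def frame_budget_ms(cap_fps: float) -> float:
    """Frame-time budget a limiter must meet for a given FPS cap."""
    return round(1000.0 / cap_fps, 2)

# A 100 FPS cap asks the GPU for a new frame every 10 ms instead of
# letting it run flat out; a 60 FPS cap allows 16.67 ms per frame.
print(frame_budget_ms(60))   # 16.67
print(frame_budget_ms(100))  # 10.0
```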
People have been asking about DLSS for a long time now. There has been ZERO hint from Blizzard that they are working on this.
But really, you don’t want to deal with the added latency from frame generation in a game like WoW. Even the steps that Nvidia has taken to mitigate the extra latency don’t fully compensate for it. WoW DID add Reflex + Boost, which is arguably more important for what actually matters.
Just because you have a 240Hz monitor doesn’t mean you need to be pegged at 240 FPS constantly (especially at the expense of latency). Just make sure G-Sync/FreeSync is properly configured, get the fastest CPU that you can (so that you can get as many real frames as possible), and you should be good to go. A 9800X3D is cheap compared to something like a GPU upgrade and is absolutely the best thing you can do for WoW.
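The latency point can be made concrete with a toy model of frame generation (assumed behavior: generated frames multiply the displayed rate, but input is only sampled on real frames; the function name and simplifications are mine, not Nvidia's):

```python
def frame_gen_model(base_fps: float, fg_multiplier: int):
    """Toy model: frame generation multiplies the displayed frame rate,
    but input latency still tracks the real (base) frame rate."""
    displayed_fps = base_fps * fg_multiplier
    input_latency_ms = round(1000.0 / base_fps, 2)  # unchanged by generated frames
    return displayed_fps, input_latency_ms

# 120 real FPS with 2x frame gen looks like 240 FPS on screen,
# but responsiveness is still that of 120 FPS (at best):
print(frame_gen_model(120, 2))  # (240, 8.33)
```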
GPU overheating from unnecessary component strain is in fact a thing. Even if it’s not damaging the GPU directly, it can cause damage to other components.
You even acknowledge the necessity of the mechanism created to mitigate that: thermal throttling.
Thermal throttling on GPUs quite literally exists to prevent damage from heat. Mind you, not all cards have thermal throttling.
You can’t simultaneously insist a problem isn’t a problem while acknowledging the system created to mitigate that problem.
The rest of the conversation is whatever… But you’re citing the evidence of the mitigation’s necessity while trying to dismiss the concern as unnecessary.
Both of the technologies referenced in the thread title, DLSS4 and FSR4, include frame generation, and the person who made the thread was concerned about getting 240 FPS on their 4K monitor (a typical use case for frame generation).
If you want to go off on a tangent about older versions of DLSS or whatever, that’s fine, but it’s pretty clear what this thread is about. Maybe try reading it before replying?
You need to run heavy loads 24/7, such as a render farm, to have any measurable impact in that way. It’s just not a reasonable concern when we’re talking about running WoW at 100-200 FPS on modern hardware.
My 4070 Ti is barely using 25% of its total processing power at 100 FPS. I don’t think I’m going to damage anything or heat the card to ludicrous levels.
If your GPU heating up is causing issues in your case, you have other problems.
The point of a GPU is to run at or near 100% when playing a game that needs it.
Both technologies refer to upscaling, not frame generation. Frame generation is a separate thing within those card generations (and something that must be implemented separately by developers).
Nothing in the post indicates the OP was talking about frame generation (which, once again, is separate from DLSS and FSR upscaling); more likely they would like a smoother gameplay experience by rendering the game at 1440p and upscaling to 4K.
Frame generation has been part of the Nvidia line since the 4000 series. That doesn’t mean this post is about frame generation; nowhere does the OP mention frame generation or DLSS 3.
The main feature games use, and likely what would be put into WoW if it ever happens, is upscaling, not frame gen.
Maybe DLSS4 forces frame gen, since it’s the only way Nvidia can claim the performance gains they do. Just another reason to stop supporting Nvidia.
Now consider that most people playing WoW are either children, or college kids on a budget who can’t sit on a $1k+ GPU.
Or you have a budget GPU, integrated graphics, or a mobile/laptop GPU?
Like… a top-end financial investment in GPU-based gaming is neither the standard nor the expectation. A significant portion of people play WoW precisely because it’s older-GPU/budget-PC friendly.
I’m not going to start looking down on or negatively judging people because they can’t afford a $3k+ entry point just to have a conversation about thermal throttling, GPU power consumption, and unnecessary frames output versus frames rendered.
Anyway, like I said, you disproved your own argument from the start. I just wanted to point that out, so you stop this self-contradictory argument of “that’s not a real thing; we made software about it so that it can’t happen, because it does happen, but it’s not a real thing, so it can’t happen, even though it happens.”