Rivals has it and it works pretty well. A noticeable performance boost with no clear drawbacks.
Maybe if we pixel peep we’ll find visual imperfections, but focusing on the gameplay I couldn’t spot any flaws, and barely any added latency, if any at all.
Rivals and OW are hero shooters. They don’t need the absolute precision of games like CS and Valorant.
Not really.
My experience with Rivals was good enough. No clear artifacts from the added frames.
Kinda true, but not on Maximum settings and not at higher resolutions.
Monitors these days have high refresh rates.
It would be nice to get high fps at the highest fidelity possible.
I don’t think it really makes sense to use in a PvP game. One, it adds latency to your input, and two, it works by making up data. You really do not want to be presented with garbage information in an FPS game. You will not consciously notice it, but you might struggle to read enemy animations. Think Ana melee vs sleep dart windup - commonly used in high level ladder play to bait defensive CDs out of dive heroes. Frame gen might cause you to misread one animation for the other because it fed you interpolated / generated data (rough sketch of the latency side below).
On the other hand if you know this is a risk but just want a smooth gameplay experience, by all means go ahead lol
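If anyone wants the latency argument spelled out, here’s a rough toy model in Python (my own simplification with made-up numbers, not how DLSS or FSR actually schedule frames): the generated frame is interpolated between real frames N and N+1, so nothing from frame N+1 can reach the screen the instant it finishes rendering anymore.

```python
# Toy model of interpolation-based frame generation latency.
# Assumptions (mine, for illustration): real frames render at a fixed rate,
# and once the next real frame exists the generated frame is shown first,
# with the real frame following half a frame-time later.

REAL_FRAME_MS = 16.7  # ~60 fps of real rendered frames (assumed)

def presents_without_fg(frames):
    # each real frame hits the screen as soon as it finishes rendering
    return [(f"real {n}", n * REAL_FRAME_MS) for n in range(frames)]

def presents_with_fg(frames):
    # the in-between frame needs BOTH real frame N and N+1, so once N+1 is
    # rendered we show the generated frame first, then N+1 half a frame later:
    # every real frame the player reacts to is now stale by ~half a frame,
    # plus whatever the interpolation itself costs
    out = []
    for n in range(1, frames):
        rendered = n * REAL_FRAME_MS
        out.append((f"generated {n - 1}.5", rendered))
        out.append((f"real {n}", rendered + REAL_FRAME_MS / 2))
    return out

if __name__ == "__main__":
    for label, t in presents_without_fg(4):
        print(f"no FG:   {label:>13} on screen at {t:6.1f} ms")
    for label, t in presents_with_fg(4):
        print(f"with FG: {label:>13} on screen at {t:6.1f} ms")
```

And every second frame on screen in the FG case is a guess, which is the “made up data” half of the argument.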
Giving the option hurts no one.
OW has countless play styles, from heroes that require precision and reaction time to those that are basically spectating the game, like Mercy.
I read that frame generation was a thing invented by developers so they wouldn’t have to optimize their games, so I’m kinda not liking the new tech. I mean, it makes sense kind of, and I have noticed a lot of games are unoptimized nowadays unless you have a frame gen graphics card.
Sounds kind of lazy while increasing prices, because hey, look at this shiny new tech (that doesn’t really help…)
Just means you aren’t looking lol. Painfully obvious blurry messes aren’t a replacement for real frames that convey subtle animation details.
You see, unlike Rivals, OW is actually optimized and made by devs that have experience with the engine they’re working with (and are also not stupid enough to tie game mechanics to framerate, unlike the Rivals devs). OW runs fine, you don’t need fake frames to get playable FPS (120+) at a playable resolution (1080p) unless you’re using a literal potato to play the game on a 4K monitor.
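For anyone wondering what “tying game mechanics to framerate” looks like, here’s a generic toy example (not actual Rivals or OW code, numbers are arbitrary): logic that adds a fixed amount per frame drifts with fps, while delta-time-scaled logic stays consistent.

```python
# Generic illustration of framerate-dependent vs framerate-independent logic.
# Not taken from any real game; all values are made up for the example.

def simulate(fps, seconds, frame_locked):
    dt = 1.0 / fps          # time per frame in seconds
    speed = 5.0             # intended units per second
    pos = 0.0
    for _ in range(int(seconds * fps)):
        if frame_locked:
            pos += 0.05        # fixed amount per frame: result depends on fps
        else:
            pos += speed * dt  # scaled by frame time: result is fps-independent
    return pos

for fps in (60, 120, 240):
    locked = simulate(fps, seconds=1, frame_locked=True)
    scaled = simulate(fps, seconds=1, frame_locked=False)
    print(f"{fps:3d} fps: frame-locked -> {locked:5.1f}, delta-time -> {scaled:5.1f}")
```

The delta-time version is the standard fix: scale everything by how long the frame actually took instead of assuming a fixed framerate.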
Yeah, I don’t know all the tech mumbo jumbo, but I would think there’s some angle by these companies to squeeze every penny out of gamers while helping developers make games the easiest way possible (especially triple A ones), all while increasing their game prices as well.
In other words, I kind of suspect a scam. Maybe I’m wrong, but I wouldn’t be surprised.
It wasn’t invented for that purpose, but the problem is that a lot of devs will absolutely use it that way. When you get a “free” performance boost for doing nothing, it’s really easy to get lazy and let the hardware do the heavy lifting instead of the hard optimization work. Many games already abuse this by requiring it just to meet basic framerate targets, which I don’t think is acceptable.
Tbh, both upscaling and framegen aren’t great more often than not. I would rather see improvements in neural compression, which preserves image fidelity and can improve framerates, since lower memory usage enables higher resolutions.
GPU compute power is usually cheap while VRAM bandwidth is often costly. In the old days we didn’t need much VRAM and the compute power was low. Now? We have the compute power, it’s just too expensive to put in enough memory to deliver at high res (quick numbers below).
So, in that regard, I would rather see the focus go to shader optimizations and neural compression instead of FSR/DLSS/XeSS, which generate a ton of artifacts and noise, and framegen, which puts something on the screen that doesn’t reflect a valid state of the game.
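Quick back-of-the-envelope on the “memory at high res” point (all round, assumed numbers, not measured from any game): a single uncompressed RGBA8 render target scales linearly with pixel count, and a modern frame keeps many such buffers around on top of the texture pool.

```python
# Rough illustration of how resolution scales VRAM use for render targets.
# All numbers are assumptions for the sake of the example.

def rgba8_buffer_mib(width, height):
    # 4 bytes per pixel (8 bits per channel, RGBA), converted to MiB
    return width * height * 4 / (1024 ** 2)

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    per_buffer = rgba8_buffer_mib(w, h)
    # assume ~10 full-res buffers in flight (G-buffer, depth, post-processing...)
    print(f"{name}: {per_buffer:5.1f} MiB per buffer, ~{10 * per_buffer:6.1f} MiB for 10 buffers")
```

That’s before textures, which are the much bigger pool and the part neural compression is actually aimed at; the point is just that 4K multiplies everything by roughly 4x over 1080p.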
CUDA cores are evolving steadily but slowly. The gen-over-gen improvements are reaching a bottleneck due to slow fab progress.
AI cores represent a breakthrough in terms of possible improvements.
Having the ability to double and triple the rasterized fps at minimal cost (both power and fidelity) is definitely a game changer.
New 50 series cards are seeing some of the biggest VRAM bandwidth jumps in a long time.
When it comes to the actual image quality, I would assume frame gen will only get better with time. It’s already convincing enough if you don’t pixel peep and focus on the gameplay instead.
These AI shenanigans are even more crucial in thermally limited systems like laptops, where simply increasing the TGP is not an option.