I use AMD’s counterpart, Radeon Image Sharpening.
If what I’m reading about it being pretty similar is right, it uses a post-processing effect to sharpen the image.
With this alone you can make a downscaled image look almost like a higher-resolution one, at a lower performance cost than actually rendering at that resolution.
For example: 1080p with sharpening looks almost as clear as 1440p without it, but only costs 3-4% performance versus the 33% hit of rendering native 1440p. Those are made-up numbers and obviously there are some quality differences, but it’s a simple example.
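The real cost depends on the game, but the pixel math behind that trade-off is easy to sanity-check - a quick sketch (the resolutions are just the ones from my example; pixel count is only a rough proxy for frame cost):

```python
# Rough pixel-count math behind the render-resolution trade-off.
# Frame-rate cost is game-dependent; pixel count is only a proxy for it.

def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)   # 2,073,600 pixels
p1440 = pixels(2560, 1440)   # 3,686,400 pixels

# Native 1440p pushes ~78% more pixels than 1080p:
print(f"1440p / 1080p pixel ratio: {p1440 / p1080:.2f}x")   # -> 1.78x

# A post-process sharpen touches each output pixel once, which is why its
# cost stays in the low single digits no matter what the game is drawing.
```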
If you haven’t tried it, give it a go. Some people like blurrier images and some like sharp. I’m a sharp kinda guy. So I use RIS in every game I can, and in the others I use GShade to get the sharpening effect, although GShade’s version has a stupidly high performance hit.
RIS is like 1-2% and Freestyle sharpening is somewhere around 3-4%, but something like an RTX 3080 isn’t even gonna sweat that.
I leave mine at native resolution and apply the sharpening filter. It also lets me use the cheaper AA, FXAA High: the sharpening undoes some of the brute-force smoothing that type of AA applies to the whole image, while still letting FXAA do its job on jaggies.
I can’t speak for the Nvidia version, but hopefully it’s similar.
The Freestyle app lets you do all sorts of filters (black and white, film grain, etc.) but the link in the first post is how to turn just the sharpening effect on in global settings.
I don’t know that it’ll show up in screenshots. It might, but there’s a good chance it won’t.
To prevent issues with frame dependencies (effects that need the existing, unaltered data - like WoW’s reflections), these kinds of filters tend to run in a separate buffer that doesn’t get captured with screenshots. I believe control panel-based FXAA is the same.
The overall effect is pretty good in my experience, though. The nicest part is it also works on textures, which increasing the resolution can’t do. And, yeah, it takes some of the blur out of post AA.
But I would reduce the amount. I think it defaults to 50%, which is starting to get too sharp (it produces halos in areas of high contrast). I lowered mine to 30% and it seems pretty good. It’s also separate from the GPU scaling now, even if they share a control panel entry.
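If you’re curious where the halos come from, a plain unsharp mask shows it well enough. RIS and Freestyle use smarter contrast-aware kernels, so treat this as a rough sketch of the idea only - the 0.5 and 0.3 amounts just mirror the slider values:

```python
import numpy as np

def sharpen(img, amount):
    """Plain unsharp mask: img + amount * (img - blur(img)).
    RIS/Freestyle use smarter contrast-aware kernels; this is just the idea."""
    padded = np.pad(img, 1, mode="edge")          # 3x3 box blur
    blur = sum(padded[y:y + img.shape[0], x:x + img.shape[1]]
               for y in range(3) for x in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# A hard grey-to-light edge (high local contrast):
edge = np.array([[0.2, 0.2, 0.2, 0.8, 0.8, 0.8]] * 3)

print(np.round(sharpen(edge, 0.5)[1], 2))  # [0.2 0.2 0.1  0.9  0.8 0.8] - obvious halo
print(np.round(sharpen(edge, 0.3)[1], 2))  # [0.2 0.2 0.14 0.86 0.8 0.8] - subtler
```

The overshoot either side of the edge (0.1/0.9 vs. the original 0.2/0.8) is exactly the halo you see in-game, and the amount slider scales it directly.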
Can you tell me what your settings are? I’m going to play around with it after work again tonight. In the Nvidia control panel, both sliders are 0 - 1.0 with a checkbox for GPU Scaling, and then also what are your in-game settings that impact this? Thanks!
Also: How about the other AA options in the Nvidia control panel?
I use 30% (0.3) for Sharpen and the default 17% (0.17) for Ignore Film Grain, though I can’t even think of the last time I played something with a film grain effect that I didn’t disable. Scaling is disabled due to having two screens (I avoid changing display resolution whenever possible since it’s still not a great experience).
In-game I just leave it at 100% Render Scale in Windowed (Fullscreen). Lowering it is something I’ve considered but haven’t played around with yet. It should show similar benefits to the control panel-based resolution scaling, but I’d be using the in-game scalers (bicubic probably being the best for up-scaling) instead of the 4-tap scaler from my Pascal. Perhaps I should check it out sometime…?
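If I ever do, something like this would let me eyeball the effect offline first - a rough Pillow sketch, where the 80% scale, the filename, and the bicubic filter are all just example choices:

```python
from PIL import Image

NATIVE = (2560, 1440)       # example native resolution
RENDER_SCALE = 0.8          # e.g. an 80% in-game Render Scale

img = Image.open("screenshot.png")   # any native-res screenshot (example name)

# Simulate rendering at the reduced internal resolution...
low = img.resize((int(NATIVE[0] * RENDER_SCALE),
                  int(NATIVE[1] * RENDER_SCALE)), Image.BICUBIC)

# ...then up-scaling back to native for display, as a bicubic scaler would.
up = low.resize(NATIVE, Image.BICUBIC)
up.save("approx_80pct_bicubic.png")
```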
Control panel AA I have set to Override at 8x, which I then use the NVidia Profile Inspector to set to 8x SGSSAA for transparency. Most modern games have profile settings that override it due to incompatibilities, so performance doesn’t tend to suffer noticeably, but it makes many older things look pretty sweet (for laughs I once played Darksiders with 4x DSR and 8x SGSSAA - effectively 32x the resolution I was able to actually display. Pointless, since it looked no better than just 4x SGSSAA, but at least now I know that for sure).
For WoW (which does override my “Override” setting) the SGSSAA setting makes nameplates and some spell effects look a lot nicer without having to enable the in-game Alpha Test setting. The Alpha Test should do the same basic thing only faster, but it makes some graphic elements disappear entirely (like the statue you have to pull the corrupted eyes off in the Vale sometimes, or many of the hut edges in Zangarmarsh).
It works well in pretty much everything, but I should emphasise that “pretty much” bit if you’re planning on setting it globally.
Some 2D programs - Twitch, Steam, GoG Galaxy, Battle.net, etc. - develop some pretty funky display issues due to how (in)frequently they update their display. It’s good for performance, I suppose, but it means the already-sharpened picture gets sharpened again, and again, and so forth, until it looks like a dotty mess, shortly before it all returns to normal and starts again.
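You can reproduce that feedback loop numerically with a minimal 1-D unsharp mask that’s fed its own output - the values and amount here are purely illustrative:

```python
import numpy as np

def sharpen1d(row, amount):
    """1-D unsharp mask - same idea as the earlier sketch."""
    padded = np.pad(row, 1, mode="edge")
    blur = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
    return np.clip(row + amount * (row - blur), 0.0, 1.0)

frame = np.array([0.2, 0.2, 0.2, 0.8, 0.8, 0.8])
for i in range(5):
    frame = sharpen1d(frame, 0.3)    # the filter eats its own output
    print(i + 1, np.round(frame, 2))
# Each pass amplifies the previous pass's overshoot - the halos compound
# toward 0/1 speckle instead of settling, just like the dotty mess above.
```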
So if you’re going to apply it globally like I have, you may want to take the time to individually tweak profiles for things like that so that they have it disabled. It’s generally also a good idea to disable forced anisotropic filtering, ambient occlusion, and/or anti-aliasing in such applications while you’re at it, should you have them set globally as well.
PS Control panel Ambient Occlusion should be disabled for Blizzard titles. It’s supposed to work but it produces artefacts in every single one of them in my experience (D3 shows tile borders, SC2 has texture depth issues, WoW gets dark marks, etc).
Do you guys have Triple Buffering enabled in game or in the control panel? My monitor is gsync compatible, but I haven’t used the feature and have been playing around with various settings. Any helpful hints regarding the various sync settings in the game and nvidia control panel? I also have my in game FPS capped at 144.
Control panel triple buffering is only for OpenGL. Doesn’t do anything at all otherwise.
Using TB and GSync in tandem is superfluous. The idea with TB is that it can render ahead of what you’re seeing when the rendering rate and the display rate don’t exactly match, in order to maintain more even framerates through a momentary hitch, but GSync is intended to keep the two aligned until you’re outside of its maximum range anyway.
Best case scenario, TB will do nothing when used with GSync until your frame rate passes your maximum refresh rate, at which point it will just start adding input latency. A tiny amount if you’re talking a 144Hz screen, but still present.
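If you want to see why that render-ahead queue turns into latency above the refresh rate, here’s a toy model of a 2-deep present queue. It assumes queued-style triple buffering (like a DX swap chain), so treat the details as illustration rather than how any particular driver behaves:

```python
def display_latency_ms(fps, refresh_hz, refreshes=1000):
    """Toy model of a 2-deep present queue (queued triple buffering).
    Returns the average age, in ms, of the frame shown at each refresh."""
    queue, latencies = [], []
    render_t, t = 0.0, 0.0
    for _ in range(refreshes):
        # Render frames until the next refresh, stalling once both back
        # buffers are full (the render-ahead behaviour described above).
        while render_t <= t and len(queue) < 2:
            queue.append(render_t)
            render_t += 1.0 / fps
        if queue:
            latencies.append((t - queue.pop(0)) * 1000)
        t += 1.0 / refresh_hz
    return sum(latencies) / len(latencies)

print(f"{display_latency_ms(fps=300, refresh_hz=144):.1f} ms")  # queue stays full -> extra latency
print(f"{display_latency_ms(fps=100, refresh_hz=144):.1f} ms")  # queue drains -> barely any
```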
Fast Sync, however (the control panel Vertical Sync option), is a slightly different beast. It creates two additional buffers that get rendered to alternately until the refresh, at which point the most recent one is displayed. It’s intended to keep the input latency down to one rendered frame, rather than one monitor refresh, so that you get the clean picture of v-sync with roughly the same latency as no sync at all.
Again, it’s largely superfluous with GSync enabled or a high refresh rate, until your framerate passes the refresh rate. But when it does, it can use a lot more power to save you a few ms in response time. It’s pretty much only really useful for benchmarking (since it can score the same as no sync) or very low refresh rates.
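Same kind of toy model for Fast Sync’s “newest finished frame wins” behaviour - the numbers and structure are illustrative, not the driver’s actual implementation:

```python
def fast_sync(render_times, refresh_dt):
    """Toy model of Fast Sync: keep rendering flat out, and at each refresh
    show the newest finished frame; older ones are simply dropped."""
    shown, t, latest = [], refresh_dt, None
    for rt in render_times:
        while rt > t:                    # a refresh fires before this frame
            if latest is not None:
                shown.append((t, t - latest))   # (refresh time, frame age)
            t += refresh_dt
        latest = rt                      # newest completed frame wins
    return shown

# ~300 fps into a 144 Hz refresh: a frame every ~3.3 ms, refresh every ~6.9 ms
frames = [i / 300 for i in range(1, 100)]
for refresh_t, age in fast_sync(frames, 1 / 144)[:4]:
    print(f"refresh at {refresh_t * 1000:5.1f} ms shows a {age * 1000:.1f} ms old frame")
```

The displayed frame ends up well under one refresh old, which is the whole point - but every dropped frame was still rendered at full power.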
Adaptive (control panel again) is, in my opinion, the worst of all worlds. It dynamically chooses between vsync on and off based on your current framerate - if you’re at or above your refresh rate it will sync, and below it won’t. So you either get tearing or a capped framerate. Again, it’s superfluous with GSync, but it’s pretty garbage without it too.
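And the Adaptive decision rule, for completeness, really is just a single comparison - a sketch only, obviously not the driver’s actual code:

```python
def adaptive_vsync(fps, refresh_hz):
    """Toy version of the Adaptive decision rule."""
    if fps >= refresh_hz:
        return "sync on: framerate capped at refresh"
    return "sync off: tearing"

print(adaptive_vsync(160, 144))   # capped
print(adaptive_vsync(110, 144))   # tearing
```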
Don’t use global SGSSAA. It’s never been officially supported (which is why it’s only accessible via the 3rd-party NVidia Profile Inspector), but I’ve finally traced it down as being responsible for an issue with a number of “modern” apps, including (but not likely limited to) “Your Phone” and “Windows Terminal”. The apps still run, but parts of the window won’t be visible (title bar text, some menus, some dialogue boxes, etc.). It doesn’t happen immediately (which makes it a pain to diagnose), but as soon as you reboot, things will start going AWOL.
You can still set the other AA options, including setting Antialiasing - Transparency to any of the options accessible from the normal control panel. Just don’t use the SGSSAA options. And you can still do per-title SGSSAA.
I tried to individually disable SGSSAA in just the affected programs but I can’t find the process responsible for actually rendering them. It seems to be detached at least somewhat from the executable and quite possibly runs inside a system process instead.