In the server-slam beta, HDR ON worked great for me on an LG OLED.
Since I got an LG C2 42", playing games has become a never-ending joy of eye candy, especially in games that natively support HDR. But D4 is the first game I've encountered where HDR support is straight up bad. Like others say, it looks washed out / greyish / foggy. HDR games are supposed to look crisp, with true black, normally invisible shades of dark, etc. - Diablo fails to achieve that.
Pretty sure this is not the devs' fault; they just haven't had time to add proper HDR support yet, what with the release rush. I'm sure they'll fix it with time, and then it's going to be gorgeous. I can't go back to an SDR display, not ever - (true) HDR is that good. But I'm gonna give Blizzard time to add proper support.
It looked somewhat better than in the first two betas, but black/dark was still greyish.
See the difference for yourself. Here are two screenshots I took during the Server Slam beta last weekend.
HDR OFF:
https://i_imgur_com/8fIFllD.jpg
HDR ON:
https://i_imgur_com/VCB8dgZ.jpg
(^Replace the underscores with dots).
That's with ULTRA settings and DLSS Quality at 1080p for both.
You can put backticks (on the tilde key) around a URL to make the forum let it through, like this: `https://i.imgur.com/8fIFllD.jpg`, `https://i.imgur.com/VCB8dgZ.jpg`. Alternatively, you can surround a block of text with triple backticks to make pasting several URLs easier:
HDR OFF:
https://i.imgur.com/8fIFllD.jpg
HDR ON:
https://i.imgur.com/VCB8dgZ.jpg
Traditional JPEG (which is what you linked to) doesn't support HDR. What you've captured is different from what was actually on your video output - it's a dumbed-down version, tone-mapped from HDR down to SDR.
I haven't yet found a way to capture HDR screenshots (on Win 11). I'm using Win+Shift+S for the built-in snipping tool, but that clearly can't do HDR, because the resulting screenshots look overexposed, in photo terms.
So a proper HDR-screenshotting tool should output a file format that supports HDR, which is one of (quoting ChatGPT here; I manually verified some of these, so the list should be OK):
- OpenEXR (.exr)
- Radiance HDR (.hdr, .pic)
- TIFF (.tiff, .tif)
- JPEG XR (.jxr, .hdp)
- PNG with HDR metadata (using certain extensions or metadata formats)
So just be aware - if you're making or opening a JPEG screenshot, it doesn't contain any HDR data; it's an old-school SDR image.
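If you want to check what a capture actually is, here's a small sketch (my own addition, not something any poster above used) that sniffs a file's magic bytes and reports whether the container can hold HDR data at all; the filename at the bottom is just a placeholder:

```python
def hdr_capable(path: str) -> bool:
    """Return True if the file's container format can hold HDR data at all."""
    with open(path, "rb") as f:
        head = f.read(16)
    if head.startswith(b"\xff\xd8\xff"):   # classic JPEG: SDR only, always
        return False
    hdr_magics = (
        b"\x76\x2f\x31\x01",   # OpenEXR
        b"#?RADIANCE",         # Radiance HDR
        b"#?RGBE",             # Radiance HDR (alternate header)
        b"II*\x00",            # TIFF, little-endian (can carry 16/32-bit float)
        b"MM\x00*",            # TIFF, big-endian
        b"\x49\x49\xbc",       # JPEG XR (.jxr), what Xbox Game Bar writes
        b"\x89PNG",            # PNG: HDR only with extra metadata (e.g. cICP)
    )
    return any(head.startswith(m) for m in hdr_magics)

# "screenshot.jxr" is a hypothetical path, not a real capture from this thread.
print(hdr_capable("screenshot.jxr"))
```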
There are so many back-and-forths on this that I've concluded the following:
- At the end of the day, HDR is a preference; some like it, some don't.
- The type of monitor you have matters; they all have different HDR tiers (400, 600, 800, 1000, etc.).
- Even at the same HDR tier, the differences between monitors can be drastic.
- Because it's a preference, some prefer a game's built-in HDR mode, while others would rather use the Auto HDR offered by Windows.
- For some, bumping up the 'vibrance', which even non-HDR monitors can do, can give more appealing colours.
- Increasing contrast can also have a similar effect.
For me personally, even though I have an HDR monitor, HDR doesn't beat the look I can get by tweaking the colour balance on the monitor itself. HDR for gaming is just one of those things that make for a talking point but aren't required. HDR for movies is a different story.
Anyways, like I said at the beginning, it is a preference; as long as you like what you see and can see what you need to, you should be happy.
It was generally the same when actually playing though.
With HDR disabled, the image was mostly washed out looking. With it enabled, the image had really nice colors with rich darks and shadows.
I could probably manually adjust the non-HDR image to make it look less washed out, but this is just using the default color configuration for both, without manually tweaking anything.
And yes, these are just two JPEGs from video captures I did in Windows 11 while gaming last week.
HDR itself isn't a preference; it's superior tech with no drawbacks. The only debate here is that Diablo 4 doesn't have it well implemented at the moment, but they will fix it.
> The type of monitor you have matters; they all have different HDR tiers (400, 600, 800, 1000, etc.).
Basically, there is fake HDR and true HDR. Fake HDR usually comes with the 'HDR 400' label and is cheap. True HDR is 'HDR 1000'. You can take a look at the best gaming HDR displays here: https://www.youtube.com/watch?v=Qtpbv8HUrtE
> Even at the same HDR tier, the differences between monitors can be drastic.
Not if you get True HDR.
> Because it's a preference, some prefer a game's built-in HDR mode, while others would rather use the Auto HDR offered by Windows.
In Diablo 4, yes, because the native HDR support is poor. But when it gets fixed, it will be incomparably better than Auto HDR. Auto HDR is a 'better than nothing' duct-tape solution for older games.
> For some, bumping up the 'vibrance', which even non-HDR monitors can do, can give more appealing colours.
> Increasing contrast can also have a similar effect.
Big misconception there. HDR displays have several physical advantages (not a matter of boosting anything on the GPU side):
- Showing absolute black next to super-bright white, which is physically impossible on non-true-HDR displays.
- A side effect of that black/white range is that HDR-1000 displays can show many, many shades of dark while also having lots of bright areas in the scene, all at the same time - something SDR monitors are physically incapable of.
- A wider colour gamut.
Some SDR displays do have a wide colour gamut, and I still have one of those - its colours look oversaturated and unnatural, even distorted and eye-bleeding at times. That is nowhere near what a true HDR display gives you. The LG C2 gives such a perfect picture it feels like you've minmaxed life.
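To put ballpark numbers on that black-next-to-bright advantage (a back-of-the-envelope sketch of my own, with assumed rather than measured figures): dynamic range in photographic stops is log2(peak / black), so a near-zero OLED black level dwarfs what any SDR LCD can do:

```python
import math

def stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops: each stop doubles the luminance."""
    return math.log2(peak_nits / black_nits)

# Assumed ballpark figures, not measurements of any specific panel:
print(round(stops(300, 0.30), 1))    # typical SDR LCD: ~10 stops
print(round(stops(800, 0.0005), 1))  # OLED with near-zero blacks: ~20.6 stops
```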
If anyone's really curious before buying one, visit a retail store with HDR displays on show. Not the fake HDR-400 ones, but the good ones like the LG C2. The image is mind-blowing, while also supporting 120 Hz and G-Sync. I've been using an LG C2 for half a year now for gaming - this is the next gen, and the difference between SDR and true HDR is miles long.
I tried both extensively and kept the LG. It's subjective. Both are great.
Same, and they both look fine, but ultimately a 42" monitor seems absurd, particularly for competitive FPS and RTS games where glancing at the minimap every 2 seconds is required. That's tough to do without moving your head, which is very annoying, unless you put the monitor several feet away from you. But gaming on a TV has always been weird for me for that very reason: then it's hard to do anything else on it, or you need a desk with four feet of depth. I just found that initial comment a bit weird in the first place.
Even at 34" I feel like it's slightly too big for a gaming monitor. 25-27" is ideal if you're serious about RTS/FPS/MOBA games that involve a minimap.
It seemed intimidating to use a 42-inch OLED as a monitor at first, but I adapted really quickly. I play Overwatch 2 and StarCraft 2 on a 42-inch screen all the time with no problems.
I'd suspect most people playing those games look at the map every 30-60 seconds, which is about 10x less often than a higher-level player will. It gets old. Not sure how often it's a requirement in OW, but in DotA it's every few seconds, for example. I wish my screen were slightly smaller, because if you have to turn your head even slightly, there's no point in a bigger size. In games where I can increase the minimap size, like DotA, it isn't as bad, but then I sometimes click the minimap when I'm trying to click the map.
In games where you don't need the whole screen, particularly the bottom edges, it's great!
I was a slightly worse player in Rocket League at first while getting used to playing on an LG C2 42" at a 100 cm viewing distance, but now I'm an even better player.
Still, I would love to have a 32" 4K OLED with a clear glossy coating and a perfect R-G-B subpixel stripe.
I know of at least two apps: Xbox Game Bar (Win+G to open the app; the default screenshot shortcut is Win+Alt+PrtScr) and Nvidia ShadowPlay.
I used both, and for me Xbox Game Bar is clearly the best. I see the HDR output (.jxr) exactly like I see it in the game (there are some exceptions in some games when calibrating the nits), and it also creates an SDR output, which is really bad quality.
The best way to obtain an SDR capture while in HDR mode is to convert HDR to SDR with Pictureflect Photo Viewer, as it does an excellent conversion.
Nvidia also creates an incorrect SDR capture when in HDR mode.
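If you'd rather script that HDR-to-SDR conversion yourself instead of using a viewer, here is a minimal sketch. It assumes OpenCV built with OpenEXR support and a linear-light .exr capture (a Game Bar .jxr would need converting to .exr first); the filenames are placeholders:

```python
import os
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # some OpenCV builds gate the EXR reader behind this
import cv2

# Read the float HDR capture and tone-map it down to an 8-bit SDR JPEG.
# "capture.exr" / "capture_sdr.jpg" are hypothetical filenames.
hdr = cv2.imread("capture.exr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)
tonemapper = cv2.createTonemapReinhard(gamma=2.2)
sdr = tonemapper.process(hdr)  # float32, roughly in 0..1
cv2.imwrite("capture_sdr.jpg", (sdr * 255).round().clip(0, 255).astype("uint8"))
```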
As @Ethreix told you, those were not correct captures for comparing SDR and HDR.
Here you can see a real comparison (also in the same scenario, which is very important for comparing like with like):
SDR (second beta): https://i.imgur.com/Sca0i7P.jpeg
HDR (third beta): https://www.file.io/gpfW/download/FIoz4cnrz8oU
Your captures also greatly exaggerate how bad SDR looks compared to HDR.
It looks like you have misconfigured the SDR brightness in HDR mode. You can find more info here if you check the problem 'All standard dynamic range (SDR) content and apps appear too bright or too dark on an HDR-capable display':
https://support.microsoft.com/en-us/windows/hdr-settings-in-windows-2d767185-38ec-7fdc-6f97-bbc6c5ef24e6
Anyway, I don't quite understand why you'd view the game in SDR while in HDR mode. If you want to play in SDR, you need to set both the game and Windows to SDR (so, HDR mode OFF), and if you want HDR, you need to set HDR in both the game and Windows.
Another thing is Windows 11 Auto HDR, but we are not talking about that.
The latest beta improved the HDR (black levels were grey in the second beta), but it was still not right in the third beta (Server Slam). I checked it personally, and there are YouTubers specialized in HDR who said the same: https://www.youtube.com/watch?v=A_dznCuudwc
We are talking about testing with TVs with good HDR; I can't speak to TVs/monitors with questionable HDR quality (maybe not your case).
How can I enable it? I have an LG C2 too, but I cannot tick the option. Is the Game Ready driver at least needed, or can it only be used when it is already activated in Windows?
In BFV or Destiny 2 I can select it in-game without problems.
I've tried both:
- Enabled HDR in game
- Disabled HDR in game but Win 11 Auto HDR enabled.
I prefer option 2. Almost true blacks now but not perfect.
Late to the party, and someone may have mentioned this, but I resolved that gray tinge by reducing the Windows SDR content brightness setting to about 25% (Windows Settings > System > Display > HDR > SDR content brightness slider). I have a 3060 Ti hooked up to a 65" C2. Not sure what size your panel is, but Windows uses default INI files for the monitor drivers on any TV I've connected it to (tested on Samsung QN85A, QN85B, QN90B and LG C2, all 65" panels). In every case, Windows has assumed a peak brightness of 1400 nits. Peak brightness on a C2 is only roughly 800 nits, though.
Give this a shot on a C2 (or another TV) connected to a Windows PC and see if it looks any better for you:
Windows settings:
- SDR content brightness = 15-25%
- Auto HDR off
Diablo 4 settings:
- HDR on
- Brightness sliders:
  - Black point = 0.1 (can't set a full 0)
  - Brightness = 300-350
  - White point = 800-1000
Then switch your GPU color output between Full and Limited (10-bit color) to see which works better. On a C2, my guess is Limited, as the black point is a true 0 on an OLED screen.
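For anyone wondering why the white point slider is in nits at all: HDR10 signals encode luminance with the SMPTE ST 2084 'PQ' curve, which maps code values to absolute nits. A quick sketch of the PQ EOTF (standard constants, my own illustration, not anything from the game) shows why capping the white point near the C2's roughly 800-nit panel peak avoids asking for brightness the display will just clip:

```python
# SMPTE ST 2084 (PQ) EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: int, bits: int = 10) -> float:
    """Map a PQ-encoded code value to absolute luminance in cd/m^2 (nits)."""
    e = code / (2 ** bits - 1)                  # normalize to 0..1
    ep = e ** (1 / m2)
    y = (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)
    return 10000.0 * y                          # PQ's ceiling is 10,000 nits

print(round(pq_to_nits(1023)))  # 10000: the format maximum, far above any panel
print(round(pq_to_nits(769)))   # ~1001: a level a ~800-nit C2 already clips
```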
Not sure what you like, but if you want HDR and better lighting without the greyed-out, washed-out look, I have a suggestion.
If you turn on HDR while D4 is loading up (on the opening screen, before you see the big red Diablo 4 logo), HDR comes on with full bloom and color saturation.
Not necessary, but if you want that extra color pop, that seems to fix the washed-out look. I am on PC and use Alienware's OLED HDR AW3423... whatever, the G-Sync one.
If you turn on HDR in-game, you will get a washed-out color tone. I was trying to figure out wth was going on with HDR. It must be some kind of bug.
If you're on console, I am not sure.