That’s the one I bought just before upgrading my GPU. It was actually a spontaneous buy.
I walked into BB for something and decided to browse the monitors on the shelf, including the ones on clearance. I had wanted a G-Sync monitor that’s at least 1440p, IPS, 20"+, and at least 144Hz refresh for the longest time.
I was thinking about the 27" until an in-store employee came by to assist me and told me they had a 32" version that wasn’t on display, with only one more left because they were selling like crazy. Not wanting to pass it up, I made the purchase.
Good thing too; around the time I was making the swap, one of the two monitors I had been using actually died when I tried to give it to a friend of mine so they could start using two monitors. I'd had it since the late 2000s.
The only reason HDR on monitors has any value to me is that it increases the chances of the monitor’s max overall brightness being high enough for the screen to be usable in a well-lit room.
The display on my work iMac hits 500 nits, which is great for this, but most external monitors cap out around 300-350 nits, which makes glare a big problem from just indirect sunlight bouncing around the room. This is one of the things that led me to get an AW2721D, which is DisplayHDR 600 and thus tops out at around 500-600 nits. Even if the HDR on it is near useless, the brightness is nice.
I get you. I hate seeing reviews where, in order to get perfect calibration, they drop the brightness down to 220 nits or something.
While that might be best for color accuracy, for me it’s literally “I can’t see ****, captain.”
So I tend to run my displays at 100% brightness and contrast anyway, despite the effect on color accuracy.
But as far as that goes, my office is kind of in a cave, with no direct and minimal indirect sunlight.