How many more years before 5K gaming takes off?

So I picked up a deeply discounted refurbished iMac Pro for work-from-home purposes (my day job is Android + iOS app dev, the latter of which requires macOS), and the 27" monitor built into it looks incredible. With P3 color, 5120x2880 resolution, and 500 nits of brightness, it makes my old 1440p IPS panels look solidly mediocre and the TN panels I’ve previously owned look like garbage.

It only has a Vega 56, so it’s not really a gaming machine, but if I bump a few settings down to mid (SSAO, liquid, and shadows) it can run WoW at full unscaled 5K at 60 FPS, and it looks great. At this pixel density, antialiasing isn’t needed at all. Everything is razor sharp, even text and objects way off in the distance.
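Just to put the pixel counts in perspective, here’s a quick back-of-the-envelope comparison (numbers only, nothing game- or GPU-specific):

```python
# Rough pixel-count comparison between common gaming resolutions.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
}

base = 2560 * 1440  # use 1440p as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.2f}x 1440p)")
```

That works out to roughly 3.7 MP for 1440p, 8.3 MP for 4K, and 14.7 MP for 5K, so 5K is exactly 4x the pixels of 1440p and nearly 1.8x a 4K panel. No wonder the Vega 56 needs a few settings turned down.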

But if I look at the industry at large, monitors seem super focused on framerate above all else. Brightness, color, viewing angle, and pixel density, among other things, tend to take a backseat to framerate and sometimes wider aspect ratios. If you’re looking to game at even 4K, your options aren’t nearly as robust as those for 120/144/240Hz 1080p/1440p gaming. 5K gaming doesn’t exist at all; the only 5K monitor publicly sold is an LG-made Thunderbolt monitor marketed toward Mac users.

But it isn’t as if 5K panels are new. Apple started using them in iMacs all the way back in 2014. They’re not even particularly expensive (there are tons of standalone panels on eBay, brand new, for ~$400). They just never took off elsewhere in the industry.

Is this more a function of the limits of GPU power, or an indicator of which types of games are most popular? I get the appeal of high FPS in competitive stuff, but that’s only a single facet of gaming. In less twitchy games like WoW, Witcher III, Horizon: Zero Dawn, and Breath of the Wild, framerate is more of a cherry on top, and a panel that’s highly competent in other ways makes it easier to enjoy the scenery.

If graphical power is the problem, the muscle that the next generation of GPUs brings might fix that, at least at the top end. What do you guys think?

For a start, WoW is a fairly undemanding game.

Regarding games that CAN be played at 4K+, you’re looking at a bit of a paradoxical issue.

On the one hand, non-graphically-demanding games tend to be esports games, and esports games tend to be focused on competitive gameplay. So even if they can be played at 60fps at 4K+, people generally won’t, because they’d rather drop the resolution and push much higher framerates.

For everything else that is fairly graphically demanding, 1080p is still the realm of entry-level hardware, 1440p is generally reserved for mid-range GPUs, and 4K is for high-end GPUs.
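To put rough numbers on that tiering: as a very naive sketch, if you assume frame rate scales inversely with pixel count (real games only approximate this when they’re fully GPU-bound), the drop-off looks something like this. The baseline fps here is purely hypothetical, not a benchmark:

```python
# Naive frame-rate scaling with pixel count -- purely illustrative;
# real games are never this cleanly resolution-bound.
baseline_fps = 144            # hypothetical fps at 1080p
baseline_px = 1920 * 1080

for name, (w, h) in {"1440p": (2560, 1440),
                     "4K":    (3840, 2160),
                     "5K":    (5120, 2880)}.items():
    estimate = baseline_fps * baseline_px / (w * h)
    print(f"{name}: ~{estimate:.0f} fps (naive estimate)")
```

Even with that generous assumption, a card that’s comfortable at 144fps at 1080p lands around 36fps at 4K and roughly 20fps at 5K, which is why each resolution step tends to push you up a GPU tier.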

At that point, it becomes very cost prohibitive, and most people will choose smooth gameplay over visual fidelity.

That makes sense. Makes me hope that the next couple generations of GPUs bring such massive leaps in power that something like a 2080Ti comes to be considered “entry level”.

I think the issue also is that there’s a “standard”.

For the past 10 years, 1080p has been the standard. Slowly, 1440p is becoming more mainstream. Eventually, it will become the new 1080p, and being able to play games at 1440p/60 will be the new “standard”.

We will need to hit that milestone before we get to the 4K one.

Ugh, so it’s going to be like how 1366x768 has been haunting laptops for over a decade, and it’ll be another 5 years before 4K is even the standard.

Well, I’m not so sure I’d say that.

Judging by today’s games, you can expect to be able to play most games on medium or high at 1440p on a fairly inexpensive graphics card, say a 1660 Super, which is only around $230.

Entry-level cards such as the 1650 Super and 5500 XT, which are aimed at the 1080p/60 standard, still exist, but more and more lower-priced cards are able to perform quite well these days, and the trend seems to be continuing.