4K is ruining gaming

I've been getting frustrated with the state of gaming, triple-A developers, and hardware manufacturers. Going to vent some of my thoughts.

In my view 4K can take a large portion of the blame.

For PC, most people game at 1080p. That is just a fact. The Steam Hardware Survey for April 2023 says 65% of people are using 1080p for their primary monitor.

4K and 1080p are indiscernible for most people, especially at the distances people sit when gaming on a television.
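
Quick back-of-the-envelope to show what I mean (the screen size, couch distance, and the ~60 pixels-per-degree acuity figure are rough assumptions on my part, not science):

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16 / 9):
    """Rough angular pixel density at the centre of the screen."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width from its diagonal
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return h_pixels / fov_deg

# Assumed setup: 55" 16:9 TV viewed from 8 feet (96 inches).
for name, h_px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: ~{pixels_per_degree(h_px, 55, 96):.0f} pixels per degree")

# Prints roughly 69 ppd for 1080p and 137 ppd for 4K. With 20/20 vision
# resolving somewhere around 60 ppd, 1080p is already near the limit at
# couch distance, which is why the two are hard to tell apart from the sofa.
```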

The performance and video memory cost of 4K is very high. File storage is just stupidly high, and it is totally unsustainable. Even Google cannot handle it; they are considering putting 4K behind a paywall to recoup storage costs (in some countries it already is).
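
Some rough pixel math to put numbers on that (assuming a single 8-bit RGBA render target; real engines juggle many more buffers than this):

```python
# Back-of-the-envelope: pixel count and size of one RGBA8 buffer per resolution.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 4  # 8-bit RGBA

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, "
          f"{pixels * BYTES_PER_PIXEL / 2**20:.0f} MiB per buffer, "
          f"{pixels / base:.1f}x the pixels of 1080p")
# 4K pushes ~4x the pixels of 1080p, and every render target, depth buffer,
# and post-processing pass scales with that.
```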

Game studios are incentivized to optimize for 4K since consoles target that resolution. Sony and other TV manufacturers desperately need 4K games because without them, there is no 4K content for their dumb products.

Games are by far the biggest producers of 4K content. Network TV does not shoot in 4K. Only some movies have recently started, and not all. You have to go to a specific section of Netflix to find the 4K movie options. I don't think it will ever be ubiquitous like 1080p is today.

There is also a marketing benefit for studios, as they can claim their game is capable of the latest and greatest. Studios would get skewered if any unflattering visual comparisons were made of their game (Craig from Halo Infinite, anyone?).

Nvidia and AMD were under enormous pressure to create 4K-capable cards, and they have done it! What did it cost? Well, mid-tier cards are now $800 and people are not going for it. Most PC gamers, again, are using Nvidia's 10XX (Pascal) or 16XX (Turing) series cards to this day (Steam survey). The 1650 is the most popular card according to the April 2023 Steam survey.

So why are games costing so much to make, trying to push a level of fidelity and technical complexity that the majority of players do not want and cannot run?

The gap between the most popular cards (1060 / 1650) and the newer enthusiast 40XX cards is insane now.

It's a trap for studios! If you optimize your game for the latest and greatest hardware, it will look great but not many people can play it. On the other hand, if you optimize for the low end, it will age faster, may be outcompeted, and you cannot market the game by saying it has the latest features (4K, RTX, DLSS).

4k just sucks

That's a roundabout way to say most people have bad eyesight. But I digress.

I disagree with this, since many people will tell you the image is a lot sharper and clearer, especially if you sit close to it. Put an image of a 3D apple in both 4K and 1080p and they will say that one is blurrier than the other.

Then just settle for 1440p. You don’t have to go to 4K.

A 2TB SSD costs about $100 or under, which is on average two weeks of part-time minimum wage.

We're not up in the TBs just yet for games. Heck, most aren't even in the triple-digit GBs yet.

I would rather they optimize console games for 60fps at 1440p at the very least and have that be the standard they're supposed to shoot for.

Well, mainly because games aren't really restricted to whatever resolution they started at. If I really want to, with some finesse and tweaks, I can play The Sims 1 in 4K. It's unnecessary, but hey, it can be done natively.

Halo Infinite was "skewered" for its unflattering visuals. The problem was with the graphics themselves, not the fact that it wasn't running in 4K.

I mean… they have already been making 4K-capable cards since 2017, perhaps earlier. They're just now polishing up the ray tracing and adding more ways to make things sharper while balancing the performance with DLSS or the AMD equivalent.

…Except you don't have to go there. You can just settle on 1080p or whatever you like. Hell, you can put the game in 4:3 if you really want. I don't get why people think that you HAVE to run the game at the best, most optimal settings with the latest tech available. You can literally run it however YOU want. That's the beauty of it. I'm still rocking 1080p, because I choose to. If I want to, I can upgrade to 1440p and give my 3090 a bit of a workout and some Gatorade.

Consoles, meanwhile, either have all the graphics set low for performance or balanced modes, or have everything medium-high at the sacrifice of the framerate being chopped in half. The only upgrade you can do is change how much storage you get, which… isn't even a new thing for consoles, going back to the days of the PS2 with its screw-in hard drives. In fact, out of ALL the consoles that ever had upgradable parts, the N64 with its Expansion Pak is the most innovative, basically giving that system more RAM to use for better graphics. Like, why was that only a one-time thing? I digress.


It's not about my monitor preferences; this is about how it impacts development. Agreed, sitting very close you will see a difference, but it is not like 480p -> 1080p; it is marginal, and to some people, imperceptible.

One of the main points of the OP was that it is unreasonable for developers to have to get their games running on 1060s while still being able to push the boundaries for 4080s.

The only reason the 4080-class cards exist is because of 4K. At 1080p, and in some games 1440p, most newer cards are mostly CPU-bottlenecked. So if GPU manufacturers want to release new products every year, they have to push 4K cards.

Now 8K is being thrown around in marketing; just look at AMD's RX 7800 XT showcase. They actually pitched it as an 8K gaming card… what a joke.

This whole cycle is unsustainable; if it continues, low/mid-tier gamers will be priced out of the hobby at the high end (AAA).

4K cost is a real problem. Just look up YouTube's 4K storage and paywall situation; YouTube has an issue with ballooning storage costs.

I am playing Jedi Survivor on PS5 right now. The only reason I bought it on PS5 was that I thought my 2070 Super was not going to be good enough to run the game stably.

I play in performance mode and it is super blurry and choppy, even there. Why? Some scenes look horrible. Nearby textures are constantly blurred, partially loaded in, or popping in. This is the PS5, don't forget, so you would think it would be well optimized, unlike PC where different hardware configs present a challenge for optimization.

It does not have to be that way; turning down the resolution would allow the game to run natively without blurry upscaling, but that's not an option.

I mean, the game industry has been pushing graphics for a while, ever since 64 bits became mainstream.

The difference is equally noticeable, if not more so.

…That's what a lot of devs across the game industry have been doing for a good long time.

Crysis was built to push the boundaries of PC graphics at the time, and the strongest card of 2008 was the ATI Radeon HD 4870 1GB. It can still be played at lower settings, though. Meanwhile, something like The Sims or WoW can be played without so much as a GPU, but would still benefit from a high-end card of its time.

…But 4K has existed for a while now.

AAA gaming is a dang joke, so I'm not too worried about them pricing out low- and mid-tier gamers anytime soon.

I get it, but my answer to that would be "Don't get 4K." Or settle for 1440p, etc.

I don't know why YouTube is trying to lock 4K behind a paywall; I doubt it will work out for them long term.

On a whim, I decided to find a benchmark of the 2070 in this game, and this is what I found.

4K on optimal settings doesn't seem half bad. Could be better. But at 1440p and 1080p, it seems to do better.

Consoles. That’s why.

They're very limited in hardware, and the hardware they have is already outdated by the time it hits the market.

Actually, I don't think that at all. Especially with games like Forspoken running at 720p and 30fps on the PS5. Consoles have rarely run well since the PS2 days, honestly. And with no real graphics options to speak of on consoles like we have on PC, they will always run sub-par.


Agree to disagree I guess. Happy to have the debate though.

Maybe in a few years there will be some new technology which trivializes the performance and storage cost of 4K, and I will be happy. Hopefully we never see 8K.
Where we are now, the industry had to invent janky upscaling tech that looks awful just so the thing would be playable (DLSS, FSR). Sidebar: did you know an average 90-minute movie takes up a 477GB Blu-ray?

Seems like a lot of people don't notice the difference. This is just from a quick scan of Google Scholar, and I have not read the full study, only the abstract, so… don't skewer me please :slight_smile:

A 2016 IEEE study: participants could identify UHD content 54% of the time… so about half.

https://ieeexplore.ieee.org/abstract/document/7498935/figures#figures

One of the biggest tech YouTubers, LTT, talking about YouTube's 4K problem. He goes over some of Google's storage and cost data.

https://www.youtube.com/watch?v=MDsJJRNXjYI&pp=ygUINGsgbGludXM%3D

There is a whole lot of misconception in the original post and thread, and what the OP stated is not really true. 4K is not the reason video cards are so expensive. In fact, plenty of developers will go back to games even from the '90s and make them compatible with 4K.

Console game developers checkerboard (think FSR or DLSS) the snot out of their games to achieve 4K, so they are not natively running at 4K to begin with. Meanwhile, game developers will include PC graphics settings (hello, RTX) that push video cards beyond what most are capable of today.

So I think the real questions are: why has there been such a large jump in the hardware demanded to play games at PC ultra settings this past year, and why are video cards so f@#$ing expensive? I could write an entire article on each point, but I'm just going to point out the issues.

Why the large jump to new hardware?
Developers have stopped making games to run on the previous-gen consoles. The new consoles were developed with large publishers' game engines in mind, so games are optimized for them. These new consoles have more RAM, 16GB of GDDR6 shared between video and system, and developers are taking full advantage of it. Consoles have dedicated decompression hardware, whereas the PC relies on the CPU. Console games run at a mix of what PC gamers would call low to high settings (high on the things they do well and low on those they don't), while the PC master race only believes in Ultra settings; in fact, most PC games run just fine if you adjust the settings to what you have. PC games are often rushed out the door to hit a target sales date and get poor optimization. And finally, like I previously stated, console games often use checkerboarding while PC games typically render every pixel natively.
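
To put rough numbers on the checkerboarding point (simplified; real implementations vary in the details, these figures are just illustrative):

```python
# Simplified checkerboard-rendering budget: each frame shades only half of
# the 4K pixel grid (a checkerboard pattern) and reconstructs the other half
# from the previous frame, so per-frame shading cost is far below native 4K.
native_4k = 3840 * 2160        # pixels in the output frame
checkerboard = native_4k // 2  # pixels actually shaded each frame
native_1080 = 1920 * 1080

print(f"native 4K    : {native_4k / 1e6:.1f} MP shaded per frame")
print(f"checkerboard : {checkerboard / 1e6:.1f} MP shaded per frame "
      f"({checkerboard / native_1080:.1f}x 1080p instead of "
      f"{native_4k / native_1080:.1f}x)")
```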

Why are video cards so damn expensive?
Covid. Not the only reason, but you had high demand created by miners, plus even more demand from people sitting at home, on top of limited supply. Ultra-high demand meets very limited supply; prices skyrocket. With prices skyrocketing, companies like Nvidia liked that… they liked that a lot… they really, really liked that, and so did their shareholders. By Nvidia's own admission they are keeping prices high. "Lately, game consoles are selling for about $599. And the reason for that is because it's more useful than ever. You use your gaming console for your greatest form of entertainment, and you use it for a very, very long time. And GeForce essentially is a game console inside your PC. And we've always believed that the ASP of GeForce should drift towards the average selling price of a game console. And so it should be something along the lines of $500 or so roughly at this time." - Jensen Huang

So Nvidia clearly wants pricing to be high. In fact, they leaked RTX 4060 Ti pricing at $499 and people complained; they leaked it at $449 and people complained; and now they are leaking it at $399. That's a 20% cut in gross profit, yet the AIB partners will still buy the chip, so you know there are some good margins in there for them.

What about AMD? Clearly AMD knew something was up by sticking so much RAM onto their cards (they do make the SoC for the consoles, after all), but they seem to like the high margins as well. They have shown little desire to lower prices and eat away at Nvidia's market share, sitting comfortably at their current 20-25% share each year.

Intel? I believe Nvidia actually fears Intel. Intel is ruthless (as are Nvidia and AMD) and has money to burn. You see Nvidia working hard to differentiate themselves with DLSS and RTX (in fact, they dropped all GTX cards), forcing people to get cards with RTX support even though Nvidia knows full well the cards can barely handle it.

So it's not so much 4K as it is greedy companies that have pushed mid-tier cards to double in price and removed any form of the sub-$200 entry-level card. This is occurring in other industries like automotive too, so it's not just GPUs.


Hogwarts Legacy just released on PS4 :smiley:

@ 900p and 30FPS with draw distance set to nothing and minimal texture resolution

Great if you own the PS4.

They will still release games on the previous-gen consoles; they are just focusing their attention and builds on the new consoles.


Actually, it's not. This hasn't been true for roughly the past five years.

Various sources and websites show what resolution most people are playing at, and it's 1440p. The gap between the people playing at 1440p vs 4K has also shrunk dramatically in the past 1-2 years. These results even blew me away.

What ruined my life and pocketbook was trying out OLED gaming. :rofl:

8k’s an inevitability.

But you are correct that 4K and 8K are being treated like a fad; I will agree with that.

Ehh, the implementation could admittedly be better; the upscaling is meant to make things clearer.

I actually didn’t know that.

"YouTube SHOULD charge for 4K. Hear me out." by Linus Tech Tips. …Hmm. :face_with_raised_eyebrow:

I'll save that for my pocket to check out once my dislike extension works again (it just broke somehow).

Edit: 35650 dislikes to 114325 likes, roughly a 76% like ratio. Hmm. :thinking:

So this one is tough, but the majority of PC gamers are probably going to be at 1080p, because you have to include laptops, and 1080p monitors still outsell 1440p monitors. Now, if we are talking about people who post on gaming forums, DIY builders, and tech enthusiasts, you will see 1440p, widescreen, and 4K are far more prevalent, and the majority of those users are past 1080p gaming. One quick stroll through a Micro Center and you can see the floor space they still give to 1080p monitors. Part of the problem for many casual PC gamers is the current cost of upgrading to a card that handles 1440p gaming.


I was going off the websites that track people doing builds and gamers doing builds. Not to mention that Steam tracks this stuff from a lot of their gamers too.

So they primarily track desktops. You are 100% correct; once you factor in laptops, it completely changes things.

This is completely true, but you also have to factor in that COVID changed a lot of this. People who were building second computers often decided they should just spend a little more since they were doing a build anyway and get what they had always wanted.

The astronomical number of people who gave in to scalpers proves that alone.

For good reason you should not know that; a 90-minute movie in 4K should be around 21GB (rough math below).

a single-layer Blu-ray holds 25GB
dual-layer between 50-66GB
triple-layer 100GB
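
Rough math on where ~21GB could come from (the ~31 Mbps average bitrate is just a ballpark I'm assuming; real discs and streams vary a lot):

```python
# Rough size estimate: average video bitrate x runtime.
bitrate_mbps = 31    # assumed average 4K bitrate, illustrative only
runtime_s = 90 * 60  # 90-minute movie

size_gb = bitrate_mbps * 1e6 / 8 * runtime_s / 1e9
print(f"~{size_gb:.0f} GB for 90 minutes at {bitrate_mbps} Mbps")
# ~21 GB, which comfortably fits on a 25GB single-layer disc.
```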


Yeah I think we just found a bootlegger LOL.

Sounds like he's referring to compressed, downloaded Blu-ray rips, which are nowhere near the quality of the Blu-ray itself because they are compressed.


I'm guessing Repo is either talking about uncompressed video or how much the source files are in total. Hence the bigger numbers like that.

Doubt it.
Uncompressed would be absurdly big. Even with chroma subsampling to 4:2:0 color space, you’re looking at about 1.5TB for 90 mins of video.

Sounds more like someone ripped a Blu-ray disc and transcoded it to some absurd (read: completely pointless) quality level, although even at the worst settings I've ever played with, I don't think you could reach 277GB for an hour and a half of movie.

For 8K video, I can kind of see that… but then I think you have to use one of the not-as-well-supported profile levels (6.x) for H.264; not sure which encoder can handle that (and I sure as hell don't want to try that with a software-only codec).
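
If anyone wants the back-of-the-envelope for that ~1.5TB uncompressed figure (my assumptions: 8-bit 4:2:0 at 24fps for 90 minutes):

```python
# Uncompressed 8-bit 4:2:0 averages 1.5 bytes per pixel
# (1 byte of luma per pixel + chroma stored at quarter resolution).
width, height = 3840, 2160
bytes_per_pixel = 1.5
fps = 24
runtime_s = 90 * 60

total_tb = width * height * bytes_per_pixel * fps * runtime_s / 1e12
print(f"~{total_tb:.1f} TB for 90 minutes of uncompressed 4K")  # ~1.6 TB
```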


Appreciate all the discussion everyone!

According to Steam, 1080p is the largest share of gaming monitors out there.

The Steam Hardware Survey for April 2023 says 65.54% of primary monitors are 1080p, which is a change of +.024 month on month.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Everyone can view it for themselves. This is just one platform, Steam, but I think it is pretty representative of the PC gaming community.

For a lot of people it is going to be cost. A 4K monitor is more costly, and it means you need a 4K-capable card and a bigger power supply.

For some people it's preference, like me. I would prefer 1080p at 240Hz, something that is going to be rock-steady, stable, and high speed.

I think 4K is a big marketing scam. I agree 4K clearly offers a better image than 1080p. My sweet spot would probably be 1440p, although the industry seems to have skipped 1440p; you cannot find any 1440p TVs, and consoles don't target that resolution.

I am not arguing that 4K is not better than 1080p. It looks better and clearer; agreed there.

The argument and thesis of the OP was that it is destroying gaming by creating such a large performance range between the high end and low end, ballooning development and storage costs, and making optimization extremely difficult or impossible. Finally, it is setting us on an unsustainable path down the road to 8K, which is just stupid. 8K is way past the point of diminishing returns.

Cheers Friends

That's great; options are always good, as some people prefer 4K 60Hz, 1440p 144Hz, etc.

TVs and monitors are not the same. I'm not sure how old you are, but we had 16:10 monitors (such as 1440x900, 1680x1050, 1920x1200) prior to 16:9 taking over. The old CRT monitors offered various resolutions that were all native. PC monitors have historically been different, and that doesn't even touch all the widescreens out there.

There has always been a large performance range between high end and low end, and it has grown every year as people hold onto older tech while companies introduce newer tech.

It costs the same to develop for 4K as it does for 1080p. It's not like Steam has several folders for you to download a game from depending on your monitor resolution.

Optimization for 4K is not difficult, and clearly not impossible, as it's offered on virtually every game for the PC.

There is always a path; maybe it's 8K or something else, but there is always a path to more. We moved on from single-core CPUs, video cards with 128MB of RAM, mobos that needed a Wi-Fi adapter, etc.

I also think you are confusing the performance requirements of graphics settings with those of resolution.

Jedi Survivor, Wild Hearts, The Last of Us Part 1, and Returnal all have optimization issues.
4K clearly takes up more VRAM and can affect texture load time.

You are welcome to love 4K gaming; I hope to keep this a friendly discussion about opinions.

None of this has to do with 4K, though. Just poor management.

That's why 4K requires a beefier video card. If you like 1080p, that's great. Be glad you enjoy the cheaper option. Is it ruining gaming? Definitely not for the people who enjoy gaming in 4K.