HotS at 4K resolution

Hi, after finishing my new build I tested several games at 4K, and in the case of HotS I must say I’m quite disappointed. The game looks the same as it does at 2K, with the exception of the minimap; being able to see each minion on the minimap is impressive, but not really what you would expect from 4K. The hero and minion models look better only if you scroll the camera, otherwise the difference is almost not noticeable.

If you want to play HotS at its fullest you only need a GTX 1050 Ti; however, in order to have a smooth experience, a Ryzen 5 3600/5600 or equivalent is required. So, you should be looking for a good processor rather than a graphics card.

Have a happy new year!

4 Likes

If you want to run HotS on ultra with a 1050 Ti at 2560x1440 or 4K, you will have a terrible time even with a good CPU.
Please provide benchmark results before saying it can run the game at its fullest.

A 1050 Ti or Radeon RX 560 can also play an 8K video on YouTube; it just takes 10 seconds to render one frame.

1 Like

I run HotS buttery smooth in UHD with a 2600X and an RX 480 8 GB, on ultra settings. And that’s 80+ frames consistently, to be clear. It never drops below 70, even in ARAM with high hero density.

This game does not require much to run well.

4 Likes

Ummm, don’t you need a 4K monitor/screen in order to take advantage of the resolution?

Yes, and it’s part of the build. 4K looks amazing in design programs, BTW.

I’m honestly kinda surprised you’re surprised?

Games targeting the “mass market” are built around 768p and 1080p.

Not sure why you would expect visual upgrades beyond effectively higher anti-aliasing.

Even in “visually spectacular” games, a higher resolution simply gives you more pixels. It doesn’t give you higher-resolution textures and such.

More pixels do not make something look sharper unless there is higher-frequency information to be shown. Without aliasing to remove or higher-resolution textures to display, there is little to no benefit from a higher resolution.
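
A rough back-of-the-envelope way to see the texture side of this (all numbers below are made up for illustration, not measured from the game): if a model’s texture is already being magnified at 1440p, rendering at 4K just spreads the same texels over more screen pixels.

```python
# Hypothetical numbers: a hero texture 256 texels wide, covering roughly
# 400 px of screen width at 1440p and 600 px at 4K. In both cases the
# texture, not the screen, is the detail limit, so 4K adds no new detail.
def texels_per_pixel(texture_width_texels, on_screen_width_px):
    """Texels available per screen pixel along one axis."""
    return texture_width_texels / on_screen_width_px

texture_width = 256                      # assumed texture resolution
coverage_px = {"1440p": 400, "4K": 600}  # assumed on-screen model width

for res, px in coverage_px.items():
    ratio = texels_per_pixel(texture_width, px)
    limit = "texture-limited" if ratio < 1 else "screen-limited"
    print(f"{res}: {ratio:.2f} texels per pixel ({limit})")
# 1440p: 0.64 texels per pixel (texture-limited)
# 4K:    0.43 texels per pixel (texture-limited)
```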

Resolution has next to no impact on CPU workload. As such, any performance-oriented Intel CPU from the last 7+ years will have no problem constantly achieving over 60 FPS.

HotS is based on the SC2 engine, which has a highly optimized graphics pipeline. Even something like the 8800 GT was able to run SC2 at higher visual settings at 1080p with a reasonable frame rate, at least for the few minutes before the GPU VRM exploded or the driver crashed. A 1050 Ti should have no problem with very high settings at 4K while still getting more than 60 FPS on average. At most, anti-aliasing would need to be sacrificed, if it is ever enabled.

This game is also more CPU-heavy than GPU-heavy.

No, you don’t. You can create custom resolutions with, e.g., Nvidia’s software, and then select them in-game like any supported resolution.

I tried it with a GTX 970 + i5 6600K with no issues, but I prefer 1080p, which uses only a portion of my screen, since at higher resolutions too much important stuff happens in my peripheral vision. Basically, it is like having a 24" screen inside a 34" widescreen.

I’d rather say stable than average. An average of 60 FPS can be 20 in a team fight, which is unplayable. Even my 6-year-old 970 is better than a 1050 Ti, and it runs with ~60% fewer FPS at 4K compared to 1080p, yet it can’t provide a stable 60 FPS (with highest settings and antialiasing). So, based on benchmark tests, a 1050 Ti gets less than a stable 40 FPS at 4K.
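
A quick sketch with made-up frame times shows why an average can hide the team-fight dips (the “1% low” is the number benchmarks usually quote for this):

```python
# 1000 hypothetical frames: mostly 12 ms, plus a burst of 50 ms frames
# during a team fight. Average FPS looks fine; the 1% low does not.
frame_times_ms = [12] * 950 + [50] * 50

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

worst_1_percent = sorted(frame_times_ms, reverse=True)[:len(frame_times_ms) // 100]
low_1_fps = 1000 / (sum(worst_1_percent) / len(worst_1_percent))

print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1_fps:.0f} FPS")
# -> average: 72 FPS, 1% low: 20 FPS
```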

Do you really think it is, and you don’t just feel it that way because Ryzens are less optimized for games? Even though they have incredible nominal performance, they need to be pushed harder for the same result. For me, HotS uses 30% CPU, and I only have 4 physical cores and zero virtual ones. For work I use a Ryzen 7 4800H (8 physical cores, 8 virtual ones), and it is a monster for sheer calculation, but it just doesn’t feel right to play on it compared to a much weaker Intel.

Ryzen 1000/2000/3000/4000 (4000 is laptop/OEM-desktop only) had higher latency than Intel, so it ends up slower in games.

Ryzen 5000 mostly fixed that, and it ties or beats Intel’s current/past CPUs.

1 Like

HotS mostly relies on the CPU rather than the GPU, and anything higher-end is pointless given how the game’s optimization has been at an all-time low since, say, last year.

GPU usage shows up here and there, but it’s quite minimal.

This is why the game unironically still had Windows XP support at some point. Those are decade-old PCs which (if not upgraded) have really dumpster-fire performance, but they are still able to run the game at a solid 30–60 FPS (on lowest quality) because of how CPU-bound the game is.

3 Likes

I played HotS with a 3 GHz dual-core + a 2 GB graphics card. I only had to disable shadows for a result similar to the current one; then I upgraded the card to a 3 GB one, and it was able to handle the highest graphics settings. I feel no difference between that old PC and the current one in terms of HotS. On the other PC at home, HotS runs at higher FPS, and it beats this one only in GPU performance.

I experienced high CPU usage once when the launcher used 100% CPU and it took more than one minute to open, but it was due to a wrong Windows setting that came with a launcher update.

IIRC the GPU obviously allows higher-quality rendering of textures and polygons and larger particle spawn counts. Those tasks are handled well on the GPU side because GPUs are designed and meant exactly for this; however, the game is also able to do similar work in a CPU-only environment. If you have a decent graphics card, then yeah, some of the heavy loads will move to the GPU, leaving less room where the CPU has to be used.

Boy oh boy, I started gaming back in ’94 and this all looks the same to me. Spruce everything up however you want; I’ll be impressed when those shmucks program hair that isn’t clipping, chunky, or frozen in place…

I’m just happy I don’t live in the age of Lagnarok Offline anymore, where 15 minute bouts of debilitating lag were normal and understood.

I started playing SC on Windows 98; now I’m playing HotS with a 3070 and a Ryzen 5 5600. It has been a long way.

It’s crazy that we got to HotS from a place like Warcraft, Warcraft 2, or StarCraft.

Even with shortcomings as they are, things have really come a long way.

1 Like

What you say can be true, but it would look worse.

Forgive me, but I would assume a screen designed for HD can’t support 4K because the screen itself doesn’t have the pixels to support it.

I did mention without antialiasing, as that hammers fillrate. Since you are running at 4K, antialiasing is mostly unnecessary, as aliasing is naturally reduced by the increased pixel density of the higher resolution. If you want antialiasing or other demanding visual effects at 4K, I would recommend a more powerful GPU than an old mid-range (now “entry level”) one.
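
For a rough sense of why MSAA on top of 4K hammers fillrate (the pixel counts are exact; treating 4x MSAA as a flat 4-samples-per-pixel cost is a simplification):

```python
# Pixels per frame at common resolutions, and the sample count if every
# pixel were shaded/resolved with 4x MSAA (simplified cost model).
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f}M pixels, "
          f"~{4 * pixels / 1e6:.1f}M samples with 4x MSAA")
# 4K alone is already ~4x the pixels of 1080p; 4x MSAA roughly quadruples
# the sample count again, which is why it is the first thing to drop at 4K.
```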

Modern Zen 3-based Ryzen processors beat everything Intel currently offers in most games. I think Red Dead Redemption 2 was the only well-known game most reviewers tested where Intel still performs better, and that is likely due to a serious flaw in the game code itself (not optimized, or even anti-optimized, for AMD CPUs) rather than Intel being faster.

Both will use only 1–2 threads anyway because of how the SC2/HotS engine works. The result is that the 8-core processor will appear to have even lower utilization than a 4-core one for similar performance, purely because more of its cores are idle. This is why CPU utilization is not a good indicator of CPU bottlenecks. Having more cores does not mean the individual cores are slower, as Zen 3 has shown: the 16-core Ryzen 9 5950X is currently the fastest single-threaded CPU for most applications.
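
A tiny sketch of that utilization math, assuming the game saturates two threads (the 1–2 thread figure comes from the point above; everything else is illustrative):

```python
# Overall CPU % reported when only a fixed number of game threads are
# pegged at 100%; more hardware threads just dilute the percentage.
def reported_utilization(busy_threads, hardware_threads):
    return 100.0 * busy_threads / hardware_threads

for cpu, hw_threads in [("4C/4T", 4), ("4C/8T", 8), ("8C/16T", 16)]:
    pct = reported_utilization(2, hw_threads)
    print(f"{cpu}: {pct:.1f}% overall with 2 saturated game threads")
# 4C/4T: 50.0%, 4C/8T: 25.0%, 8C/16T: 12.5% -- equally CPU-bound in every
# case, despite the very different overall percentages.
```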

Zen 3 still has significantly worse memory latency than Intel, which is why Intel still wins, or at least ties, in some workloads that are memory-latency dependent. This is an impossible-to-mitigate side effect of using a chiplet design as opposed to a monolithic die, as considerable time is lost moving data through the off-die chip interconnect. Intel is likely going to suffer this latency penalty soon as well when they move to their own chiplet technology.

This is also why some of the AMD APUs (which use monolithic laptop dies) can outperform their desktop counterparts in some workloads given fast enough memory, as they have significantly lower memory access latency.

I have not noticed any performance regressions over the last year.

That was because it was based on the SC2 engine, which was designed to run on Windows XP. It would still run on the same hardware with a GPU upgrade (for D3D11 compatibility) under Windows 10; however, performance would likely be much worse.

The move to x86-64 was a large performance regression on older hardware, since that hardware is less optimized for executing such code. I noticed this with StarCraft II on my i7 920: the game used to run perfectly at high frame rates at max settings when it first released, but in more recent times, after the move to x86-64, it was struggling to maintain playable frame rates even in co-op commanders.

What do you think the giant pool of L3 cache is helping mitigate…?

It’s the same idea as RDNA2’s L3 “Infinity Cache”.

If you need to go to memory less, and stay in cache… you’ve mitigated the effects of higher latency to a degree.

The odds are that 32 MB is just enough to fit what many games need in order to mostly or fully mitigate the extra memory latency, as Zen 3 has seen massive performance jumps in some games.
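
A rough illustration with made-up latency numbers (they are placeholders, not measured Zen 3 figures): the latency the cores effectively see is a hit-rate-weighted blend of L3 and DRAM latency, so a cache big enough to hold a game’s hot working set pulls the average toward the L3 figure.

```python
# Effective average memory latency as the L3 hit rate rises.
# The 10 ns / 70 ns figures are assumed for illustration only.
def average_latency_ns(l3_hit_rate, l3_ns=10.0, dram_ns=70.0):
    return l3_hit_rate * l3_ns + (1.0 - l3_hit_rate) * dram_ns

for hit_rate in (0.50, 0.90, 0.99):
    print(f"L3 hit rate {hit_rate:.0%}: ~{average_latency_ns(hit_rate):.0f} ns average")
# 50% -> 40 ns, 90% -> 16 ns, 99% -> ~11 ns
```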

Why do I have the feeling that if this discussion continues it’s going to be me banging my head against the wall, just like when you were saying, less than 2 years ago, that there were no ARM cores competitive with x86?

Apple’s ARM cores in the M1 aren’t much faster than their cores in the A12.