HotS at 4K resolution

The Microsoft SQ is the ARM CPU that powers the Surface Pro X. The hardware has been able to emulate x86-64 from the beginning; it was the OS that lacked support for it. And it was released about a year before the Apple M1. Microsoft is moving more slowly because it can’t just drop x86 the way Apple did. The SQ is also a smaller CPU with lower performance. I’d expect Microsoft to develop its own chip in the near future.

Also, Windows doesn’t need the emulation hardware to run x86 on ARM. I installed Windows 10 ARM64 on a Snapdragon 810 phone and it can run x86 apps fine, although slower. Apple has a simpler implementation due to fewer constraints.

Where is my RTX? :face_with_monocle:

Diablo III was the game I tested the feature on. I’ve stayed away from super sampling since then, but it’s nice to know that it works correctly for some games.

Everyone knew it was always possible to emulate x86-64 on ARM. The big question has always been how well, and Microsoft has not done too well in that respect: running x86 applications on their ARM-powered Surface was notoriously a bad experience due to the lack of performance. This is where Apple, and possibly some engineering of their own inside the M1, is special. These M1-based Macs are capable of running x86-64 applications at least as fast as they run natively on comparable-class Intel hardware, with the performance impact on the M1 itself being roughly 10% or so versus native. If Microsoft could have done that with their Surface from the start, then x86-64 would probably already be dead.

Slower being the big issue, and the area where the Apple M1 impresses the most. Yes, it is still slower, but “slower” in the M1’s case is still at least as fast as same-class competitor x86-64 devices running the code natively. Apple has admitted that they do have hardware features to help achieve this emulation performance, and I recall them even inviting Microsoft to take advantage of those features.

Yeah, I don’t know what’s up with D3. I think it’s the font they’ve chosen. It might be confusing the downsampler.

Not really, but the thing is, when you use a higher resolution than your monitor supports, you’re basically squeezing more than one pixel’s worth of data into each pixel. This gives you anti-aliasing and, most of the time, a loss of picture clarity.

This was pretty much what I was struggling to say. Putting 4K images onto an HD screen will not make the image look its best.

I didn’t even know HotS could run at 4K, not that it matters to me since I still only run an HD monitor. Running at higher resolutions when I can’t take advantage of them is a waste.

It will make the image look sharper than normal, because it is effectively the best kind of anti-aliasing. Ideally, super-sampled AA like that would be the standard if not for how resource-intensive it is (which is why common anti-aliasing uses other, less resource-intensive tricks instead).

Where it breaks is when features depend on being discrete pixels. For example, in the case of Diablo III running at 4K on a 1080p display, not only was the UI small and blurry, but the text was nearly unreadable. This is because the text and UI remained the same pixel size but, due to the super sampling, were now being displayed in fewer pixels than intended. A similar effect would be achieved if the UI were incorrectly sized to begin with, with icons and textures drawn at half scale. This is especially problematic if visual effects are based around the properties of whole pixels, for example sub-pixel rendering of text or dithering. DPI awareness can mitigate this to some extent, but the best solution is for the game to support super sampling itself, since it can then control which graphic elements are drawn at the super-sampled resolution and which are drawn at native resolution.
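
To make that concrete, here is a minimal sketch of what a super-sampled resolve does, assuming numpy; `ssaa_resolve` and the 2x2 box filter are just illustrative choices (real drivers may use fancier downsampling filters). Note how a UI detail that is only one sample wide gets averaged into a half-grey smear:

```python
import numpy as np

def ssaa_resolve(hi_res, factor=2):
    """Average each factor x factor block of the high-res render into one pixel."""
    h, w, c = hi_res.shape
    blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# A white UI line exactly one sample wide in the 2x render: after the
# resolve it no longer owns a whole output pixel, which is the "small
# and blurry UI" effect described above.
frame = np.zeros((8, 8, 3))
frame[:, 3, :] = 1.0                 # one-sample-wide vertical line
print(ssaa_resolve(frame)[:, 1, 0])  # [0.5 0.5 0.5 0.5] -> smeared line
```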

It’s a mod for a game built on an engine from 2004, dunno what you were expecting.

You can really tell who the people are without 4k monitors.

:rofl:

Dude, I’ve got a 27" 1440p display.
I had a 15.6" 4K display on a laptop before I moved to desktop.
Before that, a 13.3" laptop with a 1080p display.
Before that laptop, I had a 15.6" laptop with a 768p display.

I can tell you, from experience, 4K on HotS doesn’t improve image quality in a meaningful way over 1080p.

I chose to get a 1440p high-refresh-rate display and an ultrawide when I moved to desktop, instead of one 4K display with a high refresh rate. And now I’ve got 1000 dollars waiting for a good HDR display. Three years waiting now. >.<

I’ve had multiple 1080 displays, currently still have one, have had a 1440 one, and have a 4k one. There is without question a general improvement in sharpness. You simply lack the experience and do not own a 4k monitor. Your comment about “mass market” games is total nonsense.

You realize that is what I wrote?

Anti-aliasing is the sharpness that you’re talking about. This has nothing to do with the effects and general visuals.

But the actual visuals and effects in of themselves are not developed around 4K displays.

It isn’t, and yes I realize you have no experience.

Do you know what anti-aliasing does, at its core?

Because more pixels naturally creates a less aliased image.

Do you know how the “best” form of AA works?

It renders the game at a higher resolution and downsamples.

Because at higher resolution you get… LESS ALIASING.

Don’t believe me?

Here is a slide from AMD. I can probably dig one up from Nvidia too if needed. Actually, I was looking for an Nvidia one, but this came up first. :rofl:
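
For what it’s worth, the point the slide makes is easy to reproduce, assuming numpy; the `coverage` helper and the particular diagonal edge are made up for the demo. At 1 sample per pixel every pixel is fully in or out (the staircase), while at 4x4 samples per pixel the edge pixels pick up fractional coverage, i.e. less aliasing:

```python
import numpy as np

def coverage(res, samples):
    """Fraction of each pixel covered by the half-plane y < 0.3 * x."""
    n = res * samples
    ys, xs = (np.mgrid[0:n, 0:n] + 0.5) / samples   # sample centres
    inside = (ys < 0.3 * xs).astype(float)
    return inside.reshape(res, samples, res, samples).mean(axis=(1, 3))

hard = coverage(8, 1)   # 1 sample per pixel: pure 0/1 jaggies
soft = coverage(8, 4)   # 16 samples per pixel: smooth edge values
print(np.unique(hard))  # [0. 1.]
print(np.unique(soft))  # fractional coverage appears along the edge
```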

I recommend you look up “variable rate shading”. It is a cutting-edge feature used by modern games to rasterize various parts of the scene, usually the lower-detailed ones (low-frequency information), at a lower sample rate (resolution). As long as it does not introduce aliasing or lose high-frequency information, the results are as good as invisible to the player, irrespective of how high-resolution their display is.

That is likely due to the higher sample rate reducing aliasing and allowing more high-frequency (sharp) information to be seen. This mostly applies around edges and other discontinuities, the sort that anti-aliasing targets and improves. At the end of the day, high resolution is the best sort of anti-aliasing.

However, a low-resolution texture that looks blurry at 1080p will still look blurry at 4K or even 8K, even though the edges around it might look sharper. Using variable rate shading, one might as well render the parts of the scene without edges at half or even quarter sample rate and save the GPU some time.
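
Here is a toy CPU-side sketch of that idea, assuming numpy; this is not the actual D3D12/Vulkan VRS API, and `shade`, the tile size, and the variance threshold are all invented for the demo. Flat (low-variance) tiles are shaded once per 2x2 quad and replicated, while detailed tiles are re-shaded per pixel:

```python
import numpy as np

def shade(xs, ys):
    """Stand-in pixel shader: a smooth gradient plus sharp vertical stripes."""
    return np.where((xs // 4) % 16 == 0, 1.0, ys / 256.0)

def vrs_render(w=256, h=256, tile=16, flat_threshold=1e-3):
    out = np.empty((h, w))
    coarse_tiles = fine_tiles = 0
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            # Shade the tile at quarter rate first: one sample per 2x2 quad.
            ys, xs = np.mgrid[ty:ty + tile:2, tx:tx + tile:2].astype(float)
            coarse = shade(xs, ys)
            if coarse.var() < flat_threshold:
                # Low-frequency tile: replicate the coarse samples (4x fewer shades).
                out[ty:ty + tile, tx:tx + tile] = coarse.repeat(2, 0).repeat(2, 1)
                coarse_tiles += 1
            else:
                # High-frequency tile: re-shade at full per-pixel rate.
                ys, xs = np.mgrid[ty:ty + tile, tx:tx + tile].astype(float)
                out[ty:ty + tile, tx:tx + tile] = shade(xs, ys)
                fine_tiles += 1
    return out, coarse_tiles, fine_tiles

out, coarse_tiles, fine_tiles = vrs_render()
print(f"quarter-rate tiles: {coarse_tiles}, full-rate tiles: {fine_tiles}")
```

In the real APIs the game supplies the shading rate (per draw or via a shading-rate image) rather than running a variance test, but the payoff is the same: fewer shader invocations where the image is flat.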

Super sampling is an attempt to improve performance.

It does not improve performance. Playing at 1080p with 2x2 upscaling for super-sampled AA has the same performance cost as playing at 4K without upscaling. Many GPU drivers literally implement this as virtual resolutions, and in games they may even appear as running at resolutions like 4K, etc.
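
The sample-count arithmetic behind that claim, assuming no other overhead differences between the two paths:

```python
# 1080p with a 2x2 supersample shades exactly as many pixels as native 4K.
native_4k  = 3840 * 2160
ssaa_1080p = (1920 * 2) * (1080 * 2)
print(native_4k, ssaa_1080p, native_4k == ssaa_1080p)  # 8294400 8294400 True
```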

You’re thinking of DLSS, which is a completely different form of super-sampling from SSAA.

DLSS is about upscaling an image and modifying it based on an algorithm trained at a much higher resolution.

SSAA is about literally rendering at a higher resolution and downsampling to a lower one, to reduce aliasing.
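
A quick sketch of the two directions, assuming numpy; this is emphatically not DLSS itself (which uses a trained network plus motion vectors), just the data flow of each approach made explicit, with nearest-neighbour standing in for the learned upscaler:

```python
import numpy as np

def box_down(img, f=2):
    """SSAA direction: render big, average down (reduces aliasing)."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def naive_up(img, f=2):
    """Upscaler direction: render small, enlarge. A real upscaler such as
    DLSS replaces this nearest-neighbour step with a neural network that
    tries to reconstruct the missing detail."""
    return img.repeat(f, axis=0).repeat(f, axis=1)

hi = np.random.rand(8, 8)   # stand-in for an expensive high-res frame
lo = np.random.rand(4, 4)   # stand-in for a cheap low-res frame
print(box_down(hi).shape)   # (4, 4): fewer pixels, each better sampled
print(naive_up(lo).shape)   # (8, 8): more pixels, no new information
```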

Here are some resources for you:

If you choose a resolution that is too big and gets sized down for your monitor, the interface becomes quite small and it can be difficult to ping your health or mana bar.

Edit:
Tried a super sampling resolution of +25% (2400 x 1350) and the interface gets automatically scaled up, so you don’t gain anything through the super sampling.