Apple moving to their own chips

No, I meant it has been CUDA vs. OpenCL. Of course it’s Metal now. Remember, Apple did everything it could to kill OpenCL on the Mac side of things. OpenCL is still alive and kicking on the PC side, though. Where CUDA has a huge advantage is in the power of Nvidia’s GPUs. Apple’s got nothing on them that I can see. It’s going to be a tough sell for most pros who aren’t intrinsically married to OS X already.

Doesn’t mean it can’t happen, but it’s an uphill battle. Without solid power out of the gate, Apple has a hell of a fight to win, since Rosetta 2 won’t suffice for the type of creative content producers who require every last ounce of power they can get their hands on. But I’m also not expecting pros to make a decision based on first-gen, low-end transition machines either. They’ll be looking to see what the replacement for the Mac Pro is. If there isn’t one, it’s likely a write-off.

No, I meant that neither Octane nor Redshift support OpenCL on any platform.

Isn’t ‘Retina’ just Apple’s marketing term for displays that have a certain density of pixels per inch? That’s how phones and tablets can be ‘Retina’ and not have “5k” resolution.

The number of pixels doesn’t matter; it’s the density of pixels, and, I suppose, the expected distance your eyes will be from a given screen.

Running a 5k screen at 50% render scale in game renders the game at 1440p, but since the 5k screen is 2880 pixels tall (double 1440), you get 4 physical pixels to represent what would be a single pixel on a 1440p monitor. The image is much sharper as a result.
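The render-scale math above is simple enough to sketch out. This is just illustrative arithmetic, not anything from WoW’s actual settings code:

```python
# Illustrative numbers for the render-scale math described above.
NATIVE_5K = (5120, 2880)  # 5k panel resolution
RENDER_SCALE = 0.5        # 50% in-game render scale

# The game renders at half the panel resolution in each dimension:
render_res = (int(NATIVE_5K[0] * RENDER_SCALE), int(NATIVE_5K[1] * RENDER_SCALE))
# render_res is (2560, 1440), i.e. 1440p

# Each rendered pixel maps to an integer 2x2 block of physical pixels:
pixels_per_rendered = (NATIVE_5K[0] // render_res[0]) * (NATIVE_5K[1] // render_res[1])
# pixels_per_rendered is 4
```

Because the mapping is a clean 2x2 block per rendered pixel, the upscale never has to blend across fractional pixel boundaries, which is why it looks sharper than the same image on a real 1440p panel.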

The desktop is still 5k when you do this. However, the UI settings in the OS allow you to rescale the UI larger or smaller per the user’s preference, the default being an approximation of what 1440p looks like in terms of the size of UI elements. Again, it looks super sharp compared to a 1440p monitor because you have 4 physical pixels representing what would be a single pixel on a 1440p monitor.

Yes, the GPUs in current Macs are underpowered for much of the lineup. That should change when AMD releases their RDNA 2 GPUs, hopefully in the next few months, and Apple gets around to shipping them in their products.

Yep.

Exactly this.

No. “Retina” is the branding that involves a whole collection of technologies. Yes, high PPI is part of it, but it also includes the fact that they render the desktop at a higher resolution than what was selected, then use integer scaling to downscale it so it looks sharper than it normally would.

On retina displays, no matter what resolution you pick, even if you use an app like “EasyRes” to go beyond the resolution options in the display prefs, macOS will always render the desktop at a slightly higher resolution than what you chose and downscale. Even if I were to select my MBP’s “native” resolution of 2880x1800 (which makes the GUI unusably tiny), macOS will actually render the desktop at 3360x2100, then downscale it as a form of AA on top of scaling the UI independently of the screen resolution.
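The render-then-downscale behavior described above can be sketched numerically. The “looks like” resolution and panel size here are illustrative (a 2880x1800 MBP panel), not a dump of macOS internals:

```python
# Sketch of macOS "render at 2x the logical resolution, then downscale".
# Values are illustrative; macOS's real mode list may differ.
PANEL = (2880, 1800)  # physical panel of a 15" MBP

def backing_resolution(logical):
    # macOS renders the desktop at 2x the chosen "looks like" resolution...
    return (logical[0] * 2, logical[1] * 2)

# Choosing a "looks like 1680x1050" scaled mode:
backing = backing_resolution((1680, 1050))
# backing is (3360, 2100) -- rendered larger than the panel itself

# ...then the compositor downscales 3360x2100 to the 2880x1800 panel.
# That fractional 7/6 ratio is handled once, system-wide, not by apps:
downscale_ratio = backing[1] / PANEL[1]  # 2100 / 1800
```

The key point is that apps only ever see the 2x backing store; the fractional downscale to the physical panel (which acts like a form of AA) happens in the compositor.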

If you plug in an external monitor at 2560x1440 or lower, macOS does not enable this retina scaling mode. Anything above that, like 4k, and macOS will enable this whole suite of “Retina” features.

iOS functions the same way. It’s scaling the UI.

This tech in and of itself makes it very difficult for any game dev to make and support games on macOS. As Tia has said repeatedly, no GPU in any current Mac can run games at native screen resolutions at respectable settings. Yes, they can run the desktop and low-intensity apps, but the moment you run a game on an RX 580 at 5k, it just makes the GPU cry. Blizzard is running the game at a LOWER resolution, then upscaling it to match the display. That’s why the game runs acceptably.

Not even the 2080 Ti or Radeon VII can run AAA games at max settings at acceptable frame rates at 4k or 5k. No matter how much you argue that your RX 580 can do it just fine, you’re not playing a AAA game at max settings. If you were, the RX 580 would be running the game at 5-10fps, max. That’s not acceptable.

That plus the aforementioned retina scaling that macOS does on high resolution displays makes it such a nightmare for game devs to bother supporting macOS.

The one exception to this is the 5k iMacs. They’re 4x1440p. Nice and easy.

I understand that the 580X in my 2019 iMac is at best a ~4 year old 1080p card. I don’t think anyone was suggesting that most of the graphics cards Apple has been shipping in their computers can come anywhere near 4/5k ultra 60 fps performance, let alone 1440p. They need RDNA 2 GPUs to become available to have a chance at that kind of performance.

Beyond this hardware limitation, I don’t see what the Retina tech does that makes game support hard.

You and Tia have spoken about the GPU’s being a limitation, and I think everyone is in agreement there. Nobody is really expecting to run at native 5k full ultra 60+ fps with these cards.

The games have to be coded to be “retina aware”, or in “compliance” with Apple’s use of the tech. Otherwise UI elements are too tiny at high resolution because there aren’t additional assets that were created or coded for that allow those elements to be scaled properly. Depending on the number of elements that receive a backing factor, this can be a tedious affair, to say the least. Read the developer info again - Apple suggests that developers create additional assets that are higher resolution in order to “work properly” with the retina scaling tech. That’s additional work, and WoW isn’t a small game.
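The “additional assets” workflow described above can be sketched roughly like this. The asset names, the `@2x` suffix convention, and the lookup logic here are illustrative of the general idea, not Blizzard’s or Apple’s actual code:

```python
# Hypothetical sketch of picking a UI asset variant for a backing scale
# factor. A game that only ships 1x art has nothing better to fall back
# on, which is why its UI ends up tiny or blurry on Retina screens.
ASSETS = {"button.png": [1, 2]}  # scales we shipped art for (illustrative)

def pick_asset(name, backing_scale):
    """Return the best asset variant at or below the backing scale."""
    available = ASSETS.get(name, [1])
    best = max(s for s in available if s <= backing_scale)
    return f"{name[:-4]}@{best}x.png" if best > 1 else name

pick_asset("button.png", 2)  # -> "button@2x.png"
pick_asset("button.png", 1)  # -> "button.png"
```

Multiply that by every icon, frame, and texture in a UI as large as WoW’s and the “tedious affair” part becomes clear: each element needs high-resolution art authored, shipped, and wired up.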

A game that isn’t coded to work with retina scaling will either have super tiny elements at high resolutions and/or super blurry elements at non-native (scaled) resolutions due to how Apple implemented the tech.

I remember constantly hearing what a nightmare that was for Blizzard to even work out. In fact, it’s still not perfect and leads to many complaints. Supporting that hacky tech (and it is VERY hacky) is horrible.

Thank you both for the clarity here.

How is high resolution support of UI elements handled on Windows? Is this just a case of Apple addressing an issue ahead of its peers, or has Apple provided an inelegant solution to the problem?

Isn’t this just a simplified version of bog standard resolution independent UI? In the past, I’ve written code in iOS apps for a handful of totally-from-scratch GPU-drawn UI elements and it wasn’t too difficult… In my case I used Core Graphics, which by default isn’t DPI aware at all and will draw things “tiny”, as Tia mentioned, but fixing that is as simple as multiplying your coordinates, sizes, etc by the scale factor provided by the OS. In other words, as long as you don’t hardcode numbers into your layout/drawing routines (which you shouldn’t be anyways), it’s almost “free”.
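The “multiply by the scale factor” fix described above is language-agnostic, so here it is sketched in Python. The function name is made up for illustration; on iOS the scale factor would come from something like `UIScreen.main.scale` rather than being passed in by hand:

```python
# Illustrative point->pixel conversion: the only DPI-awareness most
# custom drawing code needs is multiplying layout values by the
# OS-provided scale factor instead of hardcoding pixel counts.
def to_pixels(rect_in_points, scale):
    """Convert a layout rect (x, y, w, h) in points to device pixels."""
    x, y, w, h = rect_in_points
    return (x * scale, y * scale, w * scale, h * scale)

# A 44x44-point button on a 2x screen covers 88x88 physical pixels:
to_pixels((10, 10, 44, 44), 2.0)  # -> (20.0, 20.0, 88.0, 88.0)
```

Drawn without the multiply, the same rect would cover only 44x44 physical pixels on a 2x screen, which is exactly the “tiny” rendering mentioned earlier.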

At least for typical desktop apps, UI scaling is easier to implement on macOS than on Windows or Linux because you’re 100% guaranteed to never have to deal with fractional scaling and the text reflowing, layout adjustments, etc. that come with it. You just 2x or 3x whatever you’re drawing and you’re done, because under macOS all user-selectable UI scales are either straight multipliers (e.g. 2560x1440 → 5120x2880) or macOS internally renders the screen at a resolution high enough to fall into a straight multiplier bucket when split.
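The “straight multiplier bucket” idea can be illustrated like this. This is a toy model of the concept, not how macOS actually picks backing stores (in practice Retina backing stores are simply 2x):

```python
# Toy model: for any "looks like" width, find the smallest integer
# multiplier whose backing store meets or exceeds the panel width.
# Apps then draw at that integer scale; any fractional fit to the
# physical panel is the compositor's problem, not the app's.
def backing_multiplier(logical_w, panel_w):
    k = 1
    while logical_w * k < panel_w:
        k += 1
    return k

backing_multiplier(2560, 5120)  # 5k panel "looks like" 1440p -> 2
backing_multiplier(1680, 2880)  # MBP "looks like" 1680 wide -> 2 (3360 backing)
```

Contrast with Windows, where the app itself can be handed a 125% or 150% scale and has to reflow text and layout at that fractional factor.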

If fractional scaling handling is the “hackiness” being spoken about, maybe it’s different for games specifically, but in general devs aren’t supposed to account for that at all. You’re supposed to look at the scale factor provided by the OS and stop there.

Either way, good support for resolution independence is an inevitability for devs of all types. There are already several HiDPI Windows laptops, and within the next 5-10 years the greater bulk of desktop screens will be HiDPI too. Any software burying its head in the sand and pretending that low DPI screens are all that exist will get left in the dust.

As a sidenote, WoW as it exists currently does not have dedicated HiDPI UI graphics. That might change in shadowlands (I’ve seen some extremely crisp looking buttons in alpha screenshots) but in terms of assets, WoW does nothing special for HiDPI screens.


Apple uses a float, not an integer, for its coordinate/scaling system. There is most definitely fractional scaling occurring, especially when the HiDPI mode and/or DSR mode scalar is not exactly 2x or 4x (resolutions are even numbers, so a 3x scalar will involve a fractional (float) value in nearly every case).

It does have HiDPI versions of its icons used in the UI. That’s why they’re able to scale so cleanly as seen in Stoneblades’s and Phine’s screenshots over the years. Under most circumstances 3D scaling is done via integers, not floats. The reason is fairly obvious once you look at the GPU specs and see that float ops occur at 1/16, 1/32, or 1/64th the speed of integer operations. If you’ve ever wondered why retina scaling seems to bump the CPU more than the GPU, it’s because the GPU is still only doing integer ops, while the CPU is doing the float scalars to map pixels (retina scaling is a software tech, not a hardware tech). The irony here is that if implemented via hardware, retina scaling would work better on workstation GPUs vs. consumer/prosumer GPUs.

They were required to completely kill exclusive full screen mode entirely to support Apple’s scaling tech. Supporting both wasn’t an option, and since all of Apple’s Macs use that tech now, it literally forced killing off a feature that was both good for performance and for people like Tia.

They were forced to change how render scale works to again be completely different from the way it was before, and not in a good way, because it had to work with Apple’s nonsense.

They had to comply with other restrictions and changes to make daddy Apple happy.

In some cases they even had to use slower code paths that came at a performance cost because they were required to use certain APIs. Any time you are forced to use Apple APIs, you often have to forgo your own custom code that may have in fact been more efficient, because after all, Apple’s code is universal, but your own code was built specifically with YOUR app in mind, not all apps.

I could go on to even more technical details, but you get the idea. The TL;DR of the situation was that it wasn’t so much allowing developers to just implement something into their code so Apple’s stuff worked with it; it was more a matter of developers having to change their code to work with Apple. That’s always been Apple’s problem, in fact: “our way or no way”. Why do you think Nvidia is now gone :smiley: and other developers aren’t as willing to deal with it?

This only gets worse as time goes on. In fact, even now people complain about performance and stability issues on Mac. Most of those are literally Apple’s fault, but they tend to be a one-way street: demanding developers fall in line, but when developers have issues with Apple’s bad code after doing so, they don’t even fix them in a prompt manner. Or if they do, they don’t release that fix right away for the current OS in a point release; they literally put it in the next major release only. Then what happens? You upgrade to the next major release for those fixes only to find out it broke other stuff, and thus the cycle repeats.

Don’t get me wrong, I love macOS and Apple products, but I’m not always happy with their philosophies because they are often “Apple first” policies. What’s good for Apple isn’t good for consumers or developers.

I think it may be different for app developers vs. game developers. I work full time developing apps for Apple platforms and Android and I’ve had very few bad experiences with macOS/iOS, contrasted to the more “open” Android, which has been a constant thorn in my side.

For the full screen thing, didn’t the Windows version of WoW lose true fullscreen too? I thought the reason the Mac version lost it was because Blizz doesn’t have a dedicated Mac team any more and they just didn’t want to maintain the feature on either platform.


The feature parity part is correct. The irony here is that by going to DX12 instead of Vulkan, Windows lost exclusive fullscreen support, and even though Metal does support it, getting it working with Apple’s “Retina” scaling tech is apparently just too much hassle for no real return on investment (time).

What this means is that I’m probably locked out of WoW forever at this point due to both sides doing their best to kill EFS mode. It’s doubly ironic on the Microsoft side because they advertise and even sell an accessibility controller, yet they removed proper accessibility across the board for disabled players who might not be able to recover from being booted out of a game, because borderless windowed (fullscreen windowed in WoW) mode does not protect against something interrupting an app’s focus.

It could be a Skype notification, Windows Defender trying to get your attention, or just some random badge appearing in the system tray - you’ll get booted out of a non-EFS game instantly. I had that happen with Star Ocean 4 in Windows. The moment the game was interrupted and knocked into regular windowed mode (which is what happens when borderless windowed mode is interrupted), the controller stopped responding and wouldn’t reconnect at all until the game was closed down and restarted. At that point only the mouse worked, which meant getting into combat was guaranteed death due to lack of controls. You think disabled folks that have a game set up for them are going to be able to fix things like that? Nope.

WoW is putting in controller support in 9.0/Shadowlands, but in yet another grand twist of irony, I don’t need that. I already had controller support working. That never stopped working. It was being booted out of the game by the Dock or some random notification or even just the OS farting at me that did the game in for me. Once I’m booted out or if I tab out of the game, the game no longer has focus, and thus my mouse acceleration curve gets broken. It doesn’t return properly when I go back into the game, so just like in Windows I have to restart WoW to get it back. It happened often enough that the game became unplayable for me since that curve was tailored specifically for my deformed arm and its lack of fine motor control.

I’ve lost out on all of BfA and will lose out on Shadowlands as well because of a singular feature being removed. Sad part is, if Blizzard had gone with Vulkan instead of DX12, at least I’d have a path to play in Windows, if not OS X. With built in controller support and Vulkan I’d have my EFS mode back, just like I get in Path of Exile, which now has a very good Vulkan renderer.

Unfortunately I can’t expect Rommax to be able to do all that on top of everything else he’s got to do. Even if I could play WoW with just a controller (once I set it up), it’d be an absolute minimalistic gameplay experience because once you factor in strafe left/right, jump, crouch (which is also swim down), left stick movement, right stick mouselook, you’re not left with a hell of a lot to use for keybinds or opening bags with, even if you’re using something like a Razer Raiju Ultimate.

Disabled gamers just really aren’t a concern with Blizzard on the whole. If they were, Overwatch would have both XInput and DirectInput support, as well as full controller customizability instead of hard-mapped controls like strafe being tied to the left analog stick and no way to remap it.

I guess disabled players’ money has cooties and Blizzard just doesn’t want to touch it. Who knows. Either way, players like me lose.

Not having to deal with Intel driver bugs will be one of the best things about Apple switching to its own silicon.


Yeah, like Apple was ever fast fixing AMD’s driver bugs. It wasn’t until AMD fixed the Desert Sands bug in Diablo 3 on the Windows side that a fix finally came out for Apple’s AMD GPUs. Six months later.

Apple only fixes when they feel like it, and usually, as Omegal noted, only in the next full OS revision, which usually also means half the affected hardware ends up being cut off anyway.

Their support hasn’t been stellar these last few years. I’m setting the bar pretty low just so I’m not disappointed at being disappointed.


That’s because they’re AMD driver bugs.

Instead of having to go through Apple to AMD and waiting for AMD to get back to Apple, it’ll be straight to Apple.

Lemme know when they fix APFS to not suck on everything under the sun. Still waiting for that one.

As for the AMD driver updates, AMD had fixes ready months before Apple even bothered to include them. They were ready. The previous round of Intel driver bugs were also ready on the Intel side long ago. Blizzard ended up having to work around them because Apple refused to include the fixes.

Apple only puts out fixes quickly for hardware that’s currently selling. Even stuff that isn’t vintage yet let alone obsolete stops getting proper updates and fixes the moment the next new hardware comes out. It’s a repeating pattern with them. You honestly think it’s going to be any different with the ARM Macs? Really? Man, do I have a bridge to sell you cheap. You’ll only see super quick fixes during the transition period. Once Apple has completed the transition, they’ll just slide right back to their usual schedule of letting hardware support quietly drop by the wayside well before it should be even remotely considered obsolete. That’s why users fought for expandable tower Macs for so long. At least then they’d be able to stay in the game a lot longer, albeit at a higher overall cost over time.

I can’t believe you think Apple is actually going to change once it has everyone in proprietaryland. That’s the best laugh I’ve had in years…

APFS is great on an actual Mac SSD.
And as a side note to your rant… the AMD Metal drivers are in really good shape.
