Clock for clock, Apple’s ARM CPUs trounce Intel’s and even AMD’s by a very wide margin. The issue most of us have isn’t on the CPU side but the GPU side. And with the latest MBPs, eGPUs are no longer officially supported, indicating Apple is going full-on proprietary with its GPUs and won’t allow users to upgrade or bring their own in the future. That might be what dissuades Blizzard from continuing Mac support after Shadowlands.
Blizzard isn’t to blame there. Apple kept Blizzard in the dark (along with everyone else) on Metal documentation until it was too late. Metal can support Overwatch now, but it couldn’t early on, and by the time Apple finally got on board, the bean counters at Blizzard said “too late, move on”. Combine that with being forced to cut users off because Apple abandons its OSes on a yearly cycle, and it doesn’t paint a rosy picture. But don’t lay this on Blizzard - they didn’t ask for Apple to be the way they are.
These aren’t Qualcomm SoCs. They’re Apple SoCs. ARMv8 designs have made major strides in IPC. And you have to remember, this is RISC, not CISC like Intel/AMD. You can accomplish more per cycle on RISC in most cases. That’s why Marvell’s ARM variants are being used in servers, especially cluster nodes. The current Marvell SoC supports up to 60 cores, with 90 cores expected next year. ARM does scale well.
It’ll get either more performance per watt or more scalability per watt, but whether or not that translates into usable power in games is still up in the air. At desktop resolutions, suffering checkerboarding to reach 4K would be rather unimpressive IMO.
The latest MacBooks to come out do not have eGPUs listed as supported anymore. Unless this is just an accidental omission, Apple looks poised to eliminate eGPU support and force users into their proprietary setups going forward. And unless those GPUs can meet or beat their desktop counterparts, gamers are going to be fuming and game companies will take notice.
Very few actually require extensive recoding. The main issue is where games make use of SSE variants and AVX. If there’s an equivalent in the ARM architecture in use (NEON, in most cases), conversion can be done, though not as trivially as it would be for a game that doesn’t use specialized instruction sets.
I wouldn’t trust numbers coming from the manufacturer itself. Wait for something from AnandTech or Ars Technica. Do not trust Bare Feats. Their testing methodology is pathetic, and they aren’t exactly friendly when confronted about their lack of a standardized bench. They cherry-pick so heavily that they just can’t be trusted.
As for the 1060 comparison, not only is that two generations old, but it barely puts them in comfortable 1080p territory. They might get you 1440p native with some tradeoffs, but 4K or 5K? Not natively. That’s still going to rely on “Retina” gimmickry. Apple has never shipped a GPU that properly handles native 4K on ultra. So color me unimpressed with matching a four-year-old 1060 in 2020/2021.
Apple does the AMD drivers; AMD assigns devs to work with Apple in-house. nVidia refused to do that since they know their hardware best, so Apple booted them out. Classic anticompetitive behavior, but that’s Apple in a nutshell right now.
I won’t be surprised if they never work. If they’re no longer on the supported list now, they aren’t coming back anytime soon. Apple doesn’t want to be beholden to anyone else or to let users have any real control anymore. Whatever Apple may say about privacy rights, they have zero qualms about screwing users over when it comes to controlling their own systems. Apple wants total control.
It has more to do with nVidia refusing, rightfully, to cede total control over their drivers. Apple doing the drivers in-house means nVidia can’t put in new hardware support without Apple’s approval. Major problem. This is all on Apple, not nVidia, otherwise we’d still be getting drivers for existing hardware at least. We aren’t. Thank Apple for that.
Did we somehow forget the massive issues with AMD’s 6750 for the iMac? That required an actual replacement program as well. Bad hardware batches aren’t limited to just nVidia. Cherrypicking that doesn’t change that fact either.
Blizzard had no real choice. They couldn’t implement new feature sets using OpenGL in macOS because Apple had long since deprecated it and was not updating it. To remain on macOS and still implement new feature sets, they had no choice but to move to Metal. And see also: Overwatch and the Metal fiasco.
It will likely require a patch to allow a compute path that doesn’t require SSE extensions specific to Intel/AMD chips. Currently WoW does use SSE and won’t even launch on a CPU that lacks the required SSE extension levels.
Overwatch came down to timing, Apple’s early refusal to give Blizzard documentation and assistance up front, and politics in how it was all handled. Blizzard got burned pretty badly there. Diablo 4 we just don’t know about yet; as for its Mac support, nobody knows.