Newly Announced M1 CPU

Clock for clock, Apple’s ARM CPUs trounce Intel’s and even AMD’s by a very wide margin. The issue most of us have isn’t on the CPU side but on the GPU side. And with the latest MBPs, eGPUs are no longer officially supported, indicating Apple is going full-on proprietary with their GPUs and won’t let users upgrade or bring their own in the future. That might be what dissuades Blizzard from continuing Mac support after Shadowlands.

Blizzard isn’t to blame there. Apple kept Blizzard in the dark (along with everyone else) on Metal documentation until it was too late. Metal can support Overwatch now, but it couldn’t early on, and by the time Apple finally got on board, the bean counters at Blizzard said “too late, move on”. Combine that with being forced to cut off users every year as Apple abandons older OS versions, and it doesn’t paint a rosy picture. But don’t lay this on Blizzard - they didn’t ask for Apple to be the way they are.

These aren’t Qualcomm SoCs. They’re Apple SoCs. ARMv8 designs have made major strides in IPC. And you have to remember, this is RISC, not CISC like Intel/AMD; in most cases you can accomplish more per cycle on RISC. That’s why Marvell’s ARM variants are being used in servers, especially as cluster nodes. The current Marvell SoC supports up to 60 cores, with 90 cores expected next year. ARM does scale well.

It’ll get either more performance per watt or more scalability per watt, but whether that translates into usable performance in games is still up in the air. At desktop resolutions, having to fall back on checkerboard rendering to reach 4K would be rather unimpressive, IMO.

The latest MacBooks no longer list eGPUs as supported. Unless that’s just an accidental omission, Apple looks poised to eliminate eGPU support and force users into its proprietary setups going forward. And unless those GPUs can meet or beat their desktop counterparts, gamers are going to be fuming and game companies will take notice.

Very few actually require extensive recoding. The main issue is where games make use of SSE variants and AVX. If there’s an equivalent on the ARM side (NEON covers most of the same SIMD ground), conversion can be done, though not as trivially as for a game that doesn’t use specialized instruction sets.
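To make that concrete, here’s a minimal sketch (my own illustration, not anything from a real game’s codebase) of the same four-wide multiply-accumulate loop written once with SSE intrinsics for x86 and once with NEON intrinsics for ARM, plus a scalar tail:

```c
#include <stddef.h>

#if defined(__SSE__)
  #include <xmmintrin.h>   /* SSE intrinsics (x86) */
#elif defined(__ARM_NEON)
  #include <arm_neon.h>    /* NEON intrinsics (ARM) */
#endif

/* out[i] += a[i] * b[i], four floats per iteration where SIMD is available. */
static void madd(float *out, const float *a, const float *b, size_t n)
{
    size_t i = 0;
#if defined(__SSE__)
    for (; i + 4 <= n; i += 4) {
        __m128 r = _mm_add_ps(_mm_loadu_ps(out + i),
                              _mm_mul_ps(_mm_loadu_ps(a + i),
                                         _mm_loadu_ps(b + i)));
        _mm_storeu_ps(out + i, r);
    }
#elif defined(__ARM_NEON)
    for (; i + 4 <= n; i += 4) {
        /* vmlaq_f32(acc, x, y) computes acc + x * y in one step. */
        float32x4_t r = vmlaq_f32(vld1q_f32(out + i),
                                  vld1q_f32(a + i),
                                  vld1q_f32(b + i));
        vst1q_f32(out + i, r);
    }
#endif
    for (; i < n; i++)   /* scalar tail handles leftovers (and non-SIMD builds) */
        out[i] += a[i] * b[i];
}
```

For simple cases like this the mapping is mechanical; it’s the hand-tuned, shuffle-heavy SSE code that takes real effort to port.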

I wouldn’t trust numbers coming from the manufacturer itself. Wait for something from AnandTech or Ars Technica. Do not trust Bare Feats: their testing methodology is pathetic, and they aren’t exactly friendly when confronted about their lack of a standardized bench. They cherry-pick so heavily that they just can’t be trusted.

As for the 1060 comparison, not only is that two generations old, it barely puts them in comfortable 1080p territory. They might get you 1440p native with some tradeoffs, but 4K or 5K? Not natively. That’s still going to rely on “Retina” scaling gimmickry. Apple has never shipped a GPU that properly handles native 4K on ultra. So color me unimpressed with matching a four-year-old 1060 in 2020/2021.

Apple does the AMD drivers; AMD assigns devs to work with Apple in-house. nVidia refused to do that, since they know their hardware best, so Apple booted them out. Classic anticompetitive behavior, but that’s Apple in a nutshell right now.

I won’t be surprised if they never work. If they’re off the supported list now, they aren’t coming back anytime soon. Apple doesn’t want to be beholden to anyone else or to let users have any real control anymore. Whatever Apple may say about privacy rights, they have absolutely zero qualms about screwing users over when it comes to controlling their own systems. Apple wants total control.

It has more to do with nVidia refusing, rightfully, to cede total control over their drivers. Apple doing the drivers in-house means nVidia can’t add new hardware support without Apple’s approval. Major problem. This is all on Apple, not nVidia; otherwise we’d still be getting drivers for existing hardware at least. We aren’t. Thank Apple for that.

Did we somehow forget the massive issues with AMD’s 6750 for the iMac? That required an actual replacement program as well. Bad hardware batches aren’t limited to nVidia, and cherry-picking doesn’t change that fact.

Blizzard had no real choice. They couldn’t implement new feature sets using OpenGL on macOS because Apple had long since deprecated it and stopped updating it. To remain on macOS and still ship new feature sets, they had to move to Metal. And see also: Overwatch and the Metal fiasco.

It will likely require a patch adding a compute path that doesn’t depend on the SSE extensions specific to Intel/AMD chips. Currently WoW does make use of SSE and won’t even launch on a CPU that lacks the required SSE extension levels.
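A sketch of what that compute-path selection might look like at startup; the names here are hypothetical, but `__builtin_cpu_supports` is a real GCC/Clang builtin on x86:

```c
#include <stdio.h>

typedef enum { PATH_SCALAR, PATH_SSE42 } compute_path;

/* Hypothetical startup check: prefer the SSE path where the CPU has it,
   fall back to a portable path everywhere else (including ARM, where
   the x86-only check is compiled out entirely). */
static compute_path pick_compute_path(void)
{
#if defined(__x86_64__) || defined(__i386__)
    if (__builtin_cpu_supports("sse4.2"))   /* CPUID-backed builtin */
        return PATH_SSE42;
#endif
    return PATH_SCALAR;
}

int main(void)
{
    printf("using %s compute path\n",
           pick_compute_path() == PATH_SSE42 ? "SSE4.2" : "scalar/portable");
    return 0;
}
```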

Overwatch was a casualty of timing, of Apple’s refusal to give Blizzard documentation and assistance early on, and of politics in how it was all handled. Blizzard got burned pretty badly there. Diablo 4 we just don’t know about yet. All we do know is that Mac support is…well, nobody knows.

WOW! Your depth and breadth of knowledge is impressive. However, I didn’t follow your logic that the lack of eGPU support might dissuade Blizzard from supporting the Mac at all. Besides, this is just the first round of Apple Silicon Macs, and we can’t be sure future Macs won’t support eGPUs.

Why should a lack of eGPU support influence Blizzard since hardly any Mac users use eGPUs?

Apple generally doesn’t remove support for something and then bring it back. That’s almost never happened. They might be forced to bring it back if it turns out their GPUs are utter dung, so who knows. But odds aren’t exactly in favor of that returning if it isn’t present in this batch of hardware. They’ve had plenty of time to enable eGPU support for their SoCs.

There’s also the possibility that AMD drivers simply haven’t been ported yet and were a lower priority than getting Big Sur out the door, since the first M1 Macs don’t have discrete GPUs and eGPU users are somewhat niche. I guess time will tell, though.


Except that Apple hasn’t “removed” eGPU support for M1 Macs. They just haven’t “added” it yet. It’s a new computer. It’s like saying the M1 Macs have “removed” support for three monitors. They just haven’t added that feature yet.


I wasn’t; AMD took some of the fiscal load. Nvidia refused to even acknowledge there was an issue and basically told Apple “Sue us”.

I’ve read the architecture breakdown. I personally don’t buy it. Keeping a core that wide (M1 decodes eight instructions per cycle) fed will be a challenge at any reasonable clock speed. But we’ll see.

My honest guess is that they didn’t want to deal with that part of the Thunderbolt spec in this early a revision. This is the first time they’ve shipped a Thunderbolt controller they built in-house. It might be capable of it; it might not. I’m not even sure these chips technically support PCI-E at all. The block diagram I’ve seen doesn’t show it. Apple claims it exists on the chip in the form of PCI-E 4, but… they may have only put in enough lanes for the NVMe storage and called it done. Note that none of the models shipped support 10Gb Ethernet, something that would have required a 4x connection.


It wouldn’t surprise me at all if that were the case. The first Intel Macs had their own weirdnesses like that.

They’re going to have to add those features when/if they get around to the Mac Pro and iMac Pro, though, and arguably even the normal 27" iMac. There’s also that rumored “half-size Mac Pro” that would probably have at least a single PCI-E slot.

Call me cynical… but I think it’s more “if” than “when”. Apple has largely given up on the workstation market. Most of the folks the Mac Pro used to satisfy have long since moved on to other workstations. The current Mac Pros were made with specific customers in mind. I expect any new Mac Pro will be the same… focused solely on Pro Tools and Final Cut users.

That’s certainly possible, though the iMac Pro is aimed more at mid-to-high-end prosumers and devs who want something a bit better than a standard iMac than at hardcore Pro Tools and Final Cut users, which I think is evidence they haven’t given up on that segment just yet.


Yeah, I’ll believe it when I see it.

Mine’ll arrive the first week of December. I suppose the smart thing to do is make sure WoW runs on it before I buy Shadowlands…


I think someone explained that WoW uses some Intel instructions that currently aren’t supported by Rosetta 2, Apple’s x86 translation layer, which means it won’t run without some work by Blizzard. But apart from a few Apple-bashers, most people think Blizzard will make it work eventually.
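For what it’s worth, Apple documents that Rosetta 2 doesn’t translate AVX instructions, and it also documents a sysctl a process can use to detect that it’s running translated (for example, to steer around unsupported code paths). A minimal macOS-only check, assuming nothing beyond Apple’s documented `sysctl.proc_translated` key:

```c
#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if this process runs under Rosetta 2 translation,
   0 if native, -1 if the key doesn't exist (older macOS). */
static int running_under_rosetta(void)
{
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
        return -1;
    return translated;
}

int main(void)
{
    printf("proc_translated = %d\n", running_under_rosetta());
    return 0;
}
```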

10GbE only requires an x2 link at PCIe 3.0 spec, and can run happily on an x1 link at PCIe 4.0. Remember, 10 Gbit/sec = 1.25 GB/sec, well within what an x2 link can deliver even after encoding overhead.
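Here’s the back-of-the-envelope math for anyone who wants to check it, assuming only the published line rates and the 128b/130b encoding both generations use (real-world throughput loses a bit more to packet overhead):

```c
#include <stdio.h>

int main(void)
{
    /* Usable GB/s per lane = (GT/s) * (128/130 encoding) / 8 bits-per-byte. */
    const double gen3_lane = 8.0  * (128.0 / 130.0) / 8.0;  /* ~0.985 GB/s */
    const double gen4_lane = 16.0 * (128.0 / 130.0) / 8.0;  /* ~1.969 GB/s */
    const double ten_gbe   = 10.0 / 8.0;                    /* 1.25 GB/s   */

    printf("PCIe 3.0 x2: %.2f GB/s\n", 2 * gen3_lane);  /* comfortably > 1.25 */
    printf("PCIe 4.0 x1: %.2f GB/s\n", 1 * gen4_lane);  /* also > 1.25        */
    printf("10GbE needs: %.2f GB/s\n", ten_gbe);
    return 0;
}
```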

Looks like benchmarks are starting to show up. I’m wondering how it will run on an 8GB vs. a 16GB model.

I assume the game will need 16 for the best performance.

MacRumors has a comparison. I can’t post a link.

https://browser.geekbench.com/v5/cpu/search?utf8=✓&q=Apple+Silicon

A good starting point.


There’s a video here of Baldur’s Gate 3 running on “ultra”. But perhaps of more interest is a ways down in the comments, where people discuss the demands of BG3 vs WoW.

https://www.reddit.com/r/macgaming/comments/jshr85/baldurs_gate_3_running_on_apple_m1/

Mine is arriving with 16GB of memory on December 4th.

I’m in that thread!

Apple does a fantastic job in their presentations of showcasing a game that runs awfully while talking about it as though it’s revolutionary.

When ARM-based Macs were announced, they showed Shadow of the Tomb Raider running at 1080p on what appeared to be low settings. At this week’s event, we briefly saw Beyond a Steel Sky running badly, and also Baldur’s Gate 3 at 1080p on apparently ultra settings.

All were clearly running at sub-30 fps, and even in their highly edited presentation, poor frame pacing and stutter were obvious.

Air or Mini? Please post your findings :smiley: