Apple moving to their own chips

Actually, they probably will support AMD eGPUs. But if WoW only supports eGPUs, that will be a bust. Blizz needs to support Apple Silicon and its SoC GPU.

The latest Shadowlands alpha has strings referencing an ARM64 build and macOS 11. Seems like WoW is going to be a day-one Apple Silicon native title!

Thanks so very much for posting this! This is great news! I certainly hope that Blizzard does port World of Warcraft to Apple silicon.

What resolution were you running?

I went from an Acer 7100 with an AMD 370 to a Mac mini with a Sonnet puck, and I'm just not happy with the frame rates. I kinda regret coming back to the Mac when my $1K setup can't match what I paid $399 for.

My screen is 2560x1440, and at graphics preset 5 (recommended) I'm in the mid-30s.

My CPU usage is sitting low, but my GPU is hovering at 75-80%.

The 560 isn't really cut out for that resolution. Turn off ambient occlusion, drop water to fair and shadows to low, and you might be able to boost it up a bit. But you're GPU bound there.
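
A back-of-envelope sketch (Python, purely illustrative; it assumes frame rate scales inversely with pixels pushed when fully GPU bound, which is only a rough approximation) of how much the resolution alone explains:

```python
# Rough check: how much more work is 2560x1440 than 1920x1080 for a
# GPU-bound game? Assumes FPS scales inversely with pixel count, which
# is a first approximation only.
pixels_1440p = 2560 * 1440   # ~3.69 M pixels per frame
pixels_1080p = 1920 * 1080   # ~2.07 M pixels per frame

ratio = pixels_1440p / pixels_1080p
print(f"1440p pushes {ratio:.2f}x the pixels of 1080p")    # ~1.78x

fps_observed = 35   # the mid-30s reported on the 560 at 1440p
print(f"naive estimate at 1080p: ~{fps_observed * ratio:.0f} FPS")   # ~62
```

By that naive math, the same class of card at 1080p would land around 60 FPS, which goes some way toward explaining why the old 1080p setup felt smoother.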

1920x1080. It’s a fairly old Acer. The main setting I had to play with was antialiasing; I basically went to preset 10, turned water effects down one notch, and turned antialiasing down several notches (I don’t have the game in front of me, so I don’t remember exactly where I left things). I wound up at around 45+ outdoors and 30+ in raids, which I consider reasonably smooth (having grown up with 24FPS motion pictures).

Come to think of it, it also worked quite nicely with a 30FPS fixed rate on a Samsung Frame, but I don’t recall what resolution I set.

Reading old reviews of the iPad Pro 11" w/A12X makes me wonder what the onboard graphics for these new Macs will be like. Apple claimed on stage that the A12X's GPU was about as powerful as that of an Xbox One S, and tests seem to back that up (as far as is feasible, given testing limitations). If a 2018 ~8W TDP tablet chip can swing that, what are they going to be able to do in an MBP or iMac?

I have doubts that Apple Silicon integrated graphics can compete with AMD/Nvidia's mid-to-high-end discrete offerings, but if it can start nipping at the heels of their current low-to-mid-range cards, that'd be a pretty big deal. Even entry-level Macs would have decent GPUs, and it would no longer be necessary to spend the extra $$$ on higher-end configs just to get good graphics performance and longevity.

Apparently, while running under Rosetta 2 translation, the A12Z's GPU benchmarks pretty much equivalent to the integrated graphics of the Ryzen 5 4500U and the Core i7-1065G7. I assume it would do better with a native test.

All Apple chips so far are fantastic constructions capable of operating in a tablet/phone at their max boost clocks for about 1.5 seconds before hitting thermal limits.

There is no information yet (and that Shadow of the Tomb Raider demo from WWDC doesn't convince me) on what these chips can do under properly controlled and managed thermal conditions.

Seeing benchmarks is exciting, but we shouldn't place too much emphasis on the results, because Geekbench (or whatever they're using) isn't an accurate representation of how a chip will perform in a realistic workload.

Yes there is.

https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-unveiling-the-silicon-secrets/4

https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/4

There’s also AnandTech’s review of the A12X iPad Pro, where they do a bunch of testing including a Civilization VI test, which is reasonably “real world”.

[quote="Chroesire-bleeding-hollow, post:52, topic:563299"]
There's also AnandTech's review of the A12X iPad Pro
[/quote]

It should be noted that the iPad Pro was running in Retina mode, which renders at the original, lower resolution and merely pixel-doubles it. In other words, it was equivalent to native resolution at 50% render scale in terms of GPU usage. And 27 FPS is hardly astounding at that resolution, especially when the MX150 can score 45 FPS.

I'm sure it will scale up better in a desktop unit that can use its chassis as a heatsink, giving the heat more surface area to dissipate across, but to call an iPad Pro's GPU an "Xbox One S" class GPU is really stretching things. A lot. It isn't. The XB1S easily outperforms it at 1080p, and the A12Z can't even do 1080p natively at respectable framerates. Even the Intel IGP handled it better: despite managing only 21 FPS, it did so at 1440p native.

I’m sorry, but all that test showed was that the iPad Pro’s GPU could play games at a minimally acceptable framerate. It couldn’t handle anywhere near what we expect in modern games. And it won’t unless Apple applies serious thermal cooling to it and is able to scale it up to a desktop GPU size.

It isn't ready for prime-time gaming on real AAA games yet, only iOS games. And even then it is still far behind any modern GPU. Don't expect WoW to play much better than it does on an IGP in these first-gen Apple Silicon Macs.

Unless I’m misreading, they found the game’s config file and switched on full resolution rendering.

If you open the uncompressed screenshot they provide, the 3D rendered bits don’t show signs of pixel doubling or other forms of scaling.

Also, the iOS version of Civ VI is hard-capped at 27 FPS, presumably so it's playable on dramatically less powerful iPads (like the 2014 models that are still supported). If the A12X runs it at 100% render scale with no discernible FPS difference from 50% render scale, it can likely push considerably higher framerates.

I'm not saying don't be skeptical; skepticism is absolutely called for here. I'm optimistic, though.

Read the resolution again in that graph. It’s 2224x1668@2.0. It’s pixel doubled. The rest are at 1.0, or native. All Retina resolutions are pixel doubled.

That reads to me as UI scaling, like how 15" MBPs run 1440x900 @ 2x by default.

It’s pixel doubling, regardless. Whether it’s done via UI scaling or the OS, it’s still pixel doubling, meaning the GPU, other than the doubling function, is still only pushing 1/4th the pixels vs. the native resolution.
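
For concreteness, here is the arithmetic behind the "1/4th the pixels" claim under that pixel-doubling reading of "2224x1668 @ 2.0" (a sketch only; the reply below disputes that this is what the scale factor means):

```python
# Pixel counts under the "pixel doubled" reading of 2224x1668 @ 2.0:
# the scene renders at the logical resolution, then gets scaled 2x in
# each dimension to fill the physical panel.
physical = 2224 * 1668                # ~3.71 M pixels on the panel
rendered = (2224 // 2) * (1668 // 2)  # ~0.93 M pixels actually drawn

print(f"panel: {physical:,} px, rendered: {rendered:,} px")
print(f"factor: {physical / rendered:.0f}x")   # 4x -> "1/4th the pixels"
```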

What? A Mac pushing 1440x900 @ 2.0 isn’t pixel doubling, it’s rendering full 2880x1800 at a higher DPI.

Ah yes, DSR. Still less work than native in any case. Not impressive.

Remember, though, that this is two generations older than the chip that'll actually go into the first consumer models. The A13's GPU performance was about 60% better than the A12's (I'm not sure where the A12Z falls in there). What'll happen with the A14 remains to be seen, but I'd expect some improvement over the A13.
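
Purely as an illustration of how those gains would compound (the 60% figure is the one quoted above; the repeat-sized A14 jump is a hypothetical, not anything Apple has announced):

```python
# Hypothetical compounding of generational GPU gains, indexed to A12 = 1.0.
a12 = 1.0
a13 = a12 * 1.60         # ~60% uplift, per the post above
a14_guess = a13 * 1.60   # IF the A14 repeated that jump (pure guess)

print(f"A13 vs A12: {a13:.2f}x")                 # 1.60x
print(f"A14 vs A12 (guess): {a14_guess:.2f}x")   # 2.56x
```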