Actually, they probably will support AMD eGPUs. But if WoW only supports eGPUs, that will be a bust. Blizz needs to support Apple Silicon and its SoC GPU.
The latest Shadowlands alpha has strings referencing an ARM64 build and macOS 11. Seems like WoW is going to be a day-one Apple Silicon native title!
Thanks so very much for posting this! This is great news! I certainly hope that Blizzard does port World of Warcraft to Apple silicon.
What resolution were you running?
I went from an Acer 7100 with an AMD 370 to a Mac mini with a Sonnet puck, and I am just not happy with the frame rates. I kinda regret coming back to the Mac when my $1K setup can't match what I paid $399 for.
My screen is 2560x1440, and with settings at 5 (recommended) I am in the mid-30s.
My CPU usage is sitting low, but my GPU is hovering at 75-80%.
The 560 isn't really cut out for that resolution. Turn OFF ambient occlusion, turn water down to Fair and shadows to Low, and you might be able to boost it up a bit. But you're GPU-bound there.
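If you'd rather flip those without digging through the menus, WoW exposes them as CVars you can set from the chat box. The names below are from the post-Legion graphics split, but the value-to-label mapping shifts between patches, so treat this as a rough sketch and sanity-check against the System menu:

```
/console graphicsSSAO 0
/console graphicsLiquidDetail 1
/console graphicsShadowQuality 1
```

Here 0 should kill SSAO outright, and the 1s roughly correspond to Fair water and Low shadows.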
1920x1080. It's a fairly old Acer. The main setting I had to play with was antialiasing; I basically went to preset 10, turned water effects down one notch, and turned antialiasing down several notches (I don't have the game in front of me, so I don't remember exactly where I left things). I wound up at around 45+ outdoors and 30+ in raids, which I consider reasonably smooth (having grown up with 24FPS motion pictures).
Come to think of it, it also worked quite nicely with a 30FPS fixed rate on a Samsung Frame, but I don't recall what resolution I set.
Reading old reviews of the iPad Pro 11" w/A12X makes me wonder what the onboard graphics for these new Macs will be like. Apple claimed on stage that the A12X's GPU was about as powerful as that of an Xbox One S, and tests seem to back that up (as far as is feasible, given testing limitations). If a 2018 ~8W TDP tablet chip can swing that, what are they going to be able to do in an MBP or iMac?
I have doubts that AS integrated can compete with AMD/Nvidia's mid-high discrete offerings, but if it can start nipping at the heels of their current low-mid cards, that'd be a pretty big deal. Even entry-level Macs would have decent GPUs, and it would no longer be a necessity to spend the extra $$$ for higher-end configs just to have good graphics performance and longevity.
Apparently, while running under Rosetta emulation, the A12Z's GPU performs roughly on par with the integrated graphics of the Ryzen 5 4500U and the Core i7-1065G7. I assume it would be better with a native test.
All Apple chips so far are fantastic designs, but in a tablet/phone they are only capable of operating at their max boost clocks for about 1.5 seconds before hitting thermal limits.
There is no information yet (and that Shadow of the Tomb Raider demo from WWDC doesn't convince me) on what these chips can do under properly controlled and managed thermal conditions.
Seeing benchmarks is exciting, but we shouldn't place too much emphasis on the results, because Geekbench (or whatever they're using) is not an accurate representation of how a chip will perform in a realistic workload.
Yes, there is.
https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-unveiling-the-silicon-secrets/4
https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/4
There's also AnandTech's review of the A12X iPad Pro, where they do a bunch of testing including a Civilization VI test, which is reasonably "real world".
[quote="Chroesire-bleeding-hollow, post:52, topic:563299"]
There's also AnandTech's review of the A12X iPad Pro
[/quote]
It should be noted that the iPad Pro was running in Retina mode, i.e., rendering at the original lower resolution and merely doubling it. In other words, it was equivalent to native resolution at 50% render scale in terms of GPU usage. And 27 FPS is hardly astounding at that resolution, especially when the MX150 can score 45 FPS.
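To put rough numbers on that: 2224x1668 is about 3.7 million pixels, while rendering at half scale (1112x834) is about 0.93 million, so if the doubling claim holds, the GPU was only filling roughly a quarter of the pixels it would at native resolution.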
I'm sure it will scale up better in a desktop unit that can utilize its chassis as a heatsink, thus providing more surface area for heat to dissipate, but to call an iPad Pro's GPU an "Xbox One S"-class GPU is really stretching things. A lot. It isn't. The XB1S can easily handle 1080p better than that, which means that no, the A12Z is not an XB1S-class GPU, since it can't even do 1080p natively at respectable framerates. Even the Intel IGP handled it better: despite scoring only 21 FPS, it did so at 1440p native.
I'm sorry, but all that test showed was that the iPad Pro's GPU could play games at a minimally acceptable framerate. It couldn't handle anywhere near what we expect from modern games. And it won't unless Apple applies serious cooling to it and is able to scale it up to desktop-GPU size.
It isn't ready for prime-time gaming on real AAA games yet, only iOS games. And even then it is still far behind any modern GPU. Don't expect WoW to run much better than it does on an IGP on the first-gen ARM Macs.
Unless I'm misreading, they found the game's config file and switched on full resolution rendering.
If you open the uncompressed screenshot they provide, the 3D-rendered bits don't show signs of pixel doubling or other forms of scaling.
Also, the iOS version of Civ VI is hard-capped at 27 FPS, presumably so it's playable on dramatically less powerful iPads (like the 2014 models that are still supported). If the A12X runs it at 100% render scale with no discernible FPS difference from 50% render scale, it can likely push considerably higher framerates.
I'm not saying don't be skeptical; skepticism is absolutely called for here. I'm optimistic, though.
Read the resolution again in that graph. It's 2224x1668@2.0. It's pixel doubled. The rest are at 1.0, or native. All Retina resolutions are pixel doubled.
That reads to me as UI scaling, like how 15" MBPs run 1440x900 @ 2x by default.
It's pixel doubling, regardless. Whether it's done via UI scaling or by the OS, it's still pixel doubling, meaning the GPU, aside from the doubling step itself, is still only pushing 1/4th the pixels vs. the native resolution.
What? A Mac pushing 1440x900 @ 2.0 isn't pixel doubling, it's rendering full 2880x1800 at a higher DPI.
Ah yes, DSR. Still less work than native in any case. Not impressive.
Remember, though, that this is two generations older than the chip that'll actually go into the first consumer models. The A13's GPU performance was about 60% better than the A12's (I'm not sure where the A12Z falls in there). What'll happen with the A14 remains to be seen, but I'd expect some improvement over the A13.
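(Purely hypothetical math: if the A14 added, say, another 30% on top of the A13's gain, that would compound to roughly 1.6 × 1.3 ≈ 2× the A12's GPU throughput. The 30% is a made-up placeholder, not a leak.)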