Oh, for temp and noise: the Mac Studio is sitting on my desk and I can’t hear it – at all. CPUs are running at 58°C average and the GPUs at 64°C average.
Thank you very much!
It seems like the Mac Studio M1 Max with the 32-core GPU upgrade for $200 is the best bang for your buck. The Ultra just doesn’t scale as well: you are paying double the price but only getting about 20% gains.
Agreed, the 32 core is probably the best bang for your buck.
MaxTech just posted a video on YouTube where they compared the 32-core Max and the 64-core Ultra in WoW.
The performance gain is significantly higher than 20%, BUT keep in mind they “tested” in a starter zone with no players around, at 67% resolution scale and 3413x1920, with settings that weren’t optimized. And while they tested on a Studio Display, they had to use this odd resolution because their M1 Max was in a MacBook Pro.
Most people using the Studio Display (or any 5K monitor) will likely use a different render scale with fidelity super resolution turned on.
Besides, the Studio Display (like most 5K screens) is only 60 Hz, so there’s not much point in getting the Ultra for 180 FPS anyway, unless you use a high-refresh-rate gaming monitor. But those are natively 1440p in most cases, which the 32-core Max handles well enough.
Long story short: is the Ultra worth all that extra money? I don’t think so; the 32-core Max will be fine in most cases if you optimize the settings.
The 64-core GPU version is clearly performing way better than the 32-core one. I am deciding between the 32-core and the 48-core, and there seems to be so little difference that it isn’t worth that amount of money (here in the EU the 32-core Max is actually $360 more expensive than in the US).
I am replacing a 5K iMac, so I am ordering a Studio Display as well, which means I am limited to 60 Hz (I can still use my older 100 Hz display), and for refresh rates like that I feel the Max is more than enough.
I have seen people playing at decent graphics settings on the Pro chip, so I am probably going for the 32-core Max.
That would definitely wildly swing it toward GPU. It goes back to what I said: it DOES have 2x the GPU power. The problem is it does NOT have 2x the CPU power. It has 2x the cores, but they are literally adding NOTHING to WoW, because the game barely uses the cores a base M1 has, let alone a Max or Ultra. As such, the instant you are in a more demanding area, in a fight, or have lots of players around, those GPU cores you have sit idle waiting for the CPU to catch up.
So unless you’re only going to be playing Classic full time and plan on pushing a 240 Hz 4K screen in these much lower-demand areas of the game, you’re not going to see the benefit of those GPU cores. At least not enough to justify adding $2k to your purchase.
Now, if you are getting the Ultra for more than gaming, such as professional work like editing, design, or compute, you instantly get far more value out of the Ultra over the Max…
It’s just really important not to set expectations too high for gaming. It’s not just WoW: most games won’t be able to keep 64 GPU cores busy when literally anything is going on, because the rendering pipeline is likely not very multi-threaded and barely even uses 2 of those 20 CPU cores you have. WoW is probably one of the most multi-threaded Mac games out there, and it still probably only really efficiently uses 2 primary cores, with a few sub-tasks sent to other cores (up to about 8), but not nearly enough work is split off from the two primary threads yet.
Now, if WoW ever finds time to multi-thread further, such as getting the UI running on its own thread instead of the same thread as rendering, plus other improvements, then more CPU bottlenecks can be removed and more work can be tasked to more GPU cores… but there is no telling when any of that will happen.
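To make that bottleneck concrete, here is a back-of-the-envelope sketch in Python: a frame takes as long as the slower of the CPU and GPU work for that frame, so once the CPU dominates, halving the GPU time (i.e. doubling GPU cores) buys you nothing. The millisecond figures are invented purely for illustration, not measurements of WoW.

```python
# Toy model: effective frame time is limited by whichever of CPU or GPU
# takes longer per frame. Timings are made up for illustration only.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the faster unit has to wait for the slower one."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Quiet starter zone: GPU-bound, so doubling GPU throughput nearly doubles FPS.
print(fps(cpu_ms=4.0, gpu_ms=8.0))   # "32-core"  -> 125 FPS
print(fps(cpu_ms=4.0, gpu_ms=4.0))   # "64-core"  -> 250 FPS

# Crowded city or raid fight: CPU-bound, so the extra GPU cores just sit idle.
print(fps(cpu_ms=14.0, gpu_ms=8.0))  # "32-core"  -> ~71 FPS
print(fps(cpu_ms=14.0, gpu_ms=4.0))  # "64-core"  -> ~71 FPS, no gain
```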
Thanks for the awesome feedback on this topic. I do just want to follow up on @Valdomir’s claims. Yes, MaxTech did test those settings in the starter zone at the MacBook Pro resolution at the start of the video, but they did turn the resolution up to 5K towards the end of the WoW test just to see what FPS they could hit, and they hit around 120 FPS with everything maxed out, which is definitely nice.
I am personally using a 2020 27" 5K iMac with the 8-core CPU, the 5700 XT GPU with 16 GB of VRAM, 128 GB of RAM, and a 2 TB SSD, and when I went to do this exact same test in the exact same spot in Durotar I got 45 FPS with everything maxed out, so honestly it is a nice improvement overall.
The only downside to all of this is the $$$. For the amount of money you would need to spend on a Mac Studio plus a monitor, you might be better off buying a gaming PC along with a base model Mac Studio to satisfy both needs if your workflow depends on having a Mac. Personally I just like macOS so much better than Windows that I’m willing to lose money and take the hit instead of buying a gaming PC, but my personal choice is definitely not cost effective. Just wanted to add my two cents on this.
That’s interesting; I had no idea it worked that way on the CPU side of things. Now, perhaps a silly and purely theoretical question: suppose you’re rendering something and, while waiting for that, you play some WoW. Or you do something else in the background (or on another monitor) that uses extra CPU power while you’re in WoW. Does that mean those extra cores will actually be used on the Max or Ultra? Or does it not work like that?
Aye, I’m in the same situation, more or less, looking to replace a 2017 iMac 5K (i7 4.2 GHz and Radeon Pro 580). While it does run WoW alright (60 FPS, settings on 7, 50% render scale, fidelity super resolution on), it’s noisy and runs hot all the time. Not to mention the power it’s drawing. That Studio seems a nice improvement.
I considered the base M1 when it came out, but I’m glad I waited. On the GFXBench website, the base M1 GPU Metal benchmarks (1440p gaming) were equal to or slightly lower than those of my iMac 5K.
Same here. I’ve been using Macs since the 90s (Performa 630!) and even though I quit professional photo editing a while ago, I can’t be bothered to switch to Windows, or to buy an extra PC just for gaming, since WoW is literally the only game I occasionally play.
If you have something that can scale to many cores then yes, they will be used, but only specific workloads scale that well. Still, if you use multiple apps at once, each one will need some resources, so pick the specs that match what you actually want to do on the machine.
Simple synthetic benchmarks are not a good measure of actual gaming performance. Still, the base M1 performs similarly to Vega 8 and Xe G7 integrated graphics in gaming, and better in content creation.
I just received my Mac Studio with the 32-core GPU and I’d love it if someone could share the optimal settings for this configuration.
Probably depends on the display. Generally speaking, if the display is 4K or better, set the resolution scale to 50% and, under advanced settings, select fidelity super resolution as the upscale technique so you still get a sharp and clear image.
From there, probably turn compute effects off. For anti-aliasing you’ll probably want CMAA for low cost, or MSAA 4x for an even clearer/sharper image, especially if you aren’t shooting for above 60 FPS. I’d use MSAA 4x until CMAA2 is improved and more stable (it exists in the client but is kind of crashy and over-demanding at the moment, so it’s only accessible via cvar).
Would you please share how it performs on your Mac?
W00T another Performa user!
I had the Performa 640 (a 630 with a 486DX2/66 daughter card). It wasn’t my first Apple, but it’s still one of my favorites.
On the recommended preset of 8, with high textures, dynamic spell density, ultra shadows, good liquid detail, ultra particle density, ultra SSAO, high depth effects, compute effects disabled, and high outline mode, I’m getting a steady 60 FPS in most areas, though certainly not in parts of Oribos.
I should also mention this is on the new Studio Display at 2928 x 1902.
Ugh! Making me feel like I need that Alienware 21:9 at some lower res to get better numbers D: I was gonna go with that combo.
Thanks. Are you going to play at this resolution and these settings, or lower it to maybe 1440p? I think you can achieve a much higher FPS at 1440p.
Thanks for sharing. Couple of questions/suggestions:
You may get better performance with the render scale on 50% (2560 x 1440) and fidelity super resolution on (you won’t really see the difference); in my experience this scales the best for 5K screens.
I assume FPS is capped at 60 anyway because of the Studio Display. So while these numbers seem low, people would probably get more FPS on a higher-refresh-rate display with the right cable.
How does it perform in Ardenweald? And Elysian Hold?
Do you think you could play at 2560x1440 just for a while to see what kind of FPS you get in Zereth Mortis/Oribos? That would be super helpful ^^
There is almost no reason to actually lower the resolution instead of lowering the render scale with fidelity super resolution, and there is almost no reason to play at native resolution either. You output at native resolution while rendering at 1080p or 1440p, so you get close to the visual quality of native at the render cost of only 1080p/1440p. That’s where those smooth 4-5K, 120 FPS numbers come from.
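For anyone wondering what that actually saves, here is a minimal sketch of the pixel math in Python (the panel size is the 5120 x 2880 of a Studio Display / iMac 5K, and the scales are just the values mentioned in this thread). Render scale applies per axis, so 50% means the GPU shades only a quarter of the pixels before the image is upscaled back to native output.

```python
# Pixel-count math for render scale on a 5K (5120 x 2880) panel.
# Render scale is applied to each axis, so 50% scale = 25% of the pixels.

NATIVE_W, NATIVE_H = 5120, 2880  # Studio Display / iMac 5K

def render_pixels(scale: float) -> int:
    """Pixels actually rendered before upscaling back to native output."""
    return int(NATIVE_W * scale) * int(NATIVE_H * scale)

for scale in (1.0, 0.67, 0.5):
    px = render_pixels(scale)
    print(f"{scale:.0%} scale -> {px / 1e6:.1f} M pixels "
          f"({px / render_pixels(1.0):.0%} of native shading work)")

# Output:
# 100% scale -> 14.7 M pixels (100% of native shading work)
# 67% scale -> 6.6 M pixels (45% of native shading work)
# 50% scale -> 3.7 M pixels (25% of native shading work)
```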
For what it’s worth, and I realize it’s a tiny slice of WoW players, for my 71-year-old eyes the Retina 5K at full native resolution is what makes the game experience for me, and that is much more important than FPS. The better resolution and contrast make the experience much more enjoyable.
If the 32-core-or-higher M1 Max/Ultra Studio plus Studio Display experience adds FPS to the same (or slightly better) graphics, I would definitely check my bank balance to see if I could afford an Ultra-based machine. Not for current performance, but for sometime in the next 3-5 years, when Intel and everyone else will be following Apple’s lead and producing consumer chipsets with similar characteristics. Then and only then would I expect more threads to be added by Blizz. But since I buy machines with future as well as current performance in mind, it’s an interesting if pricey gamble.