Performance Optimization with Apple Studio Display

Hello!

I’ve been playing WoW comfortably on my MacBook M3 Pro, keeping a consistent 60 FPS on its 15" screen…but I recently upgraded to an Apple Studio Display, which ended up being much bigger than expected, and I’ve been struggling to find settings that let the game run smoothly at its 5K resolution. I feel like I’ve bitten off more than I can chew.

If anyone has any recommendations for settings with the ASD, please help me out! :grinning:

You’ll basically need to use FSR at 5K. For that display I’d suggest the 50% setting, since then it renders at a 1440p base resolution and upscales to 5K.

In the graphics settings, under Advanced, set Resample Quality to FidelityFX Super Resolution and Render Scale to 50%.
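
If it helps, here’s a quick sanity check of what base resolution each render scale works out to on the Studio Display. This is just back-of-the-envelope math assuming the full 5120x2880 native resolution; the game may round slightly differently:

```python
# Quick check: base render resolution for a given render scale on a 5K panel.
NATIVE_W, NATIVE_H = 5120, 2880  # Apple Studio Display

def base_resolution(scale: float) -> tuple[int, int]:
    """Resolution the game actually renders at before upscaling to native."""
    return round(NATIVE_W * scale), round(NATIVE_H * scale)

for scale in (0.50, 0.66, 0.75, 1.00):
    w, h = base_resolution(scale)
    print(f"{scale:.0%}: {w}x{h}")
# 50%: 2560x1440 -> 1440p base, FSR upscales to 5K
# 75%: 3840x2160 -> a native 4K worth of pixels
```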


Thanks for the response! I’ll try this out.

Are there any other game settings you’d recommend lowering for a smooth experience? What can I leave higher so the game looks decent?

I run a Studio Display on an M3 Max MacBook Pro. It runs quite well once you have the right settings. Here are the settings I use (there’s also a rough Config.wtf sketch after the lists if you’d rather set some of this outside the menus).

For all macOS users:

  • Vertical Sync: Enabled
  • Anti-Aliasing: FXAA High (you could also try None to improve performance)
  • Start by setting Base Game Quality to 7, then adjust the individual settings below.
  • Liquid Detail: Fair (very important for all macOS users)
  • SSAO: Good
  • Compute Effects: Good (or lower) (very important for macOS users)
  • Spell Density: Most
  • View Distance: 7
  • Environment Detail: 4
  • Ground Clutter: 10 (you may want to lower this on non-Max chips)

Important Settings for Studio Display:

  • Resolution: 5120x2880
  • Render Scale: 75% (non-Max chip users may want to go lower)
  • Resample Quality: Bilinear (I prefer this as it performs better and looks almost as good, but you can also try FidelityFX)
  • Max Foreground FPS: 60 (without a cap there’s a bug where you get a stutter after looking around; capping it fixes that)
  • Max Background FPS: 8 (helps a bit when the game is in the background, e.g. while browsing the web)
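
If you’d rather not click through the menus, the framerate caps, vsync and render scale can also be set as CVars in Config.wtf (in the WTF folder of your WoW install). The sketch below is a rough Python helper that just prints the SET lines; the CVar names are my best guess from memory (vsync, renderScale, maxFPS, maxFPSBk are the ones I’m fairly confident about), so confirm them in-game with something like /dump GetCVar("renderScale") before trusting it, and set the quality sliders from the menu.

```python
# Rough sketch: emit Config.wtf "SET" lines for the caps/scale settings above.
# CVar names are assumptions; verify in-game before editing Config.wtf.
cvars = {
    "vsync": "1",           # Vertical Sync: Enabled
    "renderScale": "0.75",  # Render Scale: 75%
    "maxFPS": "60",         # Max Foreground FPS cap
    "maxFPSBk": "8",        # Max Background FPS cap
}

# Each Config.wtf line has the form: SET name "value"
for name, value in cvars.items():
    print(f'SET {name} "{value}"')
```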

FXAA will make the output awful. You want FSR, not FXAA.

FidelityFX is FSR, which is what >1440p users should be using. It produces far, far better imagery than anything FXAA does.

FXAA looks good at 5K.
FSR makes things look pixelated to me; I prefer Bilinear.

If you’re running 75% render scale, you’re already rendering roughly native 4K (3840x2160) and shouldn’t need AA of any kind, least of all FXAA, which has the worst image quality of all the AA methods. All FXAA does is blur textures; it’s the most lossy of the lot. But whatever floats your boat, I guess.

Of course I have compared the image quality with FXAA on and off, and I prefer it on. Things are different once you have a 5K monitor. On a 1440p monitor I agree FXAA blurs things quite badly, but at 5K it looks great.

I imagine CMAA 2 would look much better at comparable performance. There’s a reason almost no games use or support FXAA anymore: it’s just really bad compared to TAA or CMAA, and both of those perform well compared to expensive options like SSAA and MSAA.

In fact, I’d almost bet that 66% render scale with FSR + CMAA 2 would give better performance with similar image quality, especially since FSR lets you adjust how strong the sharpening is.
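
Back-of-the-envelope pixel math for that bet, assuming GPU cost scales roughly with rendered pixels (it doesn’t exactly, so treat this as a rough indicator only):

```python
# Relative per-frame pixel load at different render scales on a 5K panel.
# Assumes cost scales ~linearly with rendered pixels (a simplification:
# upscaling, post-processing and CPU cost don't scale the same way).
NATIVE_PIXELS = 5120 * 2880

def rendered_pixels(scale: float) -> int:
    return round(5120 * scale) * round(2880 * scale)

p66 = rendered_pixels(0.66)
p75 = rendered_pixels(0.75)
print(f"66% renders {p66 / p75:.0%} of the pixels that 75% does")
print(f"75% renders {p75 / NATIVE_PIXELS:.0%} of native 5K")
# -> roughly 77% and 56%, so 66% + FSR leaves headroom to spend on CMAA 2
```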

But at the end of the day, as long as you’re happy. I’m definitely way more fussy about video clarity than you are, though; I imagine the same goes for tia, we’re both very picky about video and audio quality.

The problem with CMAA 2 is that at 75% scaling I start dipping below 60 FPS and get stuttering. It runs fine at 66% scaling with FSR, as you suggested, but I still prefer 75% and FXAA because there’s much less shimmering at edges. shrug To each their own, I suppose. Thanks for the tips anyway.

Yeah, FSR 1 introduces shimmering, and sadly the game isn’t any closer to supporting FSR 2 or FSR 3, which handle that MUCH better.

Another thing: Liquid Detail hits too hard on Apple Silicon, especially at higher resolutions. Setting it to Fair (or Good at the highest) gives way more wiggle room for performance as well. I was told the way it’s coded just isn’t compatible with tile-based rendering, and the screen-space reflections really decimate performance on M1 and newer, much more so than on even lower-end AMD GPUs.


That’s perfectly reasonable. Everyone has their triggers when it comes to imagery. Blurring is something I can’t handle well at all because of my eyesight. I need sharpness so I can pick out details that blurred imagery would hide, so I’m more tolerant of shimmering than blurring. One of the weirder aspects of this obnoxious body of mine is that I actually prefer AA turned off. Probably because I grew up with blocky games and played them on emulators for so long that my brain just automatically AAs the images itself. Sure saves on GPU resources. :slight_smile: