Newly Announced M1 CPU

Affinity Photo has a native build ready, and it uses Metal. The lead developer posted benchmarks comparing the M1 (he doesn’t say which machine, although it’s implied in the thread that it’s one of the laptops) with a 2019 iMac with a Radeon Pro 580X.

I don’t know how relevant it is to our concerns, but it’s the first Metal comparison I’ve seen.

https://twitter.com/andysomerfield/status/1326866126635143169?s=21

Early Geekbench GPU compute benchmarks put the M1 about on par with an Nvidia GeForce GTX 1060 6GB: the M1 (MacBook Pro, 16GB RAM) scored 18656 and the GTX scored 18441. Promising for entry-level machines, and the CPU scores are very impressive, which is where WoW is most demanding, soooo tempted to just order one and see…

Edit: more postings; several MacBook Pros are scoring in the 19300s, which falls between a GTX 970 and an AMD Radeon RX 460.

“In 1080p”.

NOT. IMPRESSED.

Comparisons to a three-year-old mid-tier GPU do not impress me either. Playable, yes. Pleasing? No.

All depends on what your priorities are. I’ve been playing for a while with…whatever comes in the Sonnet Breakaway Puck, which I gather is a three-year-old middling GPU. I’ve been pretty happy with it: with the graphics preset starting at 9 and then 3-4 individual settings turned down a bit from there, I get about 50 FPS out in the world. That’s on a 2018 Mini with an i7.

From what I’ve been reading and hearing, I should expect something roughly similar, or maybe a bit less, out of the M1. However, instead of getting that experience out of a Mini tethered to an eGPU, I’ll get it out of a 3 lb laptop with some 15-20 hours of battery life (not while playing, of course, but while doing my normal work). And in other, non-GPU respects, I’ll see a significant performance upgrade (not that I typically tax my machine).

In other words, even if a MacBook Pro with an M1 doesn’t produce GPU-specific performance that impresses you, the total package is a major quality-of-life upgrade for me. The purpose of sharing whatever information we can find about these things isn’t to impress, it’s to help people decide whether the machines might be appropriate for them.

Of course the question remains whether WoW will even launch on these machines. I hope it does, but I wouldn’t change my mind about the purchase if it doesn’t.

It’s only better than Intel integrated graphics (think Iris). It’s still not as good as the discrete AMD GPUs they currently use, or so I’ve read in the fine print at the very bottom of the Apple website, you know, the part where they list out the claims by number?

I don’t understand how you can NOT be impressed. Do you know how many years we have dealt with terrible onboard GPUs from Intel on entry-level Macs? Not everyone can afford high-specced MacBook Pros or iMacs with better GPUs. The Mac mini starts at $699… like holy cow. Back then I couldn’t even run the World of Warcraft login screen on my Mac mini without it lagging at 15 fps.

Also, Apple is a business out to make money. There’s no way they’re going to pack the M1 chip into a higher-tier MacBook Pro or iMac at around $2,000. It will most likely be an M2 with much better performance, to keep a distinct difference between a college student’s MacBook Air and an iMac for developers and designers.

I’ve played WoW since Vanilla on nearly every model of Mac there is. I don’t think I’ve even checked the FPS on most of my Macs. I really don’t care what the FPS is since it runs well enough for me. Maybe it’s 20, 30, 40, 50, 60, 70, 80 or more (I think it was over 100 in Orgrimmar on my 2009 Mac Pro with an internal video card). Maybe I’m not fussy. But then again, I also have floaters in my eyeballs and I live with that imperfection too. If people really need or want 120 FPS, then maybe don’t buy a Mac. If you are playing competitive WoW for million-dollar prizes, I think Macs would be a poor choice. But for the rest of us they are great.

What I find sad, and I mean really sad, is that even players such as yourself are willing to settle for “has-been” performance instead of demanding current levels from Apple. If I were rich, the lowest-core-count Mac Pro (highest IPC) with whatever RDNA2 card gets supported would work for me, as I’d have an actual Mac solution. Unfortunately, anything from Apple that’s really worth its salt costs north of $4k more than the highest-end HEDT setup I’d get on the PC side (and that’s for a usable configuration of the Apple machine, not the bare base model).

I’m not doubting their CPU capabilities. I know the ARM architecture can deliver, and then some, on that front. But short of a miracle where Apple ships a good dedicated GPU on an MCM card, or at least in an MCM configuration on the motherboard, given the die sizes Apple is likely aiming for here, I’m not expecting anything even remotely comparable to a 1080 Ti, let alone a current-gen 3080 or 3090.

I’m just tired of seeing performance that’s been in the rear view mirror for years now being hailed as something grandiose.

HDMI 2.1 displays, or hell, even DisplayPort displays that have been out for years, would like a word with you. 60 FPS is perfectly fine, but it’s a real treat to be able to play a game at 120 FPS, simply because it’s so smooth it glides across your screen like butter.

We used to get that level of power (relatively speaking, not in equivalent absolute numbers) with the cMPs and the decently powered GPUs of their time.

So I just googled how to check the FPS in my WoW client. It’s Ctrl-R. I tried it on my cheap Mac Mini: 59.9 FPS in Stormwind City. Why is that not good enough? Why should those of us with entry-level systems like this be treated like pariahs?
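For anyone else looking: the same number is also available from the client’s Lua API via GetFramerate(), so a one-line macro prints it to chat. A minimal sketch:

    /run print(string.format("FPS: %.1f", GetFramerate()))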

What are your settings at and with exactly what hardware? Anyone can get 60 FPS in SW when you dial down settings.

Oh, you are right, it’s down to 2. Let me set it to a higher number and see what I get. I’ll edit this post. It’s a fresh install of macOS and WoW (under a month old). I’m not sure if 2 is the default or if I lowered it for some reason. EDIT: Okay, I just set it to 5 and it’s now 30.0 FPS. I’ll try 10 next. EDIT: Okay, I set it to 10 and now I get 20.0 FPS. I’m not sure which quality level I should settle on. Maybe I can turn up the features I need and turn down the ones I don’t. I might get about 30.0 in the end.
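Side note for anyone following along: the 1-10 preset slider is backed by a console variable, so you can read and set it from a macro instead of menu diving. I believe the CVar is named "graphicsQuality", but treat that name as an assumption and double-check it in your client:

    /run print(GetCVar("graphicsQuality")) -- read the current preset (CVar name assumed)
    /run SetCVar("graphicsQuality", 5) -- jump straight to preset 5

GetCVar/SetCVar are standard WoW API calls; only the CVar name is a guess on my part.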

I just realized you’re the person with all that helpful information a while back. I would have been a little more polite if I had noticed that. I don’t want to get on the bad side of anyone who is extremely helpful.

As I say, it’s all about priorities. You seem to be laser-focused on GPU performance, and if that’s not top-tier, you’re disappointed. For me, getting anything similar to what I’m getting now in GPU performance, but also getting ridiculous battery life in an ultraportable package, is a fantastic upgrade, because GPU performance is a “nice-to-have.” (If that weren’t the case, I wouldn’t have gone with a cheap Sonnet Breakaway containing an already-somewhat-old Radeon when I made that purchase.)

The M1 is a remarkable achievement — not because it produces amazing GPU performance, because obviously it doesn’t from the perspective of someone who prioritizes that, but because of the whole package it makes possible.

I’m guessing you’re using the 2018 Mini? That sounds not far off from what I was seeing with the integrated graphics. It feels OK, but when you can get similar performance with much higher settings, you realize what you’ve been missing. Increased view distance with better rendering makes the game look a lot more interesting.

Tia’s not wrong that GPU performance matters, and that faster is better. You can get more out of the game with a better GPU, even just flying around at altitude looking down on the vistas. I just disagree about how important it is relative to the other advantages the new chips bring us. If you’d told me a year ago that today I could get a MacBook Air with performance identical to the 2019 model (which can barely play WoW at all, due to thermal issues) but with double the battery life, I’d have jumped all over it. They’ve actually done a lot better than that.

According to Random Tweet From Somebody, an ARM client is being pushed. Do you believe it? I dunno! I guess we’ll know soon!

https://twitter.com/tom_rus/status/1327293032056623105?s=21

TOM_RUS is legit. I can’t link anything because I dunno how the CMs would feel about it, but if you take a look at his GitHub he’s clearly very familiar with the inner workings of the game.

Thanks for that. Certainly, the people who have chosen to follow him lend strong legitimacy.

According to Apple’s VP of Worldwide Product Marketing, all translated apps, such as World of Warcraft, work great on the M1 chip. It’s at 25:00.

https://www.youtube.com/watch?v=2lK0ySxQyrs&feature=emb_title

Thanks for posting that! With those big-shot VPs putting their reputations out there and specifically mentioning World of Warcraft by name, that’s more than enough for me.

I know these options are all entry-level machines, but my girlfriend, who is a light user, just had her old laptop fail on her, and she’s been transitioning to the Apple ecosystem, so I think it’s a solid purchase for her. Lucky me, I get the bonus of using it until the higher-end M-series replacements come out next year.

Don’t sweat it. I rarely take anything personally. I suspect you could probably do with a “5” if you turned down a few settings like clutter, environmental detail, view distance, and particle density. Those alone should get you the biggest FPS boosts without killing the looks.
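If menu diving gets old, those individual settings map to CVars too, so something like the lines below should work from a macro. The names here ("farclip", "particleDensity") are from older clients and may have shifted since, so treat them as assumptions; clutter and environmental detail have CVars as well, but I’m less sure of their current names.

    /run SetCVar("farclip", 1000) -- view distance; CVar name assumed from older clients
    /run SetCVar("particleDensity", 30) -- particle density; same caveat, values are illustrative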

The CPU side is going to be fairly beastly, especially if WoW gets more improvements to its multithreading. ARM cores eat that stuff up like candy; it’s what they’re so good at. That’s what’s propping up the pathetic GPU right now. If Apple were to give those cores a bigger die and push them into desktop power ranges, I bet they could run circles around a 9900K (the 10900K is slower than the 9900K in core-to-core latency) and possibly even beat a Threadripper/Ryzen 7 with an equal number of cores.

If Apple can offer an ARM Mac that has a PCIe slot for a real GPU, I’ll gobble that up in an instant (when funds allow), even if the GPU isn’t quite up to Nvidia’s top tier. RDNA2 looks poised to be “good enough” for 4K60 in most games while not breaking the bank like Nvidia’s 3090 does. Sadly, I’d still need my PC for Windows gaming since most of my games require it, but I’d gladly make room on my desk for an ARM Pro with HEDT-level performance.

If I remember right, he’s the one who does the diffs and posts them on GitHub for all to see. Helps a metric crapton with Lua API changes.

I trust Apple marketing about as far as I can throw Jony Ive (who thankfully is no longer with them). I don’t think the systems are junk, but I don’t trust Apple’s numbers either. I’ll believe it when I see it on AnandTech or Ars Technica. They do realistic benchmarks and show both the good and the bad.

Come on, Jony Ive isn’t that big. You could probably get some distance.