Newly Announced M1 CPU

I don’t write games, so things might be a bit different there, but I work as a dev myself, and for normal programs, x86-specific instructions that are actually required for operation are few and far between. They’re almost always optimizations that can be disabled with a compiler switch.
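
For example, here’s a minimal sketch (mine, names and all, not anything WoW-specific) of what that usually looks like: an SSE2 fast path guarded by a compiler-defined macro, with a plain C++ fallback doing the same work. Build without the x86 arch flag (or for arm64) and the intrinsics simply drop out:

```cpp
#include <cstddef>
#include <cstdint>
#if defined(__SSE2__)
#include <emmintrin.h>  // only pulled in when targeting x86 with SSE2
#endif

// Sum an array of ints. The vector path is an optimization, not a
// requirement: the #else branch produces the same result anywhere.
std::int32_t sum(const std::int32_t* v, std::size_t n) {
#if defined(__SSE2__)
    __m128i acc = _mm_setzero_si128();
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_epi32(
            acc, _mm_loadu_si128(reinterpret_cast<const __m128i*>(v + i)));
    alignas(16) std::int32_t lanes[4];
    _mm_store_si128(reinterpret_cast<__m128i*>(lanes), acc);
    std::int32_t total = lanes[0] + lanes[1] + lanes[2] + lanes[3];
    for (; i < n; ++i) total += v[i];  // leftover elements
    return total;
#else
    std::int32_t total = 0;
    for (std::size_t i = 0; i < n; ++i) total += v[i];
    return total;
#endif
}
```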

My guess is that the initial ARM builds of WoW will run, but will lack optimizations, with those coming along later down the road.

My understanding is that the way games make use of the x86 instruction set makes it difficult to move to something else without rewriting a lot of the base game engine code. Maybe I’m wrong, but that’s my understanding.

If it was easy to convert games to ARM, why haven’t more games already been converted for the Nvidia Shield and other ARM devices with considerable power?

All of this assumes the chip really is as fast as Apple claims, and we both know manufacturers lie to make their products sound good.

Shield, and non-freemium Android gaming as a whole, is super niche. A better example is the Nintendo Switch, which is ARM-based, has hardware almost identical to the Shield’s, and has received no shortage of ports for games where its limited horsepower is adequate.

Yeah, but the difference is that with the Shield it’s all Nvidia; this Apple one is not. Nvidia provides actual support for game development. I doubt Apple will.

And most of the games that have been ported weren’t from x86 anyway, but from PowerPC.

I highly doubt they are using intrinsics directly anymore. The issue is more that x86 lets you get away with things you can’t on other platforms because of how the memory model works. This is particularly noticeable in multi-threaded code.
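
To make that concrete, here’s a minimal sketch (mine, not from any real codebase) of the classic message-passing pattern. In C++ terms it’s broken, but x86’s strong (TSO) ordering usually papers over it:

```cpp
#include <atomic>
#include <cstdio>
#include <thread>

int data = 0;                    // plain int: publishing it needs ordering
std::atomic<bool> ready{false};

void writer() {
    data = 42;
    // Relaxed gives no happens-before edge, so this is really a data
    // race on `data`. x86 hardware still keeps the two stores in
    // program order, so in practice it "works" there.
    ready.store(true, std::memory_order_relaxed);
}

void reader() {
    while (!ready.load(std::memory_order_relaxed)) {}
    // On a weakly ordered CPU like ARM, this can print 0: the stores
    // (or the loads) may be observed out of order.
    std::printf("data = %d\n", data);
}

int main() {
    std::thread w(writer), r(reader);
    w.join();
    r.join();
}
```

The fix is mechanical: make the store a `memory_order_release` and the load a `memory_order_acquire` (or just use the sequentially consistent defaults). The hard part is finding every spot that’s been quietly relying on x86 to do that for you.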

There’s not much in the way of support for Apple to provide. I watched the WWDC sessions related to ARM-based Macs, and developing games for them is barely different from developing games for x86 Macs. Apple GPUs have a couple of extra switches to toggle, but that’s the extent of it; as long as you’re not leaning on arch-specific instructions, most existing Mac games will work by just ticking the “ARM” box in Xcode.

That’s way beyond my basic understanding, but I do remember hearing devs praise the original Xbox because its memory system handled a lot of things in the background that other architectures did not. But like I said, this is way beyond my understanding.

Well, the original Xbox was basically a Pentium III PC with some legacy garbage removed.

Developing a game with an engine that’s already running on ARM is not the same as porting an engine to run on a totally different architecture.

More or less. It was a Pentium III Coppermine, IIRC, with an Nvidia GeForce 3.

Going to respectfully disagree based on my knowledge of how many people use hacky code and old C++ codebases. Waaaaay too many of them rely on undefined behavior that just happens to work on x86 because of the strong memory ordering. That won’t fly on weakly ordered systems like ARM. It’s fixable… but it will mean a while of frustrating debugging to find them all.

That’s unfortunate if true. Sounds like something that should clean itself up as the industry moves to Rust.

I like your optimism, but the industry isn’t moving anywhere quickly; the cost to RiiR (rewrite it in Rust) is immense and in almost all cases not worth paying. Even then, Rust won’t always save you: more than a few bugs in Rust codegen right now are “fixed” by using the trunk compiler, which won’t fly in game dev, where you need a stable toolchain, or in many certified-for-use enterprise apps.

Doesn’t look like it. eGPUs aren’t on the supported-accessory lists for these new computers.

In that case, the Pro is probably my better bet.

The CPU and GPU don’t seem to change between any of the models, based on what I’m seeing. But honestly, I’d be surprised if the AMD eGPUs didn’t work… eventually.

I will take that bet.

Apple doesn’t allow custom kext files, which is how Nvidia used to write drivers for macOS. AMD won’t be able to write them either.

So much of this is Apple controlling their destiny. They knew their chips would beat Intel chips, and they were tired of Intel failing to deliver on their roadmap. They likely feel the same about AMD: why wait for them to develop a decent GPU when we can just do it all in one SoC?

Apple is free to do what they like, but AFAIK they still sell AMD GPUs as a higher-performance option, and thus far they’ve not shown anything in that class. The M1’s GPU competes with Intel integrated graphics, not AMD GPUs. But we’ll see. It’s one thing to design a CPU; it’s another to license the insane array of patents for GPUs. Right now, IIRC, they license from Imagination (PowerVR), whose designs aren’t aimed at the same sorts of workloads.

But honestly? Given what I’ve seen, Apple doesn’t take GPUs seriously, IMO.

Rumour is that they’re developing an ARM Mac Pro, so I think we’ll need to wait until then to see how they intend to handle proper high-performance graphics.

The systems announced today are definitely all of the buzzwords Apple have described them as, but they’re still only a Mac Mini, MacBook Air, and 13" MacBook Pro. An Air is never going to hit the same performance level as an iMac, no matter how much irrelevant marketing info Apple push on us (“16-core Neural Engine!”).

Yeah, it’s a two-year transition. Today’s Macs are the consumer stuff: the Air, and the base Pro that is really an Air Plus.

That said, I fully expect the 16" to be all Mx, with something like 16 CPU cores and 16 GPU cores. I wouldn’t be surprised if both Pros get them, with the screen and battery life being the big differences between the 13" (which I think will become a 14") and the 16".