Overwatch 2 on Mac? The new MacBooks are fire!

It could. The game isn’t natively optimised for the PS5 yet.

Even the iPad can run a game like Fortnite at 120 FPS. I have no doubt that the M1 Max chip could do 200 FPS.

Do you have proof of such a statement or are you just saying random stuff?

120 FPS is actually a pretty significant difference. It’s also unlikely that the iPad is running the equivalent of a desktop PC’s ultra settings.


Take the PC equivalent of a PS5: an RTX 3060 and a Ryzen 5 4600H. That should get you 200 FPS. Maybe not on ultra, but at the very least on medium settings.

I don’t need proof when we can just make estimations given the hardware.
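For what it’s worth, that kind of estimation can at least be written down explicitly. Here’s a rough Python sketch; the TFLOPS figures are public spec-sheet numbers, and the linear scaling is precisely the assumption being disputed in this thread:

```python
def estimate_fps(known_fps, known_tflops, target_tflops):
    """Scale a measured FPS linearly by peak FP32 throughput.

    This is the crude assumption under debate: real games rarely
    scale linearly with TFLOPS across different architectures.
    """
    return known_fps * (target_tflops / known_tflops)

# PS5 GPU: ~10.3 TFLOPS. M1 Max: ~10.4 TFLOPS (Apple's stated figure).
# Scaling a 120 FPS console result predicts ~121 FPS, not 200.
print(round(estimate_fps(120, 10.3, 10.4)))
```

Even taking the spec sheets at face value, the throughput ratio alone doesn’t get anywhere near 200 FPS.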

Lol, yes you do. Considering the Series X has a “performance mode” that is limited to 120 FPS and requires lowering graphics quality, saying that the PS5, a slower device, can maintain 200 FPS on ultra is just making stuff up.

You can’t just blindly compare a console and a PC that has what seems like similar specs. It’s really not the same at all.


Where are you getting this from? The Series X caps at 120 FPS because it’s designed to run with TVs.

The Series X can be connected to any display device, including monitors that support VRR. There would be literally no reason to artificially cap it at 120 Hz if you’re on a display that can go higher. Given the tradeoffs of the other available modes (e.g. 4K@60 Hz), there’s very little reason to believe that it could easily maintain a much higher framerate, or that they’re capping it only because “it’s designed to run with TVs”.

You can’t just make random guesses about performance by comparing to other devices.

That doesn’t mean it can output at 144 Hz or above. It’s designed to output at 120 Hz. I suggest you look into it. Are there even any games that run at 144 Hz?

If you were to connect the Xbox Series X to a 144 Hz monitor, the games would still run at 120 Hz/FPS.

https://www.google.com/amp/s/amp.reddit.com/r/XboxSeriesX/comments/lzhkpu/does_the_series_x_support_144hz_monitors/

If they don’t, then it’s likely an Xbox requirement for consistency, and likely chosen because they know the limits of their hardware. But there’s no reason to assume that it can handle 200 FPS on the equivalent of PC ultra settings.

My point is you can’t just compare a console to arbitrary PC hardware to try and find an equivalent, and then convert that to teraflops to compare against Apple’s chip. And you can’t just compare all of that to “the iPad can do 120 Hz in Fortnite.”

Don’t mistake what I’m saying - the M1/M1 Max are incredible chips. I would go as far as to say I’m a bit jealous of the efficiency. But companies fudge numbers and cherry pick results to make things seem better than they are. Apple is comparing the M1 max to an RTX 3080 mobile chip at nearly 1/3 the power. Apple has some really smart engineers for sure, but that’s just so insanely unlikely that in just a few years they’ve completely toppled the performance and efficiency of a company that’s specialized in GPU hardware for decades. I’m going to need some real world tests before I believe such a claim.


There are already plenty of real-world tests for the 2020 M1 chip. It lives up to the claims for the most part. So I see no reason for the M1 Max claims to be overly exaggerated.

This whole discussion is a bit silly, because does it really matter if it can’t do 200 FPS at ultra settings? The point of this thread is just to ask Blizzard to release the game on Mac. The new MacBook Pros only have up to 120 Hz displays anyway. If they can reach that, that should be fantastic.

200 FPS is by no means far away at this point given the rate of progress with these chips.

While it should be able to run OW2 and many other games, they don’t support it, and it’s exactly like porting a game from one platform to another, which is a lot of effort from a dead start. Another reason would be that the PC gaming community is probably something like 95% Windows users.


Some of the M1 performance was certainly exaggerated, and done using synthetic benchmarks. Sorry but as good as the M1 Max chip is, I will remain highly skeptical until I see some real results.

Well yes, because the chain of replies I replied to was about this statement:

So it was kind of the key point I was making.


They aren’t passing on those savings. At most places (Best Buy, Adorama) they are at the $1,299 MSRP, and on Amazon they are $1,450+. Again, you are saying there are sales, but the vast majority of people are not going to see them. Heck, it looks like some people will be paying more than MSRP right now.

Most companies do not pass along savings to customers. Apple has shareholders which means unless they are forced to, they will take increased profits over passing on savings.

Probably shouldn’t mention that the M1 thermal throttles really badly…

https://www.androidauthority.com/apple-m1-test-benchmark-performance-thermal-1185988/

The M1 slightly loses to Intel 8-core CPUs in multi-threading, and Intel is slower than AMD at the same core count, so by extension an 8-core AMD CPU would beat the M1, let alone the 12-core. Mind you, this is the best-case scenario for the M1, where code is designed specifically for it. Anything running through their translation layer is going to be slower.

Even without looking at any other Windows laptop, the same Dell Latitude 9510 I mentioned above weighs 3.1 pounds whereas the Apple weighs 3.0 pounds. Mind you, the Dell laptop has 2 inches more of screen space; it would be lighter than the Apple if it had the same 13-inch screen size. It’s not like Apple has a monopoly on aluminum, and there are plenty of lighter Windows-based laptops. There’s nothing Apple has that gives them the unique ability to make laptops lighter than everyone else.

No. I can go and buy a used MacBook Pro 2019 model for $499 including a 1TB SSD and 16GB of RAM. That’s about a $500 drop in value per year.

How long a laptop lasts, Windows or Mac, entirely depends on how well the laptop has aged. Saying people don’t buy used Windows laptops is ridiculous. A simple look at eBay will yield millions of sold used Windows laptops.

You aren’t understanding what unified means. In the case of the M1, Apple decided to put both the M1 die and memory on a silicon interposer. All unified means is that all components of the M1 die can access the entire allowance of RAM. This is pretty common on smartphones. You still only have 8GB of memory between the CPU and GPU, which isn’t good given that most games’ minimum specs call for 8GB of RAM and at least a 4GB GPU, meaning the total memory requirement is 12GB baseline for a majority of AAA titles. It’s completely unacceptable for the vast majority of modern games and professional applications. Heck, even for office work 8GB is not ideal if you are multi-tasking. Most gaming laptops come equipped with 16GB of main system memory and an additional 6GB or more just on the GPU. In regards to RAM speed, the M1 peaks at 68.5 GB/s effective memory bandwidth, and memory latency is poor above 8K. Bandwidth is good but latency is bad.
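Spelling out the arithmetic in that paragraph (the figures are the illustrative ones from this post, not any official spec sheet):

```python
# Illustrative AAA minimum spec from the discussion above:
# 8 GB system RAM plus a 4 GB GPU on a typical gaming PC,
# versus a single unified 8 GB pool on the base M1.
GAME_MIN_SYSTEM_GB = 8
GAME_MIN_VRAM_GB = 4
UNIFIED_POOL_GB = 8  # shared by CPU and GPU

combined_need_gb = GAME_MIN_SYSTEM_GB + GAME_MIN_VRAM_GB  # 12 GB total
shortfall_gb = combined_need_gb - UNIFIED_POOL_GB          # 4 GB short
print(combined_need_gb, shortfall_gb)
```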

Mac users are different; that’s why they bought a Mac.

I would have to disagree. Have you seen the people buying $3,000 graphics cards or $1,200 monitors? The PC just allows you to choose how much you spend; I would definitely say there are options to burn your money if you wish.

They come with a 65 watt charger which would indicate otherwise. Pretty standard for a mid-range laptop. As the link above shows, 65w on a fanless design means compromises.

Teraflops aren’t a measure of actual performance. AMD’s 200- and 300-series, Fury, and Vega GPUs all had higher TFLOPS than Nvidia’s products, and Nvidia’s products were faster.
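A concrete illustration, using the public shader counts and boost clocks of the classic Vega 64 vs. GTX 1080 matchup, where the lower-TFLOPS card generally matched or beat the higher one in games:

```python
def fp32_tflops(shader_units, boost_clock_ghz, flops_per_clock=2):
    # Peak FP32 = units * clock * 2 (one fused multiply-add per clock).
    return shader_units * boost_clock_ghz * flops_per_clock / 1000

vega_64 = fp32_tflops(4096, 1.546)   # ~12.7 TFLOPS
gtx_1080 = fp32_tflops(2560, 1.733)  # ~8.9 TFLOPS
# Despite ~40% more peak TFLOPS, Vega 64 was not 40% faster in games.
```

Real performance depends on architecture, drivers, memory bandwidth, and utilization, which is exactly why spec-sheet FLOPS comparisons mislead.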

The PS5 absolutely smokes the M1 in performance, it’s not even remotely close. M1 competes with integrated graphics and that’s it. Might compete with my old 750 Ti as well but that’s not a compliment, suffice to say it’s a fraction of the power the PS5 has.

This really depends. I can get 200 FPS on my super old 750 Ti in Overwatch that’s over 7 years old now. Saying “it can reach x FPS” doesn’t really mean much without context.

Yes, I’ve seen the Geekbench benchmark 1,000 times now, but it hasn’t been considered a valid benchmark in the PC space for a long time. No professional reviewer uses Geekbench to measure performance on a PC.

It actually benefits Apple to get as many people as possible onto these new chips to ease the transition from x86 to arm by effectively having a larger pool of testers. This is a crucial period for them.

Not in the MacBook Pro, and only barely in the MacBook Air, which doesn’t have a cooling fan. I have yet to even hear the fan come on on my MacBook Pro M1, though.

Those AMD and Intel CPUs need a fan to sustain anywhere remotely close to the performance of M1. The laptops that don’t have a fan get significantly worse performance.

Go ahead and link me that because I doubt it.

I actually sold my old MacBook for more than I paid for it, two years later!

You might be right about that, although the M1 Max has up to 64GB of RAM which can also be utilised as VRAM. This is unprecedented and unmatched even by any current dedicated GPU setup. The M1 Max does 400 GB/s.

Nope. I’m just an ordinary Windows user who recently switched to Mac because I love the design and build quality.

On UK Amazon, where I bought mine, they’ve been at a 10% discount for a while. You can get them even cheaper than that a lot of the time.

I doubt that statement was meant to be taken as literally as you took it. OP is just saying that comparable performance to dedicated graphics cards is now a reality. Obviously we can’t know for certain any specific benchmark numbers until the game is actually ported and we test it.

And yet they are charging $1,300.

The MacBook Pro has a fan, so that’s no longer passively cooled. The MacBook Air problems are outlined in the article I linked.

False.

https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313&_nkw=macbook+pro+2019&_sacat=0

When your video card runs out of memory any additional data is stored on main system memory for windows machines. What you describe isn’t unprecedented, it’s essentially what modern operating systems have been doing for a long time now. In fact if you have an AMD Vega GPU, those can create a virtual memory pool of up to 1 TB thanks to their HBM Cache controller.

In addition, there is a major disadvantage to only having one unified memory pool. Graphics cards typically use GDDR, which is a high bandwidth version of DDR. Apple is using LPDDR4X (a low power version of DDR4) whereas Nvidia uses GDDR6X, which trounces DDR4 bandwidth wise. Windows laptops use DDR for main system memory as it’s latency optimized and GDDR for the graphics cards as it’s bandwidth optimized and thus each component gets memory best optimized for the use case. By using DDR4 for everything, you choke bandwidth intensive workloads. The bandwidth of mid-range Nvidia GPUs is 450 GB/s.

FYI the M1 Pro is rated at 200 GB/s and the M1 Max at 400 GB/s, but that’s shared across the entire chip. As we’ve seen from prior benchmarks of the original M1, we only saw 68.5 GB/s in practice. The above quoted numbers for the Nvidia mobile GPU are for the GPU only and are the amount you will actually get.

Didn’t you just say you sold your old MacBook?

£1,167.00 down from £1,299.00 for the MacBook Pro
https://www.amazon.co.uk/Apple-MacBook-Chip-13-inch-256GB/dp/B08N5MWPY6
£899.97 down from £999.00 for the MacBook Air
https://www.amazon.co.uk/Apple-MacBook-Chip-13-inch-256GB/dp/B08N5N1WBH

There is actually a trick to significantly reduce the thermal throttling.

The laptops that don’t have a fan from Intel and AMD do indeed perform significantly worse than M1.

That’s a 2010 model. So in 11 years you still get to recoup $500!

Main system memory isn’t as fast.

Yes 64 GB of VRAM is unprecedented.

Furthermore, the SSD speeds are extremely fast and might also be usable: >7 GB/s read speeds.

Might be true.

Yes but I was running Windows on it.

Those prices are even more expensive than the US prices; £1,167 is over $1,600 USD. Insane…

It’s also going to cost you $70 for a specialized toolkit to open Apple’s not-made-to-be-repaired devices, as mentioned in the video, plus a thermal pad. You void your warranty as well. 99.9% of customers will not do this, nor is it advisable for the customer to fix a problem created by the manufacturer, especially when said manufacturer makes it hard to fix in the first place.

false

/facepalm

No, it is not. You can see right in the URL it clearly states macbook pro 2019.

  1. Main system memory isn’t as fast as what? As I pointed out above, Apple is using main system memory as in LPDDR4, the same stuff used in windows laptops. Again DDR is latency suited whereas GDDR is bandwidth suited.

  2. Again, it is NOT VRAM. VRAM would indicate that it’s dedicated to the GPU, which it is not, nor is the memory even GDDR. Apple M1 laptops have 0GB of VRAM; it’s all main system memory. That main system memory is shared between the CPU and GPU exactly how a Windows laptop with a CPU and iGPU shares main system memory. You wouldn’t classify that main system memory as VRAM simply because the iGPU happens to use it, and it’s even more ridiculous to suggest that what is clearly DDR is GDDR. The bandwidth of Apple’s memory is vastly lower than what you get from even low-end graphics card memory interfaces. That makes sense given the M1’s GPU is peanuts compared to dedicated GPUs.

  3. The SSD gets 3.3 GB/s, which is average for an NVME SSD.
    https://www.macrumors.com/roundup/macbook-pro-13/#ssd

AMD Zen 3 supports PCIe 4.0 NVME which supports speeds up to 7.6 GB/s. Not that you need that on a laptop but you seem to have a tendency to greatly exaggerate for no real reason.
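For anyone who’d rather measure than argue spec sheets, a rough sequential-read check can be done in a few lines of Python. Note this is only a sketch: OS page caching will inflate the number on repeat runs, and real benchmarks use direct I/O:

```python
import time

def seq_read_gbs(path, chunk_bytes=8 * 1024 * 1024):
    """Time a sequential read of `path` and return throughput in GB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        # Read in large chunks to approximate sequential throughput.
        while chunk := f.read(chunk_bytes):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return total / elapsed / 1e9
```

Point it at a multi-gigabyte file; small files mostly measure cache and syscall overhead rather than the drive.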


I’ve checked a price tracker website for Amazon.

Even on Amazon US it’s at least $100 off, and has gone as low as $1099.

https://camelcamelcamel.com/product/B08N5N6RSS

If you get the MacBook Pro you won’t get throttling, because of the fans. But even with the fans they are barely audible at all. I never heard it turn on even once after an hour of playing World of Warcraft on my MacBook Pro M1.

Not really.

It literally says 2010 model. The MacOS version is from 2019 which is what you’re getting confused with.

The new one uses LPDDR5-6400 RAM.

So no, not quite.

That’s the old one. Not the new one.

Which would mean high prices if you wanted to buy SSDs that fast, taking the overall cost into Apple territory.

That’s the 2020 model, not 2021.

Yes and the same could be said for many windows laptops. Your original point was that you didn’t need fans and for that I pointed out windows laptops that don’t as well. Now if your point is that it does need a fan but it’s quiet, there are thousands of windows laptops that are just that as well.

“It”? I gave you a search results page, not a single product page.

That’s still main system memory, it’s just the low power version of the new DDR5 standard.

I’m not sure if you are aware, but early DDR5 is all going to have loose subtimings, as new memory standards always have for the first 3 years. There are LPDDR5 and DDR5 modules on the market, and they have high frequency, but they are about the same speed as a good DDR4 kit (which is what Apple already has).

https://www.anandtech.com/show/17019/apple-announced-m1-pro-m1-max-giant-new-socs-with-allout-performance

The bandwidth on the max is getting a bump from the increase in bus width, not from the new memory. Mind you it’s still less bandwidth than low-end discrete GPUs and it’s shared between all the M1’s components. Again though, it doesn’t need to be. It’s not supposed to be a gaming monster.
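The bus-width point checks out numerically. Peak bandwidth is just transfer rate times bus width; the 6400 MT/s and 256/512-bit figures here are from the AnandTech article linked above:

```python
def peak_bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    # Peak bandwidth = transfers per second * bytes per transfer,
    # with MT/s * bytes giving MB/s, divided by 1000 for GB/s.
    return transfer_rate_mts * (bus_width_bits / 8) / 1000

m1_pro_gbs = peak_bandwidth_gbs(6400, 256)  # 204.8 -> marketed "200 GB/s"
m1_max_gbs = peak_bandwidth_gbs(6400, 512)  # 409.6 -> marketed "400 GB/s"
```

Same memory speed, doubled bus width, doubled peak bandwidth, exactly as described.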

Unless you can provide benchmarks proving otherwise, every site has that number as the newest one.

No, the price of the SSD alone doesn’t increase the cost of the system by 3 times what it should cost. You can get PCIe 4.0 Windows laptops at $400 and up. In fact, you can upgrade any Intel 11th-gen PC to a PCIe 4.0 SSD.

https://phisonblog.com/how-to-upgrade-a-tiger-lake-notebook-with-a-pcie-4-0-ssd/

You do know the PS5 has PCIe 4.0 and a good graphics card too, and it’s only $500, right? You seem to think Apple devices are worth it component-wise, but they aren’t; you don’t buy an Apple device for the performance.