Does WoW benefit from the extra cache in the 7800X3D?

Building a new PC for Christmas, coming from an overclocked 6700K.

I’ve heard the 7800X3D is essentially a god chip, but it really depends on whether the game benefits from its cache?

1 Like

Oh my yes. I went from a 3900X to a 5800X3D and nearly doubled my framerate in Valdrakken at the time (detail 10, RT on, 1440p, with a 3080 10GB). Now, not all of that is the “fault” of the extra cache: some of it comes from moving to a newer-generation chip, losing the cross-CCD penalty, and a clock-speed bump of around 300MHz (a little under 10%). But even all of those changes taken together only account for 20% or so, which by his own numbers leaves roughly a 65% uplift (2.0 / 1.2 ≈ 1.67) down to the cache itself.

You’re right, though, in that it depends on the title itself as to how much difference it makes. The cache is never a downside, but WoW is an outlier for how much improvement it sees. The benefit also tends to show up where it matters most: minimum framerates improve more than maximums.

3 Likes

WoW benefits from the 3D cache more than almost any other game. Most games see uplifts in the 10-15% range, but a few, like MS Flight Simulator, Factorio, and WoW, often see gains of 60% or more.

If WoW is the primary game that you care about in terms of performance, then you should not consider any other CPU. Even the oldest X3D CPU, the 5800X3D, is still faster in WoW than ANY Intel CPU that currently exists.

2 Likes

World of Warcraft benefits from the cache more than pretty much any other game out there. It is a CPU-dependent game, and CPU-dependent games rely on two things: CPU speed and RAM speed. The huge cache, however, makes RAM speed mostly unimportant. That’s why X3D is so good for budget gamers: they don’t need super fast, expensive RAM.
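To make the cache point concrete, here’s a minimal sketch (Python + NumPy, nothing WoW-specific, sizes purely illustrative) of why the big L3 matters: random reads stay fast while the working set fits in cache and slow down sharply once they spill into RAM. That spill point is exactly what the extra stacked cache pushes out.

```python
# Minimal sketch: random reads slow down sharply once the working set
# spills out of L3 into main RAM. Sizes are illustrative; a 7800X3D has
# 96 MB of L3 where a non-X3D Zen 4 chip has 32 MB.
import time
import numpy as np

def gather_time(size_mib, accesses=2_000_000):
    n = (size_mib * 2**20) // 8                # 8-byte elements in the working set
    data = np.arange(n, dtype=np.int64)
    idx = np.random.randint(0, n, size=accesses)
    t0 = time.perf_counter()
    data[idx].sum()                            # latency-bound random gather
    return time.perf_counter() - t0

for mib in (8, 32, 96, 512):
    print(f"{mib:4d} MiB working set: {gather_time(mib) * 1e3:7.1f} ms")
```

On an X3D part the fast region simply extends much further, and RAM speed only matters for whatever still misses the cache.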

In most games other than WoW you won’t notice much of a difference, because you are GPU-bottlenecked in most titles. Unless you are on a 4090, that is, in which case you will 100% see a difference. This is why reviewers use the 4090 for their CPU benchmarks: to remove as much of the GPU bottleneck as they can.

Normal games:
GPU > RAM > CPU (the CPU is the last thing that can bottleneck you)
WoW, other MMOs, and CPU-dependent games:
CPU > RAM > GPU (the CPU is the first thing that can bottleneck you)

My own testing
I had the luxury of having both platforms for a hot minute. Since I did both builds myself, they were identical except for the CPU and motherboard: Intel had the ROG Strix Z790 Gaming-E and AMD had the ROG Strix X670E Gaming-E. Other than the CPU and mobo, everything else was the same, down to the exact same GPU, PSU, RAM, and case.

Both toons were in Valdrakken side by side at the same time. Both machines used the same 7400 MT/s G.Skill RAM kits; however, on the 7950X3D I set the memory to 6000 MT/s so it would run at 1:1, while on Intel it ran at 7200 MT/s.

7950X3D + RTX 4090 (RAM running DOCP 6000 MT/s)

13900K + RTX 4090 (RAM running XMP 7200 MT/s)

FYI, when I dialed the RAM speed down to 6000 MT/s on the 13900K build, the frame rate dropped well below what you see in that image; the image was taken with the RAM at 7200 MT/s. Basically, on a non-X3D chip you need super fast RAM to be competitive in a CPU-bound game.
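On the 1:1 bit, for anyone building one of these: Zen 4’s memory controller clock (UCLK) can only match the memory clock up to around DDR5-6000; past that it falls back to a 1:2 ratio and latency gets worse, which is why 6000 is the usual sweet spot. A quick sketch of the arithmetic; the ~3000 MHz cutoff is an assumption, since the exact ceiling varies from chip to chip:

```python
# Rough arithmetic for Zen 4's UCLK:MEMCLK ratio. The ~3000 MHz 1:1 limit
# is an assumption; the exact ceiling varies chip to chip.
def zen4_uclk(ddr5_mts, max_1to1_mhz=3000):
    mclk = ddr5_mts / 2                # DDR: memory clock is half the transfer rate
    if mclk <= max_1to1_mhz:
        return mclk, "1:1"
    return mclk / 2, "1:2"             # controller halves its clock, adding latency

print(zen4_uclk(6000))   # (3000.0, '1:1')  -> the sweet spot used above
print(zen4_uclk(7200))   # (1800.0, '1:2')  -> faster sticks, slower controller
```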

With all of this said, the main places you will notice the difference between an X3D CPU and a non-X3D one are major city hubs, raids, and world events: anywhere you have a lot of players in the same place at the same time. That’s when World of Warcraft really pushes your CPU to its limits, and it’s what made Valdrakken on Area 52 such a good place to benchmark.

5 Likes

Thank you all for your replies. I’ve just read that the 9800X3D is releasing Nov 7th, so I will wait for reviews on that and decide from there which of the two CPUs I go with!

1 Like

No choice to make anymore, sadly… the 7800X3D is faster than the Core Ultra 9 285K by a lot. Intel dropped the ball this time around and is focusing more on desktop workstations, not gaming.

Your choices will be the 7800X3D, the 9800X3D, or Intel’s last-gen 14900K (which I wouldn’t recommend; AMD has an upgrade path).

1 Like

60%+? Even Shifty’s benchmarks saw less than half of that.

Good? 100%. But the 7800X3D a budget CPU? I don’t think it falls into that category.

1 Like

Freaking Blizzard, constantly changing my profile toon.

There are other factors as well.
What is your monitor resolution and refresh rate?
What is your GPU? How often do you upgrade your GPU?
How long do you keep your CPU?

The 9800X3D will be the top-dog gaming CPU, but that doesn’t mean you will necessarily see a difference between it and a 7800X3D (or even a 9700X) if other factors are holding back performance.

1 Like

The Ultra series is so bad it is often a regression compared to 12th gen.

It’s a complete joke

1 Like

Perhaps not budget, but definitely a price break for top end gaming.

The X3D chips generally have lower frequency and overclocking headroom, and historically the lower-core-count X3D chips have outperformed the higher-core-count X3D variants, meaning gamers don’t have to buy the very best SKU for the best performance.

The 7800X3D is a superior gaming chip to the 7900X3D or 7950X3D, possibly because of cross-CCD latency or just scheduling.

But in either case, gamers were looking at about $350 (before supply dipped) for the best gaming CPU available, which, compared to Intel’s lineup, is i7-range pricing. The i9s in recent years have been $500-600, and the 7800X3D vastly outperforms the i9 when it matters and is at worst comparable.

So maybe not “budget”, but gamers certainly don’t have to spend top dollar for the best-of-the-best gaming performance, and they were still spending less than the competition and beating it handily.

If you factor in the 6 core variants then you really end up in budget range, which makes me wish these chips were more widely available.

Mass PC sellers will like it, as will certain workstation builders, since it excels in some areas while floundering in others. I can’t recall a CPU line that was so inconsistent in performance. In AI, rendering, and database programs it can offer top-notch performance; in MS Office, gaming, and web benchmarks it’s 12th-gen performance at times, like you said. It does have an improved iGPU, where the AMD APUs just lacked good CPU performance, and its better power efficiency means cheaper coolers for the mass sellers. For everyone else, there are better options.

It’s fantastic for top-end gaming value.

Too much demand, and better binning than in the Phenom II days of old.

I agree with basically everything Shiftydruid has said on this topic, but a single screenshot while standing in a city hardly constitutes a “benchmark”.

Also, he was comparing Intel vs AMD. I was simply talking about the benefit of 3D cache vs not having 3D cache.

Here are some benchmark numbers. They are comparing a 5900X vs a 5800X3D (Zen 3), but the benefit from 3D cache should be very much the same with Zen 4 (7800X3D, etc) and Zen 5 (9800X3D, etc).

http://73.231.25.135/5800x3d_compared.png

If the OP is really on a budget, they could get a 5700X3D and pair it with a $50 B450 motherboard, and still have a setup that is faster than ANY Intel CPU in WoW.

I wonder if the loss of hyperthreading is part of it

That might explain some of the performance issues with extremely multi-threaded workloads like Cinebench, but it certainly wouldn’t explain the bad performance in games.

The easy answer is yes, but let’s not forget it’s on a more advanced foundry node and has more P-core cache. Would Intel be able to hit the same frequencies if they included HT? How much power would the CPU demand? What kind of cooler would you need to dissipate that heat?

It might; some games can definitely use more than 6 or even 8 cores. We saw this manifest in models such as the older i5-8600K/9600K and i7-9700K.

The application of E-cores may not really be appropriate for gaming compared to old hyperthreading.

Willing to bet the games that struggle on the new chips are having issues trying to use E-cores: conventionally they’d get better performance from hyperthreading, only to fall flat on their face when the Windows scheduler doesn’t know what to do with the E-cores.
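One way to actually test that hypothesis: pin the game’s process to the P-cores only and see whether the stutter goes away. Here’s a minimal sketch using psutil; the core numbering is an assumption (which logical CPUs are the P-cores varies by chip, so check your topology first), and the PID is a placeholder.

```python
# Hypothetical experiment: restrict a game process to P-cores and see if
# the stutter goes away. Assumes logical CPUs 0-7 are the P-cores, which
# varies by chip; verify in Task Manager before trying this.
import psutil

P_CORES = list(range(8))                 # assumption: first 8 logical CPUs = P-cores

def pin_to_p_cores(pid):
    proc = psutil.Process(pid)
    print("before:", proc.cpu_affinity())
    proc.cpu_affinity(P_CORES)           # scheduler may now only use these CPUs
    print("after: ", proc.cpu_affinity())

# pin_to_p_cores(12345)                  # e.g. the PID of Wow.exe from Task Manager
```

Task Manager’s affinity dialog (or Process Lasso) does the same thing without code.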

We also saw those CPUs do better than the Intel 6700K/7700K and 8700K in certain games even though they had fewer threads, and they often had almost identical performance in gaming suites (7700K vs 8600K, 8700K vs 9700K). It’s not a core thing, it’s not an HT thing, it’s a performance thing. X3D does an excellent job getting information from the cache to the cores in apps that can take advantage of that fast transfer, while Ultra seems to struggle with it. I honestly hope Intel fixes the issues (most likely in future lines), as I do like having an iGPU, and a lack of competition doesn’t benefit anyone.

I remember, looking at old data, that an equally clocked 8700K always outperformed the 8600K; whether that came from the increase in cache (probably) or from hyperthreading depended on the game.

Regarding the 9700K outperforming an 8700K, that’s harder to gauge. Threads aren’t extra cores; hyperthreading is just an efficiency feature. In games where 6 cores simply aren’t enough for great performance, 8 actual cores are going to do better, hyperthreading or not. But like for like, HT on with the same core count usually resulted in better performance when applicable.

And as the tests on the new chips show, not all games suffer tremendously. Something is going on in the ones that do, and I think it’s going to end up being hyperthreading.

Another issue testers ran into was microstutter on the non-HT CPUs. Many games would hit high or max utilization on the 8600K (and later the 9700K), resulting in stutters, whereas they would not have the same issues on the 8700K or 9900K.

It became less of an issue on the 10900K because it had so many cores to start with, and many people disabled hyperthreading as a result.

Hyperthreading itself is just an efficiency feature, and while it does incur a minor overhead in pure single-threaded performance, I can see how it benefits some games, and a decent number of games were able to use it to improve performance.

In theory those E-cores should help with latency, and in practice, from the few tests I’ve seen, they do on previous generations. With Ultra? Maybe they do, maybe they don’t, and it’s a scheduler thing. I’m sure someone can test it out.