Would you bother upgrading from a 12700K to a 13700K?

Would you do it or would it not really be worth it?

Misc: Z690 mobo, 32GB DDR4, 6700 XT GPU, game at 1080p

I’m also kinda looking at the 14700K.

On another note, I’m wondering if I’d be OK staying with air cooling if I upgrade. I use a Noctua NH-U12S cooler with lots of Noctua case fans, and I don’t have any thermal issues with my OC’d 12700K. I’d really rather not upgrade my cooling if I can help it.


EDIT – switching platforms and going with the 7800X3D…


Do you play a lot of games besides WoW? If not, even the older AMD 5800X3D paired with a $50 motherboard will beat the 14700K and 14900K, and you could re-use your existing DDR4.

WoW really loves the massive L3 cache on AMD’s X3D chips, and nothing Intel has at the moment compares. Also keep in mind that most benchmarks for the latest Intel platforms use DDR5, so your performance would be hobbled in comparison by sticking with DDR4. The 5800X3D isn’t limited by DDR4 nearly as much, because the massive L3 cache acts as a buffer that gets used before the RAM does.
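To put rough numbers on that buffering effect: average access time is just a weighted mix of cache latency and RAM latency. Here's a back-of-the-envelope sketch (the ~10 ns L3 / ~80 ns DDR4 latencies and the hit rates are assumed ballpark figures for illustration, not measurements):

```python
# Average memory access time (AMAT) = hit_rate * L3 latency + miss_rate * RAM latency.
# The latencies and hit rates below are assumed ballpark numbers, not measurements.
L3_NS, DDR4_NS = 10, 80

for label, hit_rate in (("typical 32MB L3", 0.70), ("96MB X3D L3", 0.95)):
    amat = hit_rate * L3_NS + (1 - hit_rate) * DDR4_NS
    print(f"{label} ({hit_rate:.0%} hits): ~{amat:.1f} ns average access")
```

The higher the hit rate, the less the RAM behind the cache matters, which is exactly why the DDR4 handicap hurts an X3D chip less.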

The Intel chips also have horrible power consumption and heat output compared to the recent generation AMD chips. No significant heatsink, VRM, or power-supply concerns with a chip like the 5800X3D or 7800X3D.

But if you are absolutely married to that platform… I’d go straight for the 14700K or 14900K.


As a 5800X3D owner myself, I strongly agree with most of what you said. Only caveat is with considering the 14th generation. It’s very difficult to recommend the 14700K or 14900K unless they’re practically identical in price to their predecessors. Intel’s 14th “gen” (if you even want to call it that :P) is one of the most disappointing generations in a long, long time.

If your main focus is playing games - especially games like WoW - then an X3D chip can prove to be extremely good value. As mentioned, the 5800X3D in particular would let you reuse your existing DDR4 memory, and cooling would be much easier given the low power draw in comparison. Anything that could cool your 12700K would effectively be overkill for a 5800X3D or 7800X3D.


With only very rare exceptions, WoW is the only game I play on my PC. That said, multi-tasking performance is also a consideration as I keep literally 12+ tabs open in Chrome at all times and frequently bounce between the game and my tabs.

Thanks so much for your input and do continue to share whatever you think. I appreciate it!


Good info!

I’m not married to Intel, so I’m happy to switch to AMD if that’s the smartest choice. I’m just thinking through different things, like how the 5800X3D will handle my frequent bouncing back and forth between WoW and the 12+ Chrome tabs I always have open. Also, not having onboard graphics makes my anxiety wince a bit.

I’m also thinking about the 7800X3D, but if I’m not mistaken, that’ll necessitate purchasing new RAM in addition to a more expensive board.

Hmmm… thinking thinking… :thinking:

Not that I expect you guys to do all my research for me, but if I were to splurge and go ahead with a 7800X3D, what are some reasonably solid budget choices for a compatible mobo and RAM? And how much RAM would really be advised for an AMD system where I do a lot of multitasking?

Thanks for the input.

Yeah, that’s a complete non-issue. 8 cores is still plenty. It wasn’t that long ago that almost all mainstream CPUs were quad-core. WoW itself will almost never use more than 3-4 cores at any given time, so even with an 8-core CPU, you’ll still have half or more of your CPU available for background tasks while playing WoW. And of course, these CPUs have SMT, so those 8 physical cores show up as 16 logical cores, giving them even more headroom when juggling multiple tasks.

I’m running a 5800X3D at the moment, with a 6-monitor setup, and I multi-task like crazy. I use Firefox rather than Chrome, but right now I have 57 Firefox tabs open, plus several separate windows, and it’s a non-issue. Most modern browsers put inactive tabs to sleep, so running lots of tabs is more of a RAM issue than a CPU issue, and 32GB is plenty. Even when putting all of my side monitors to work while gaming, I’ve never felt that it was holding WoW back. And since one of my monitors is dedicated just to monitoring computer performance, it’s definitely something I keep an eye on.
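If you want to sanity-check that on your own machine while playing, here's a minimal sketch using the third-party psutil package (my assumption; any hardware monitor shows you the same thing):

```python
import psutil  # third-party: pip install psutil

# Print per-core load and RAM pressure once a second; Ctrl+C to stop.
while True:
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks for the 1s sample
    busy = sum(1 for load in per_core if load > 50)
    mem = psutil.virtual_memory()
    print(f"{busy}/{len(per_core)} logical cores over 50% load | RAM {mem.percent}% used")
```

Run it on a second monitor while WoW is up and you'll typically see only a handful of cores doing real work.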

I’m not sure why not having onboard video would be a problem. I actually run two video cards in my system: my RTX 4080 can only support four monitors, so I also have a small GTX 750 Ti installed just to handle the remaining two.

Keep in mind that as far as AMD CPUs go, there are also the 7900X3D and 7950X3D. Although these CPUs have more cores (12 and 16 respectively), only half of them sit under the 3D cache: each chip has two six- or eight-core CCDs that communicate over the Infinity Fabric, and only one CCD carries the extra cache. You’re at the mercy of software to assign the right tasks to the X3D cores, and any cross-CCD communication introduces a latency penalty. The 7800X3D, on the other hand, has a single 8-core CCD where every core is an X3D core, so you’re always on an X3D core and never paying the cross-CCD penalty. Those two reasons are a big part of why the 7800X3D is still considered the #1 choice for gaming among AMD CPUs despite “only” having 8 cores.

It’s also worth pointing out that none of Intel’s higher-end chips (such as the 13700K, 13900K, 14700K, and 14900K) has more than 8 performance cores. The additional cores beyond 8 are slower “efficiency” cores, which creates its own scheduling issue: making sure games get assigned to the P cores and not the E cores. Many people go so far as to simply disable the E cores completely and run these as 8-core CPUs just to avoid the mess. So yeah… obviously 8 cores is plenty.
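For what it's worth, the manual workaround is the same idea on either platform: restrict the game to the cores you want. A hypothetical sketch with psutil; the core numbering and the process-name match are assumptions on my part (which logical cores are P cores or X3D cores varies by CPU and OS, and setting affinity isn't supported on macOS):

```python
import psutil  # third-party: pip install psutil

# ASSUMPTION: logical cores 0-15 are the 8 "good" cores plus their SMT/HT
# siblings (e.g. the P cores on a 13700K, or the X3D CCD on a 7950X3D).
PREFERRED = list(range(16))

for proc in psutil.process_iter(["name"]):
    name = (proc.info["name"] or "").lower()
    if name.startswith("wow"):  # hypothetical match for the WoW client process
        try:
            proc.cpu_affinity(PREFERRED)  # restrict scheduling to these cores
            print(f"Pinned PID {proc.pid} to cores {PREFERRED}")
        except psutil.Error:
            pass  # e.g. access denied
```

Tools like Process Lasso do the same thing with a GUI, but on a 7800X3D you shouldn't need any of it.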


It isn’t, until it is.


What I meant was, I’m assuming you were talking about having a backup in case your main GPU fails or something. If that happens, onboard video isn’t realistically going to be a useful backup for gaming. There are video cards you can buy for $10 on eBay that will serve as a backup just to boot the computer into Windows. Or, I’d assume you already have old cards in your spare-parts bin if you’re building your own computers.


There are multiple reasons why it’s desirable to have a chip with onboard graphics, the most obvious being that if your dedicated GPU conks out and, for whatever reason, you don’t have another GPU in the closet to grab, you can still use your computer until you get another dedicated card.

There are times I’ll have a couple of extra cards in my closet, and there are times when I do not. Things happen. Having onboard graphics is protection against my own potential carelessness: it ensures that in the event of GPU failure I won’t have an ‘oh darn, I was sure I had another GPU in the closet. Crap!’ moment.

I also like firing up a new build with only the CPU first and getting things set up as is, before adding in the dedicated card. It’s a preference.

It can also make troubleshooting easier in some instances.

I have absolutely no thought that onboard graphics would ever be a suitable replacement for a dedicated gpu, so that isn’t anything I have in mind at all.

I’ve had F chips before. It was never a problem per se to go that route, but it isn’t ideal and is not preferred. Would I go with a chip again that doesn’t have onboard graphics? Absolutely, if I felt good about the choice for whatever reason. It just isn’t my preference.

No, it’s not worth it.


Yeah, I think you’re right. I’ve come to the same conclusion. When I made this thread I was contemplating just upgrading my CPU; now I have parts ordered for an entire platform switch from Intel to AMD.

The X3D CPUs just do better in WoW for some reason. Maybe it’s optimization issues or something else, but they really do perform better in WoW.


In case you’re still looking at this thread: the general RAM pairing recommendation I’ve seen for the 7800X3D is DDR5-6000 with CL30 timings. (I can’t remember all of the numbers, but you can probably google it.)
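For context on why that particular spec keeps coming up: first-word latency in nanoseconds is roughly 2000 × CAS latency ÷ transfer rate (MT/s). A quick comparison sketch (the kit list is just illustrative):

```python
# First-word latency (ns) ≈ 2000 * CAS latency / transfer rate in MT/s.
kits = [("DDR4-3600 CL16", 3600, 16),
        ("DDR5-6000 CL30", 6000, 30),
        ("DDR5-6400 CL32", 6400, 32)]
for name, mts, cl in kits:
    print(f"{name}: ~{2000 * cl / mts:.1f} ns")
```

Past 6000 you're mostly trading looser timings for bandwidth, and 6000 is also where AM5's memory controller still happily runs 1:1 with the RAM, which is a big part of why DDR5-6000 CL30 gets cited as the sweet spot.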

I’m pretty loyal to Corsair for RAM, but it doesn’t really matter what you buy. If something like G.Skill is on sale, you can go with it.


I ended up going with these G.Skill sticks.

https://www.amazon.com/gp/product/B0BF8FVLSL/

I’m not necessarily into the RGB stuff, but the sticks seem good and have good reviews.

WoW (and a lot of other games) really likes the extra L3 cache on the X3D CPUs.

It probably has to do with repeatedly checking the same spots in memory: having more cache lets the CPU keep more of that area of memory (if not the whole thing) on-die, so it doesn’t have to wait on the memory controller or the bus to get the data it’s looking for. But this is just my theory.
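You can crudely demonstrate that on any machine. A rough sketch (pure Python, so interpreter overhead blurs the effect, and exact numbers will vary wildly by system; it's illustrative only):

```python
import time

def random_touches(buf, touches=2_000_000):
    # Read pseudo-random slots in the buffer. If the buffer fits in cache the
    # reads stay on-die; if it doesn't, the CPU keeps waiting on trips to RAM.
    mask = len(buf) - 1                 # requires a power-of-two buffer size
    idx = total = 0
    for _ in range(touches):
        idx = (idx * 1103515245 + 12345) & mask   # cheap LCG walk
        total += buf[idx]
    return total

for size_mb in (4, 512):  # fits in any modern L3 vs. exceeds even a 96MB X3D cache
    buf = bytes(size_mb * 1024 * 1024)  # zero-filled, power-of-two length
    start = time.perf_counter()
    random_touches(buf)
    print(f"{size_mb} MB working set: {time.perf_counter() - start:.2f} s")
```

Same number of reads in both cases; the only difference is whether the working set stays on-die.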


Intel’s lineup has been mind-numbingly terrible since like the 3770K came out. They just started re-re-re-re-re-releasing crap because AMD’s FX series was crap. When AMD started making real CPUs again, Intel never picked up the pace and still hasn’t. They just keep stacking on cores so they look sorta equivalent from a distance, and they use insecure shortcuts to post OK-ish benchmark numbers on launch week, then immediately patch them out once the review cycle ends. Their current solution to competing is just turning the CPUs up to 500W. Horrible company.


Yikes… are they really that bad?

Intel is very much behind AMD technologically for consumer desktop CPUs.

They are still on monolithic dies, and they have to blast power to keep performance up with AMD (and even Apple, in some cases).


I can remember Intel generally having a reputation for producing the top chips, year after year, decade after decade. It was only really when the Ryzens started coming out that AMD seemed to finally be taken seriously. Prior to Ryzen, AMD was really only the ‘if you can’t afford Intel’ choice. Times change, I guess.

Yep.

Two examples of how big the efficiency gap is between Intel and pretty much everyone else (except maybe NVIDIA): the 11700K takes 43 watts to match the single-core performance of the Apple M2, which uses less than 5 watts on a single core; and in gaming, the 14900K needs to blast about 200 watts to almost keep up with the 65 watts the 7800X3D uses in the same game (Cyberpunk, in the case of these numbers).
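Or, just doing the division on those same quoted figures:

```python
# Ratios from the numbers quoted above (same performance, different power).
print(f"11700K vs Apple M2, single core: ~{43 / 5:.1f}x the power")   # ~8.6x
print(f"14900K vs 7800X3D in Cyberpunk:  ~{200 / 65:.1f}x the power") # ~3.1x
```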


Why is that, exactly? Why is Intel so inefficient with their chips now? That kind of difference just seems really, really big. I don’t need a super technical breakdown; some general understanding would be cool though. Maybe you can explain it in common enough terms that even a Swarf can understand.