Intel circling the drain

Fatality

Flawless victory


When the 9950X3D arrives it will be even worse. It will DECIMATE the 14900K even in workstation tasks, which is the only place it currently leads.


My 14900k gives me more frames in WoW than the 7800x3d did and it was cheaper. The 14900k keeps me at 120 FPS almost all the time, including in town. The 7800x3d had issues doing the same. 100 fps was the average. This is at 4k with a 120 fps cap and a 4090. I don’t have any experience with other games. Aren’t these new processors you’re talking about for AI/servers?

I don’t like or dislike either. I’m all about whatever works, achieves the goals you want and is cheaper.

My only experience is WoW and we all know how CPU sensitive it is at higher frames. The engine just might like the 14900 more for whatever reason. It was also my first AMD CPU. Maybe there was something I didn’t do. I doubt it, and I shouldn’t have to do anything. I read something about messing around with the cores. Screw that. I returned it and got the build below. Applied BIOS fixes ASAP for the issues the 14th gen had.

Win 11
14900k with Corsair H150i cooler
Nvidia 4090 reference (best coolers now)
Strix z790e
Vengeance 64gb 6400 ram
Samsung 980 and 990
A big ole case so I can work in it easily. Fractal XXL I think.

https://imgur.com/a/cRwgWGL My precious.

I guarantee that isn’t the case. WoW is one of a subset of games that shows extreme benefit from the large L3 cache on the Ryzen X3D CPUs. Almost no games benefit more from the large cache than WoW.

On my AM4 system, I began with a 3900X and then upgraded to a 5900X. Then I later upgraded that 5900X to a 5800X3D. The upgrade from the 5900X to the 5800X3D was massive in WoW, and much larger than the upgrade from the 3900X to the 5900X, despite the 5900X and 5800X3D both being Zen3 CPUs and despite me actually losing 4 CPU cores by making that change.

Now I just upgraded to a 9800X3D and the performance increase was once again huge, benefiting from 2 generations of performance increases from my 5800X3D as well as the extra clock-speed enabled by the 9800X3D moving the 3D cache underneath the CPU cores.

But really, even the 5800X3D would still have been faster than the 14900K.

Here is one of Intel’s own marketing slides, from the 13900K launch:

http://73.231.25.135/Intel13seriesWoW.png

Intel, in this marketing slide, actually chose to represent the 5800X3D’s performance differently from the other 3 CPUs, using an awkward horizontal line, so as to not make their own CPUs look too terrible. You can see that even Intel admits, on their own slide, that the 5800X3D is much faster than the 13900K in WoW. Now consider that the 14900K is simply a slightly tweaked re-release of the 13900K that is maybe 3% faster at most, and yeah… obviously the 5800X3D is still faster, and the 7800X3D is faster than the 5800X3D, so…

And that’s not even touching on the other issues, like the fact that the 14900K uses a TON of power and puts out a TON of heat compared to modern Ryzen CPUs. There is also the ongoing scandal where large numbers of these 13 and 14-series Intel CPUs are dying. The BIOS updates that are supposed to help with that also reduce performance, meaning that, for example, a system with the latest BIOS that was running a 13900K would actually put up LOWER numbers compared to what was seen in the marketing slide I linked (which was from long before the recent BIOS updates).

And perhaps worst of all, we don’t even know for sure that the recent BIOS updates actually fully fix the underlying issues that are/were causing the 13 and 14-series CPUs to fail. Intel didn’t know the issue existed at all until some time had gone by, because the damage occurs over time and is cumulative, so until we have another year or more of data, we won’t know whether these new BIOS updates are fully effective either. Until there’s a chance to gather new long-term data, it’s just a “cross your fingers and hope it works” fix at this point.

Yup, the 13 and 14-series parts are pretty cheap at this point, and likely only going to get cheaper - because no one wants them.

SW Jedi Survivor gets a massive performance increase from the added X3D cache.

It’s not about cores, it’s about performance.

You also increased RAM frequency by going from DDR4 to DDR5.

I didn’t say that WoW was the only game that showed great benefit, only that it’s among the games that show the greatest benefit. Improvement in WoW can be as great as 60% whereas in other games the benefit can be more like 5-10%.

There are currently no games that use more than 8 cores, and WoW certainly does not, but I’ll admit that I miss the extra headroom when running multiple VMs. It was still worth it. The higher-core parts have their place for the people who need the extra cores.

The 3D cache actually greatly decreases the importance of RAM speed. I don’t feel that the DDR4 3600 that I was running was actually holding my 5800X3D back much. For example, a 5800X3D with DDR4 would still be much faster in WoW than a 9700X with DDR5 (9700X being basically a 9800X3D without the 3D cache).
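If you want the intuition behind that in rough numbers, here’s a quick back-of-envelope sketch (Python, with invented hit rates and latencies purely for illustration, not measured values): the bigger the L3, the smaller the fraction of memory accesses that ever reach RAM, so RAM latency gets weighted by a shrinking miss rate.

```python
# Back-of-envelope average memory access time (AMAT) sketch.
# All numbers below are illustrative guesses, not measurements.

def amat(l3_hit_rate, l3_latency_ns, ram_latency_ns):
    """Average wait per memory request that reaches L3."""
    return l3_hit_rate * l3_latency_ns + (1 - l3_hit_rate) * ram_latency_ns

# Hypothetical chip with a small L3: misses to RAM are common,
# so RAM latency dominates and faster RAM helps a lot.
small_l3_slow_ram = amat(0.70, 12, 80)
small_l3_fast_ram = amat(0.70, 12, 65)

# Hypothetical X3D-style chip: the huge L3 catches most accesses,
# so the RAM term is multiplied by a much smaller miss rate.
big_l3_slow_ram = amat(0.95, 14, 80)
big_l3_fast_ram = amat(0.95, 14, 65)

print(f"small L3: {small_l3_slow_ram:.1f} ns -> {small_l3_fast_ram:.1f} ns with faster RAM")
print(f"big L3:   {big_l3_slow_ram:.1f} ns -> {big_l3_fast_ram:.1f} ns with faster RAM")
# The faster-RAM gain shrinks from ~4.5 ns to ~0.8 ns per access, which is
# the intuition for why DDR4 3600 wasn't holding the 5800X3D back much.
```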

It’s not really a “core thing”. After all, a modern 4C/8T CPU can run rings around an older 8C/16T CPU in gaming. It’s a question of how well your CPU performs in a heavily nested and looped game engine that dumps the majority of the rendering work onto just a few cores while issuing draw calls and other work. As more demanding work is needed, higher performance is required. So it’s never a matter of “X number of cores is enough or not enough”, but rather “X amount of performance is enough or not enough”.
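To put toy numbers on that (another rough sketch in Python, with made-up workload figures, not real profiling data): if the engine funnels most of each frame through one or two threads, frame time is dominated by that serial chunk, so a chip with fewer but faster cores comes out ahead.

```python
# Toy frame-time model: invented numbers, just to illustrate why per-core
# speed beats core count when a game engine is main-thread bound.

def frame_time_ms(serial_ms, parallel_ms, core_speed, cores):
    """Serial work runs on one core; parallel work spreads across all cores.
    Times are for a baseline core; core_speed is a relative multiplier."""
    return serial_ms / core_speed + parallel_ms / (core_speed * cores)

# Pretend each frame needs 10 ms of main-thread work (game logic, draw
# calls) on a baseline core, plus 6 ms of work that scales across cores.
old_8c = frame_time_ms(10, 6, core_speed=1.0, cores=8)   # older, slower cores
new_4c = frame_time_ms(10, 6, core_speed=1.6, cores=4)   # newer, faster cores

print(f"old 8C/16T: {old_8c:.2f} ms/frame -> {1000 / old_8c:.0f} fps")
print(f"new 4C/8T:  {new_4c:.2f} ms/frame -> {1000 / new_4c:.0f} fps")
# The serial main-thread chunk dominates, so the chip with faster cores
# wins despite having half as many of them.
```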

Kind of gets back into the CPU thing, but the reduction in latency from all that extra cache is extremely beneficial for gaming performance. I haven’t seen any independent benchmarks with the 9700X in WoW, but from what I’ve seen in other games that do benefit from the X3D cache, the 9700X still outperformed the 5800X3D even in those games. The only game where the two CPUs matched one another was SW Jedi Survivor, mentioned above.

This game (Baldur’s Gate 3) is probably the most representative of WoW performance among the games that are commonly benchmarked. The 9700X is way down the list, far below even the 5600X3D:

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN%20CPU%20Benchmark%20_%20Baldur%27s%20Gate%203%20_%201080p_Medium%20_%20GamersNexus-4x_foolhardy_Remacri_6.png.webp

Pretty much the same story in Dragon’s Dogma 2:

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN%20CPU%20Benchmark%20_%20Dragon%27s%20Dogma%202%20_%201080p_High%20_%20GamersNexus-4x_foolhardy_Remacri_3.png.webp

The 9700X does better in FFXIV Dawntrail, but the 5800X3D still wins by a respectable margin:

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN%20CPU%20Benchmark%20_%20FFXIV_%20Dawntrail%20Benchmark%20v1-4x_foolhardy_Remacri_6.png.webp

The 5800X3D doesn’t perform as well in Starfield compared to newer X3D CPUs, but still easily beats the 9700X:

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN%20CPU%20Benchmark%20_%20Starfield%20_%201080p_Low%20_%20GamersNexus-4x_foolhardy_Remacri_6.png.webp

Another game (F1 24) where the 9700X does decently but is still beaten by the 5800X3D:

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN%20CPU%20Benchmark%20_%20F1%2024%20_%201080p_High%20_%20GamersNexus-4x_foolhardy_Remacri_4.png.webp

I’ll be honest with you, I’m not the biggest fan of GN. They have been caught in the past showing performance issues that other sites like Anandtech and Eurogamer could not replicate. I’m looking at their numbers for F1, BG3, and Starfield, and other sites like TPU, Toms, and HB don’t have the two CPUs in the same positions that GN does. They also don’t run the benchmarks at medium to high settings as GN does; the other sites tend to be very high to ultra. It’s an interesting take from GN and I’ll leave it at that. I’m not doubting their numbers, but I do find them unique.

I don’t think GN uses fresh results every time. If they use old data it might not be useful.

I like HUB

If you look at the benchmark results, every single entry has a date next to it. In the links I posted, all of the results are from October 2024 (indicated as “10/24”). I’m not sure how much more transparent you can get in that respect.

There was a time when they did use archived data; they must have been called out on it previously.

They have really drifted towards clickbait results. I remember they posted results with, I think, an 8600K in a Far Cry game that neither DF nor Anandtech were able to replicate. I know for a fact they pissed off some cooler reviewers when they announced they would start doing HSF reviews and claimed that all other reviews were incorrect compared to their new methods. Then they entered PSU reviews, and one of the top PSU reviewers posted on his blog that just because someone buys a bunch of PSU review equipment doesn’t mean they know how to use it or read it.

Seven years is a long way to come from one bad review. On the cooler side of things, they’ve only been doing that for ~4 years now, and have expanded their arsenal of tools for tracking and measuring their metrics.

I can’t speak for PSUs, though. I usually glance at the latest tier list if I’m in the market for one, but my old 1000W Cooler Master PSU is around 16 years old and still running fine in that PC.

Oh, it’s hardly the only one. Don’t get me wrong, when Patrick did the written case reviews they were excellent. GN then moved to a more video-based format, making Steve the brand, similar to Linus, but Linus has personality where Steve does not. GN relies more on clickbait headlines (obviously they’re not the only one) to get views, and they need to pay their bills. The clickbait headlines have rubbed people the wrong way at times, like when they went after TPU. End of the day, people should look at several reviews to make their decisions and not just rely on one person for their information.

I do. I use 3-4 channels for tech, and HUB and GN overlap considerably in performance reviews.

I stopped watching Linus years ago, and it wasn’t helped by how much of a d-bag he was in the Billet Labs situation. He’s just more like a Mr. Beast of tech, rather than a reviewer who dove into as much of the benching as I wanted.


Somewhat related, I finally updated the BIOS on my 2-year-old MSI Z690 MPG Edge Wifi DDR5/13700KF… I didn’t experience any instability/degradation issues, but I figured I may as well.

No noticeable difference, except I lost about 2.5% single-core and 1.5% multi-core in R23.

Yay I guess

I kind of wish it would blow up so I could have an excuse to upgrade.

I think you secretly want to build an Ultra 7 system. C’mon, have faith in Intel with their forthcoming OS and BIOS fixes. /s

i’m not THAT much of a masochist
