I was going off the idea that the X3D-to-X3D jump would be a similar uplift to the X-to-X one.
But if there’s a substantially bigger improvement, we’ll see.
So, in the Gamers Nexus AMD lab tour, they talked about how this was an idea they tested for a theoretical 5950X3D, but they ran into the same problem of cross-CCD cache hits causing a performance drop, which led to the decision to ship the 7950X3D the way they did.
Yep, you are 100% right. I know you know what was meant by that; however, I'll go deeper and explain it for anyone else reading this thread…
What was happening was cross-CCD latency, i.e. CCD0 relaying information to CCD1 and vice versa. So a 7950X3D plus Process Lasso pretty much became a boosted 7800X3D, which is why in all the tests without Process Lasso the 7800X3D outperformed the 7950X3D every time. The 7950X3D isn't perfect, and if your game wakes up even one core of CCD1 for a moment, you take a performance loss.
The 7800X3D obviously doesn't have this issue, being a single-CCD part. With that said, in tests where Process Lasso was used to isolate the game to CCD0 and run all other tasks on CCD1? The 7950X3D outperformed the 7800X3D every time. It has more X3D cache, which is huge in a game like WoW, and its clocks boost higher.
On top of this, you have 8 cores 100% dedicated to gaming and 8 other cores for everything else you'd want to do on a second monitor, or just for running background tasks. So it's still better than the 7800X3D even without the extra cache and higher boost clocks, once you factor that in. Keep in mind this is ONLY true when using it with Process Lasso, and most reviewers did NOT use Process Lasso in their reviews, including Gamers Nexus and Hardware Unboxed. So, once again, there is sadly a LOT of misinformation out there on the 7950X3D.
With that said, I get why a lot of reviewers didn't use it: they know that doing so, and making it feel like a requirement, would be too much for the common consumer, and I don't disagree with that logic in any way. It 100% would be, which is why I still recommend the 7800X3D to the majority of users.
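For anyone curious what "isolating the game to CCD0" actually does under the hood: Process Lasso sets the process's CPU affinity mask, and Windows exposes the same mechanism through `start /affinity <hexmask>`. Here's a minimal sketch that computes such a mask, assuming CCD0 shows up as logical CPUs 0-15 (8 cores x 2 SMT threads, the usual Windows enumeration; the actual numbering can differ per system):

```python
# Compute the hex affinity mask used by Windows' "start /affinity" command,
# which pins a process to a set of logical CPUs.
def affinity_mask(cpus):
    """Return a bitmask with one bit set per logical CPU index."""
    mask = 0
    for cpu in cpus:
        mask |= 1 << cpu
    return mask

# Assumption: CCD0 on a 7950X3D is enumerated as logical CPUs 0-15.
ccd0 = affinity_mask(range(16))
print(hex(ccd0))  # 0xffff
```

With that mask, `start /affinity FFFF Wow.exe` from a command prompt pins the game to those CPUs for that launch; Process Lasso's value is making the rule persistent and automatic.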
Given that I'm only using a 4080 for a 4K main monitor, the issues with the second monitor seem to happen whenever I can't force the web browser/Discord/whatever on the second monitor to use the iGPU and it tries to run everything off the 4080, not because of the increased CPU load.
That was more of a message to streamers and people who do a lot of other things on a second monitor while gaming. Of course you’re not going to spend the money on a 7950x3D over a 7800x3D. That would be completely foolish if you are a pure gamer.
I know I said that in reply to one of your statements, but I was speaking more generally; the response wasn't directed at you, it was directed at anyone reading the thread. The point I mentioned is just a nice perk of having it if you need the 16 cores outside of gaming.
I do content creation nowadays and I definitely need cores outside of gaming; it makes rendering my projects a lot faster. So if you are buying the 7950X3D anyway, then yes, it's a nice perk being able to make those eight cores 100% dedicated to your game and nothing else.
Yeah, and among the non-gaming limits I've bumped into, weirdly, 32 GB of RAM has been a bigger limiter for me than 8 hyperthreaded CPU cores.
Thank you for correcting me on that; I don't want my point to be misunderstood. It's not enough of an uplift over the 7800X3D to justify its price tag, and I didn't want it to come off as me saying to buy it over the 7800X3D. That's not what I was saying; I just wanted to clarify that.
If you are a gamer, the 7800X3D is 100% what you want. However, if you do some type of media creation or code compiling, something you need the 16 cores for outside of gaming, then yes, the 7950X3D has some nice perks for you. But that's for someone who was looking at a 16-core CPU already. And if you are more Adobe-focused, Intel has some nice options as well.
Just like how the 3090 tends to be a better video-production card than the 4080 because of the extra VRAM.
Even gaming at 4K, I haven't run into 16 GB of VRAM being a limiting factor (I think the highest I've seen a game use is 12-13 GB).
If you are asking me.
Patriot Signature Line DDR4 Memory 16GB (2 x 16GB) 3200MHz (2 Rank Double-sided module) - PSD416G32002
I was actually asking Shifty about the DDR5-7800 kit, to try to test whether memory bandwidth or latency is more important to WoW.
Our current theory is that it's memory latency (the time between the CPU requesting data and receiving it) that matters more than bandwidth (how much data can be transferred in a given amount of time).
To match the latency of the kits we were talking about, you'd want something like 16-16-16-36 timings (if you stay at DDR4-3200). Not that I'd recommend replacing your only memory now (your computer has other components where an upgrade would have a bigger impact), but if a CPU upgrade also requires a memory upgrade (i.e., DDR5), then you'll want to look at low-latency memory if possible.
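The back-of-the-envelope math for comparing kits across generations: first-word latency is the CAS latency (in cycles) divided by the memory clock, which is half the transfer rate since DDR transfers twice per clock. A quick sketch, with the DDR5-7800 kit's CL36 timing being my assumption (the thread didn't state the kit's timings):

```python
def first_word_latency_ns(cas_latency, transfer_rate_mts):
    """CAS latency in nanoseconds: CL cycles at the real memory clock,
    which is half the transfer rate (DDR = two transfers per clock)."""
    clock_mhz = transfer_rate_mts / 2   # e.g. DDR4-3200 runs at 1600 MHz
    cycle_ns = 1000 / clock_mhz         # duration of one clock cycle
    return cas_latency * cycle_ns

print(first_word_latency_ns(16, 3200))            # 10.0 ns (DDR4-3200 CL16)
print(round(first_word_latency_ns(36, 7800), 2))  # 9.23 ns (DDR5-7800, assuming CL36)
```

So a fast DDR5-7800 kit can match or slightly beat DDR4-3200 CL16 on latency while also delivering far more bandwidth, which is why CL matters as much as the headline transfer rate when shopping.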
Thank you. I’m learning a lot from this thread.
That's why I posted those long, detailed responses. It wasn't for Capslock; it was for people reading this thread.
Update: the new monitor is in… running much better than expected. Around 60 FPS in Valdrakken at native UHD, graphics settings at 3. I'm shocked how well my rig is handling it; I'll mess around with settings some more.
I'm sure the GPU is holding you back a bit.
My Razer Blade, with a similar CPU and a laptop 3070, is able to push a tweaked graphics setting of 7 at 3440x1440, which is just a step below 4K.
Eh, Valdrakken is an absurdly CPU-bound area, especially if you have environment detail or view distance turned up at all.
On my setup, I'm CPU bound in Valdrakken at 4K with a desktop 4080.
Not disagreeing, just comparing my laptop since its CPU is similar to the OP's.
My previous laptop, which had a 6700HQ and a 1060, struggled a bit in Shadowlands. I believe I was running at a 4/5 setting with that at 1440p.
I'd advocate for a stronger CPU as well as a GPU if you're serious about 4K, but this depends on your framerate doing the content you want. I upgraded from an i7-8700K to a 7800X3D, since I primarily raid, and that's where framerate is hit the hardest (CPU-wise) outside of places like Valdrakken.
However, a GPU will make the bigger dent. WoW is not ultra GPU-intensive, for the most part. Ray tracing will hit your FPS pretty hard, and if that's a feature you want, I'd advise going Nvidia; however, you pay a premium to get it.
My 6700 XT handled the GPU-side of things very well on 1440p for WoW, but I’ve since upgraded to a 7900 XT for 1440p.
I don't know what your budget/goals are, but for consistent 4K gaming, I don't advise getting anything less than 16 GB of VRAM. 12 is doable, but some newer games will go over the VRAM limit and demolish performance. Nvidia is a bit of a pitfall here, as a lot of their mid-range cards still come with 8 GB of VRAM for some reason, so make sure you don't buy one locked to that. (There are a few cheaper cards with 16 GB variants now, so don't accidentally buy the 8 GB version; they're ripoffs.)
WoW isn't a commonly benchmarked game, but you can at least find performance benchmarks for all modern (and even some legacy) graphics cards in modern games. Channels like Gamers Nexus and Hardware Unboxed are robust in their testing, while others like Ancient Gameplays and Daniel Owen cover a smaller scope of GPUs (as they're small and independent, so what they can acquire is sometimes more limited).
Step 1 in this conversation should be asking about budget. lol
People are talking about and suggesting the current top-end CPUs and high-end GPUs to a person who is only interested in playing WoW.
lol
That’s like recommending a person buy a $100k pickup when they only drive once a week to get groceries.
And don't get me started on "this high-end CPU is king for WoW over that high-end CPU"; that's meaningless, and in reality 98% of the world won't notice the performance difference between any of the top 4 CPUs as long as they are equally paired with a good GPU and RAM. WoW is heavy on the CPU, that is true. But the GPU matters a lot, and I think some in this thread underplay it. Run the game on a very low-end GPU from many years ago, even with a 14900K, and tell me the GPU doesn't matter. Years ago I had a build where my GPU was way older than the CPU (by about 2 years); I upgraded the GPU, touched nothing else, and it vastly improved my WoW playing experience.
Just like the days when people posted "350 FPS is noticeable over 200 FPS" bull.
For bragging rights you compare benchmark numbers, but in a game like WoW, if you have any CPU and GPU combo that ranks "beyond mid," you won't notice the difference; you'll be fine. WoW is not a high-spec game, even with full eye candy on.