WoW is ridiculously CPU dependent

No kidding on the overkill. I'm using 16GB of DDR4 at 2666 and I can play games like Doom Eternal with no problem. WoW compared to some games nowadays is like playing on an old Atari 2600. Doesn't need much.

Insignificant and largely irrelevant improvements insofar as this game is concerned… but sure, there will technically be improvements.

Granted, you won’t actually be able to notice these improvements and will have to run benchmark tests to even know for certain that a difference was made… but sure. Improvements.

Actually, the game has had multi-core awareness and capability for a very long time, since the Pandaria days. What happened in BFA was further segregation of processes within the game, making it so the main thread didn't need to be in sync with things such as sound processing and UI elements.

That’s why it was labelled as “improved multicore support”

While that is true, there's a difference between the game running and the game running well. That's what I was alluding to in my post above yours with the line about it taking them until BFA to utilize more than a single core of a processor efficiently. WoW has always been a game that ran on pretty much everything, but ran well on pretty much nothing. It's less of a problem now than it used to be, but by design WoW runs fine on everything at a minimum; the trade-off is that the same framework makes it ridiculously demanding to run well on anything.

It also doesn't help that when they try to update the minimum requirements, people throw a fit and they have to work with that. Shadowlands was going to require an SSD and, God, people went ballistic. My question on that is: who the hell DOESN'T have an SSD or an NVMe drive these days? They aren't exactly as expensive as they used to be. I think I got mine in 2012, maybe 2013, and it was a lot pricier than it is now. They're like 100 dollars for a whole terabyte today; I think when I got mine it was half the size and double the price of what they are now.

According to some Computerworld article from 2015, in 2012 SSDs were 99 cents per GB. By 2015 it was 39 cents per GB.
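Those per-GB figures line up with the "half the size, double the price" memory above; here's a quick sanity check on what a terabyte would have cost at each quoted price (just arithmetic on the numbers from the post, nothing official):

```python
# Cost of 1 TB (1000 GB) at the quoted per-GB SSD prices
cost_2012 = 0.99 * 1000   # $0.99/GB in 2012 -> $990 for a terabyte
cost_2015 = 0.39 * 1000   # $0.39/GB in 2015 -> $390
print(f"2012: ${cost_2012:.0f}, 2015: ${cost_2015:.0f}")
```

So a 1TB drive would have run close to a thousand dollars in 2012 versus roughly a hundred today.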


Ah fair enough. Either way it was basically focusing hard on a single core and barely using the others.

Because the main thread was in lockstep with other processes, so it had to wait; then the addition of low-overhead draw instructions really helped.
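The decoupling being described can be sketched in a few lines. This is a minimal toy in Python (not Blizzard's actual code, and the real engine is C++): the point is just that the main loop hands events to a worker thread via a queue and moves on, instead of waiting in lockstep for things like sound processing to finish.

```python
import queue
import threading

played = []                  # stand-in for the audio that actually got mixed
audio_queue = queue.Queue()  # main thread pushes events here and moves on

def audio_worker():
    """Drains sound events on its own thread so the main loop never waits on audio."""
    while True:
        event = audio_queue.get()
        if event is None:        # sentinel: shut down
            break
        played.append(event)     # stand-in for mixing/playing the sound

worker = threading.Thread(target=audio_worker, daemon=True)
worker.start()

# Main "game loop": enqueue sound events without blocking on them.
for frame in range(3):
    audio_queue.put(f"footstep_{frame}")

audio_queue.put(None)
worker.join()
print(played)  # ['footstep_0', 'footstep_1', 'footstep_2']
```

Before this kind of split, the main thread effectively did the `played.append` work itself, which is why one busy core capped the frame rate.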

The game isn't as intense on the GPU because of how things are run by the game's engine, and because of improvements to APIs like DirectX, which let a lot of the load shift onto the CPU. This game used to be regarded as godly when it came to performance, but now each generation/expansion requires a bit more. You've gotta really rely on M.2 SSDs, the newest CPUs, and sometimes the GPU when it comes to resolution, because 6GB of VRAM used to (at the time) handle ALL things at the highest settings until you cranked up the resolution. The CPU is always needed to help calculate everything that's viewed at once each frame.

I remember back when I was playing Cataclysm on my laptop; it didn't handle it well until I built my first PC, and then that problem was GONE. Then came Pandaria and boy, I was given the ultimatum to upgrade my GPU, because my CPU was not the issue at the time.

I’m still running an i7-3770k. Ivy Bridge is getting old as hell now. I don’t have any FPS issues, but I have a GTX 1080 which I think is still pretty good considering its age. Running at 1440p with like 8/10 graphics.

OP, RAM timings don't matter that much on Intel. Not like they do on Ryzen, anyway. However, RAM speed is crucial with Intel CPUs.

Intel CPUs drink up RAM speed like water in the desert. I would heavily recommend putting in a 4000 MHz or faster kit. It made a pretty big difference when I was on Intel.

Also, there's a specific section for threads like this, called off-topic hardware and gaming. A lot of us tech nerds monitor that section, and I would highly recommend moving this thread there.

Microsoft is CPU.


How many frames will WoW get with the RTX 4090?

Why hasn’t this been moved to the hardware forum?


To be fair though, this gen of consoles is pretty respectable vs. current PC hardware. IIRC the PS5 is roughly equivalent to a PC with a Ryzen 3700X and RTX 2070, which isn't cutting edge but nothing to sneeze at. Probably a good deal better than what most WoW players are running, especially taking those consoles' 5,500 MB/s PCIe 4.0 storage into account. Add in that there isn't a huge fat operating system like Windows and god knows what else sucking up resources in the background, and it's easy to see how games can look great on them.

WoW is in dire need of optimization though, no question there.


Absolutely, but WoW looks bad even up against PS4 games lol


Take out your 1660 and use integrated graphics and see how it runs. lol

Why would you want WoW to be more GPU dependent when GPU prices are so ridiculous and supply is so scarce?

Not really. If OP or anyone else is concerned about lower FPS during raids or any high unit-count scenarios (towns, hubs, PvP), I wouldn't call getting 10-15 more fps added to your minimum frame rate insignificant; it's way more significant than having a higher average fps (though it does directly raise that value too).

Improvements to minimum fps are the most impactful gains you can get in a cpu limited scenario.
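To put rough numbers on why minimum-fps gains matter more: what you actually feel is frame time, and the same fps gain saves far more time per frame at the low end. A quick back-of-the-envelope calculation (the 30→45 and 100→115 fps figures are made up for illustration):

```python
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

# Hypothetical raid-night minimum improving from 30 to 45 fps:
low_end_saving = frame_time_ms(30) - frame_time_ms(45)     # ~11.1 ms per frame
# The same +15 fps on top of an already-high 100 fps average:
high_end_saving = frame_time_ms(100) - frame_time_ms(115)  # ~1.3 ms per frame
print(round(low_end_saving, 1), round(high_end_saving, 1))
```

Same +15 fps, but nearly ten times the per-frame improvement where it hurts most, which is why raising the floor is what you actually notice in a raid.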


I think there was something set up wrong with your old system if it was only doing 25 fps in a raid on the lowest settings. If your mobo went, that probably proves it.

77 posts and I'm still not sure if this game is actually a CPU fiend. Can someone confirm with statistics? It seems silly that the game would be destroying modern-day CPUs unless you have 5 million addons running.

You don't have to be at 100% utilization for something to be CPU bound.
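Right — on a multi-core CPU, one saturated thread can cap the frame rate while the overall utilization number looks low. A toy illustration (the per-core readings are made up):

```python
# Hypothetical per-core utilization: main game thread pegged at 100%,
# the other seven cores mostly idle.
per_core = [100, 5, 5, 5, 5, 5, 5, 5]
overall = sum(per_core) / len(per_core)
print(f"{overall:.1f}%")  # 16.9% overall, yet the game is fully CPU bound
```

Task Manager would show this machine loafing along under 20%, even though the main thread has no headroom left at all.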

Sure. While soloing Hogger.

Hell hath no pretentiousness like a geeked out nerd determined to seem smart.