The Nvidia 30xx series’ main draw for the WoW gamer, aka you

So is it really worth it to spend money on these bad boys? What’s it going to bring to WoW that is so appealing to you?

Buy a $700 video card for WoW? Mmm, pass.

Say you want to run at 1440p ultra settings: is it going to bring much more to the table than its predecessors? Also, what about streaming?

That’s how much it’s going to cost up here for the 3070, and $900+ for the 3080. :face_vomiting:

I hear ya… So for WoW you shouldn’t get it; that’s what I’m getting from you both. But what about for some new titles and FPS games that you might mess around on from time to time? I’m mainly looking at trying 1440p on the highest settings. Btw I’m Canadian too :slight_smile:

3070/3080 are worth it for WoW if:

A) You want to play at 1440p or higher at high refresh on ultra settings.

or

B) You want to actually use ray tracing in WoW, because currently WoW’s RTRC brings even a 2080 Ti to its knees.

C) In other games, you want to play at 1440p/144Hz or 4K reliably.

AND

You already have a very strong CPU.

What would you recommend for a good budget CPU to handle the 3070’s feats?

The 3070 is supposed to be about equal to or slightly better than the 2080 Ti, with better RTX/DLSS performance.

That said, you could use the existing RTX 2080 Ti as the benchmark for CPU choices.

This guy does good CPU tests for WoW:

The Ryzen 3600 performs well and is only $200; that said, it is more expensive than it’s been over the past six months.

The i5-10600K, which would position itself slightly above where the 8700K is on this list, is $278 and performs better (but requires a cooler, so another $50-80).

Right on. I was already planning to get the 3600, and since I was planning on getting the 2070 Super, I think I’ll wait for the 3070 in October since Shadowlands doesn’t come out until the 27th!

It’s niche, at least for now, but the 3080/3090 are very promising when it comes to 60 FPS high-resolution (5K+) gaming.

My work machine has a 5120x2880 (5K) display, and it’d be badass to be able to play games on it with decent settings without scaling. I’ve tried games on these displays and the experience is great: pixels are so small that anti-aliasing is obsoleted, general detail is through the roof, and the distance you can see before things turn to a blur is considerably greater. The only downside is that you have to turn down settings to hold a solid 60 FPS.
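Just to put a rough number on how much heavier 5K is on the GPU, here’s a back-of-the-napkin pixel count (a quick illustrative Python snippet; GPU load only scales roughly with pixel count, so treat it as a ballpark, not a benchmark):

```python
# Rough pixel-count comparison between common resolutions.
# Raster/fill cost scales only roughly with pixel count, so this is a ballpark.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
}

base = 2560 * 1440  # 1440p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x vs 1440p)")

# 1440p:  3,686,400 pixels (1.00x vs 1440p)
# 4K:     8,294,400 pixels (2.25x vs 1440p)
# 5K:    14,745,600 pixels (4.00x vs 1440p)
```

So 5K is pushing four times the pixels of 1440p, which is why holding a solid 60 FPS there is a big ask even for a 3080/3090.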

High refresh rates are nice, don’t get me wrong, but they’re not super beneficial for the types of games I play. I’d rather have the resolution boost.

I’ll be getting the 3080 because at present, my 5700 XT struggles in many games to get a sustained high refresh at 1440p.

In WoW it’s no big deal, but even then, when particle effects are all over the place, it will sometimes be GPU bound and not go above 90 FPS or so at max settings.

In FF14, I use a lot of GShade options, and that really takes a toll: GPU limited at 1440p at around 80-90 FPS with my GShade on, 130+ with it off. I feel the 3080 will help me get high FPS in this game even with all my options enabled.

And then there’s stuff like ACOD… 90 FPS just isn’t enough.

I prefer SSAA+CMAA over MSAA. Alas, super-sampling is incredibly taxing.

RTX shadows seem kinda neat too, but we’ll probably need to wait another generation or two for the performance impact to not be too unreasonable.

I just use FXAA. It’s basically free from a performance-hit standpoint, and combined with the high PPI of a 24" display at 1440p, plus Radeon Image Sharpening, I don’t really feel the need for high levels of AA in this game.

Why use FXAA instead of CMAA? They both have a low performance impact, but FXAA is far too blurry for my liking.

I spent the last 20 minutes testing all the different AA settings and I can’t tell the difference in image clarity, although CMAA had worse edge smoothing, MSAA didn’t give adequate edge smoothing until around 4x, and 8x had a large performance hit.

I think Radeon Image Sharpening (a driver-based post-processing setting) lets me keep FXAA’s cheap approach to AA while taming down its over-blurring.

More on RIS:

https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

Our take is that there are two key use cases for Image Sharpening: the first is for games that are ‘soft’ to begin with. Lots of titles these days are using temporal anti-aliasing or TAA, and that can often lead to a blurry presentation. Radeon Image Sharpening (RIS) is a way to sharpen those games and get a crisper image.

To dive deeper into what CAS does, we’ll quote AMD directly: “because RIS is based on an algorithm that modulates the degree of sharpening depending on contrast, it clarifies interior object details while leaving high-contrast edges largely untouched.” They go on to say this prevents a number of artifacts you get with traditional sharpening.

For me, FXAA High + RIS at 100% gave more clarity than MSAAx2 and x4, and FXAA gave better edge smoothing than even MSAAx8, without a noticeable hit to clarity or performance.
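If anyone’s curious what “modulates the degree of sharpening depending on contrast” actually looks like, here’s a toy NumPy sketch of the idea. It’s loosely inspired by AMD’s description above, not their actual CAS/RIS shader code, and it assumes a single-channel float image in [0, 1]:

```python
import numpy as np

def contrast_adaptive_sharpen(img, strength=1.0):
    """Toy contrast-adaptive sharpening (NOT AMD's real CAS/RIS shader).

    Low-contrast interior detail gets more sharpening; pixels sitting on
    high-contrast edges get less, which matches the behaviour described
    in the quote above.
    """
    # Pad so every pixel has a full cross-shaped neighbourhood.
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]                   # centre
    n, s = p[:-2, 1:-1], p[2:, 1:-1]    # up / down
    w, e = p[1:-1, :-2], p[1:-1, 2:]    # left / right

    # Local contrast = span between the darkest and brightest tap.
    lo = np.minimum.reduce([c, n, s, w, e])
    hi = np.maximum.reduce([c, n, s, w, e])
    span = hi - lo

    # Lots of remaining headroom relative to the local span -> sharpen harder;
    # an edge that already spans most of 0..1 -> barely touch it.
    headroom = np.minimum(lo, 1.0 - hi)
    amount = strength * np.clip(headroom / (span + 1e-5), 0.0, 1.0)

    # Unsharp-mask style kernel, scaled per pixel by `amount`.
    sharpened = c + amount * (c - (n + s + w + e) / 4.0)
    return np.clip(sharpened, 0.0, 1.0)
```

Plain sharpening applies the same kernel everywhere, which is what causes halos around hard edges; the contrast-based weighting is the part that avoids those artifacts, as the AMD quote says.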

For WoW, any high-frequency 4c/8t CPU will do, like the Ryzen 3300X or an Intel 6700K or newer. If you want to play other games, then something like a Ryzen 3600 or newer, or an Intel 8700 (non-K) or newer.

As for your initial questions, to each their own. There are people playing WoW on an AMD 560 and happy as a clam. There are people playing WoW on an RTX 2080 Ti and miserable… and vice versa. For me personally, I will wait and see what the RTX 3060 (and all its variants) has to offer and its price point. I look at performance per dollar, and there is a point of diminishing returns in PC hardware performance.

The 3080 comes out in a little over a week. Makes it tempting. Although that’d be a $3k setup with everything, not including tax/shipping. :grimacing: It’d be a nice setup for Cyberpunk though.

Here in my country, ASUS just dropped the prices of ROG Strix 2060-2070s to around USD $494. This is in the middle of the RTX 3xxx series hype all over the local PC/hardware social media channels.

Somehow, that tells me there may be more good deals for even just the RTX 3060 or 3070 series sometime soon… With that in mind, I think I will stick with my 1660 Ti for a while, since I mostly just play WoW at 1080p anyway.

That’s probably the smart move. I have a 2070 and play at 1440p/144Hz, and while the 30xx cards sound tempting, since I only play WoW I’m probably better off waiting another generation.

This could change if WoW ends up adding more ray tracing support like reflections and some of the other advanced lighting features without an insane performance hit.

We shall see…

There’s also the possibility that AMD’s Big Navi is a stupidly good value when it launches, which is what happened when the 5700 XT came out. And in answer to THAT, Nvidia could drop prices on the existing 3000 series and/or launch Ti/Super variants.