Choosing 4070Ti or 4080 for Diablo 4 at 4K?

Hi all,

Just saw the deal where Diablo 4 comes free if you buy a 40-series card.

I am thinking of either the 4070 Ti (currently around $800) or the 4080 (currently cheapest at $1,190).

The other components are pretty standard: i7-13700K, Asus Z790-P PRIME, G.Skill PC5-4800 16GB x2 (to qualify for the combo deal), Samsung 980 Pro 2TB M.2 NVMe. My monitor is a KTC M27P20P (4K @ 165Hz).

Would appreciate any advice!

Thank you,

Have you ever built a PC before? Because if you had, I don't see how a sane person would ever consider a 4070, especially for 4K.

I'm asking about the 4070 Ti or the 4080. What's wrong with that?

I would go for the 4080. You pay a little more, but it's worth it.

How often do you expect to upgrade video card?

The 4080 will likely stay relevant for a while longer due to its higher VRAM. It certainly seems like the more obvious choice for 4K.

The 4070 (Ti) isn't a 4K card, because it only has 12GB of VRAM. Some will say "but DLSS 3" or whatever, but that defeats the purpose of 4K, so the 4070 Ti is out of the question. The real question is whether the 4080 is worth it. I would say no; I'd suggest you do some digging for a cheap 7900 XTX, buy D4 separately, and see if that works out cheaper (they are pretty close cards outside of RT performance). If not, then the answer is the 4080.

Looking at the cost-benefit ratio, basically the only reason to go with Nvidia would be ray tracing.
And for ray tracing at 4K at a somewhat playable framerate, you will probably want the 4090.

Otherwise simply pick the best AMD GPU you can get with your budget :woman_shrugging:

The 4070s in general are a joke. Not worth it.

If you NEED to upgrade get the 4080.

If you can wait, wait for the 5xxx series, where the performance increase should be worth the price. (Google the leaked info; a little research is good for people.)

As someone who's built a ton of computers, done custom loops, etc., here's the best advice you'll get:

Buy the best one you can afford, and buy the most popular version of it. Reason being: the popular version is going to get more support, updates, and fixes than some niche one will, and you'll get more life and performance out of the best card you can afford.

Between a 4070 Ti and a 4080? First off, forget that the 4070 is even a thing. Nobody buys an xx70 card for high-end gaming, period. Look around a site like Newegg for the 4080, find the ones with the most reviews, and pick from those.

The 4070 Standard is more than enough for 4K. The 4070 Ti is overkill by far, especially for Diablo. The 4080 just makes you even more future-proof.

I own a 4070 Standard running an LG C2 43" gaming TV at 4K/120Hz, with every game I play maxed out, and I'm constantly over 100 FPS.

Do not pay any mind to anyone who claims the 4070 is a waste of money. They're not aware of the technical aspects of what they're talking about. As I mentioned, I own a 4070 and I never drop below 100 FPS, with my average in the ballpark of 145 FPS.

For $599 (provided the sale is still going on), the 4070 Standard is a hell of a steal. Granted, if you can afford the extra $300, plus potentially ~$150 for a PSU capable of 750 watts, the 4070 Ti is absolutely a must. The 4080 and 4090 are way too overpriced to even begin considering.

Papa Jensen couldn't agree more, LuL. He sends his thanks for the new jacket.

Claims Nvidia’s 40 series are bad for 4K

Gets called out.

Deflects with the “lolcorporateshill” card.

You’re a class act fam. Thanks for your input.

The problem is that the RTX 4080 is 3/4 of the price of an RTX 4090, which makes the 4090 the better value. I would rather pay $2k for a 4090 than $1.5k for a 4080. Dunno how prices are in your country.

Called out by whom? You? Sorry bro, you're just a clown. The 4070 has THE SAME CUDA core count as the 3070 (5888), a two-year-old GPU WHICH IS NOT A 4K CARD. The 4070 is an inferior 3080 in a pricier bundle; the only advantage is the extra 2GB.
Imagine saying 12GB is value for 4K.
The ENTRY point for 4K is 16GB.
OK? If you don't know, please read up.

Congratulations, you can read tech specs.

What??? Please use proper English when you post.

Uh…
I’m not a fan of posting benchmarks as they are typically biased, but…
https://i.imgur.com/wAcRRzG.png
As you were saying…?

You must be one of "those" people who pretend to be armchair computer technicians with a ton of spare time to misguide people. It's the only explanation I can come up with for the way you've been talking here. I'm hesitant to look up your post history.

You mean the 4K resolution that's been around since the GTX V came out? The 4K that the 30 series handles just fine?
Oh, I'm sorry if you're not getting 200 FPS of headroom at all times, but if a card is capable of pulling 120-150 FPS (which the 4070 Standard very much is, at max settings in virtually all games), it is 4K ready.

Please, just stop.

It loses almost all the versus matchups. If you're fine with your GPU, fine by me, but if you think the 4070 is a good card, you're plain wrong and naive.

Your infinite knowledge on this is far too much for me to handle, oh armchair lord. I'm just going to leave this with you so you have something to sit on and stew over:

Depending on your hardware, your results will vary.
The 3080 and the 4070 perform almost identically, with the 4070 pulling ahead in various scenarios, but only by 3-5%.

But hey, you go right on ahead and buy a 4070 Ti for $300-$400 more, plus $150 for a new PSU to handle it. Or $600 more for a 4080, or $1,000 more for a 4090.

(ALL for extra frames you won't use this year, or maybe even next year, I might add. It's all future-proofing so they have a use when newer, higher-end games come out.)

Or better yet, buy an AMD GPU and be saddled with horrid driver support, because that's the name of the game with AMD. Sure, you'll pay less, but at the cost of never having stability?

I was tempted to go even further: breaking down the technical specs, comparing each GPU on dollars per unit of performance gained, and looking at how each type of core on these GPUs behaves and influences real-world performance. But you're pretty intent on playing the armchair technician role.
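For anyone who actually wants that dollars-per-performance breakdown, it's just price divided by average FPS in the games you care about. Here's a minimal sketch; the prices and FPS numbers below are illustrative placeholders, not real benchmarks, so plug in current prices and measured 4K averages before drawing conclusions:

```python
# Rough dollar-per-frame comparison at 4K.
# All prices and FPS figures are ASSUMED placeholders, not benchmark data.
gpus = {
    "RTX 4070":    {"price": 599,  "avg_fps_4k": 60},
    "RTX 4070 Ti": {"price": 800,  "avg_fps_4k": 70},
    "RTX 4080":    {"price": 1190, "avg_fps_4k": 90},
}

# Sort from best to worst value (fewest dollars per average frame).
ranked = sorted(gpus.items(), key=lambda kv: kv[1]["price"] / kv[1]["avg_fps_4k"])

for name, g in ranked:
    dollars_per_frame = g["price"] / g["avg_fps_4k"]
    print(f"{name}: ${dollars_per_frame:.2f} per average 4K frame")
```

With these placeholder numbers the cheaper card wins on pure value, which is exactly why "is the extra headroom worth the premium?" is a budget question, not a spec-sheet one.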

I’m done famskies. I’ll make sure to just disregard anything you have to say from now on, so you can do your job misleading people.
:two_hearts:

If you're only aiming for 4K60 on Ultra, the 4070 Ti will work. However, its gimped 192-bit memory bus can be problematic in some scenarios when you're trying to match your native refresh rate at 4K. If you plan on keeping your GPU for a long time, see below.

For futureproofing and if you can afford it, the 4080 is the better choice. Be aware that you are paying a premium, and for the love of all that’s good, don’t get the FE version of the card. Aside from a noisy blower and relying on just a single fan as a point of failure, the AIB versions give you better thermal headroom and overall support (except Gigabyte - avoid them at all costs).

Something you do want to take into consideration is whether or not you intend to upgrade GPUs anytime within the next 2-3 years. The reason being that if you do intend to do so, the 4070 Ti will serve you better and save you a considerable amount vs. the 4080. However, if you intend to keep the GPU for a decent amount of time, the 4080 will serve you better. While the 4080 and 4090 are both bottlenecked by even the flagship CPUs (13900K/7950X) in many games, future CPU architectures will likely alleviate that issue, and having a card that could hit your native refresh rate at 4K on Ultra is a tempting affair.

Bottom line is your budget and how long you intend to keep the GPU. As with everything else in a computer build, balance is key and brute force doesn’t always win in the end. Factor in your personal use case scenarios and go from there.

Note: If you get a 4000 series GPU, I strongly recommend upgrading to an ATX 3.0 PSU with full PCIe 5.0 certification. If you use any 4000 series GPU with the 12VHPWR adapter on an ATX 2.x PSU, you will incur significantly higher ripple and thermals on the GPU's VRMs, shortening their lifespan and driving up power usage. Native 12VHPWR avoids that and gives you cleaner power on a pin-to-pin basis, so the power goes where it needs to, not where it can flow easiest (the adapter has the cables soldered to a bus bar and then redistributed with no control over where the power flows). Also, due to the power spikes the 4080 can generate, a 1000W PSU is recommended to avoid any chance of spontaneous system shutdown from tripping OCP.

It’s that bad? :smiley: Damn, Gigabyte has the cheapest 4080 where I live currently.

Gigabyte and MSI are rock bottom.