Anybody playing the upcoming Beta with the new RTX 4070?

I just splurged, since my video card is dated, and figured ‘If I’m updating, I’m going big’. Got a 4070 Ti.

Can’t wait to get a new power supply next month and get it all installed.

1 Like

I am running an Nvidia GTX 1050 and can only upgrade up to a GTX 1080. Unless I find enough money for a new PC, I won’t be playing D4 on an RTX card anytime soon. I did, however, just max out my RAM to 32GB DDR3 from 16GB DDR2, so that is something.

1 Like

The 4070 has a PCIe 4.0 bus interface. It’ll work in a PCIe 3.0 slot; it will just be limited somewhat, since PCIe 4.0 allows for 2x the bandwidth of 3.0. I’m in that situation currently with my 3070 Ti and an older motherboard, but I’m getting everything upgraded so I can hopefully hold off on a new GPU for another generation or two.
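If you want rough numbers on that, here’s a quick back-of-the-envelope sketch (assuming x16 slots and the standard per-lane transfer rates; raw bus bandwidth rarely translates directly into FPS, so the real-world gap in games is much smaller):

```python
# Rough PCIe bandwidth comparison for an x16 slot.
# PCIe 3.0: 8 GT/s per lane, PCIe 4.0: 16 GT/s per lane, both 128b/130b encoded.
def x16_bandwidth_gb_per_s(gigatransfers_per_s):
    lanes = 16
    encoding_efficiency = 128 / 130  # 128b/130b line coding
    bits_per_byte = 8
    return gigatransfers_per_s * encoding_efficiency * lanes / bits_per_byte

print(f"PCIe 3.0 x16: ~{x16_bandwidth_gb_per_s(8):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: ~{x16_bandwidth_gb_per_s(16):.1f} GB/s")  # ~31.5 GB/s
```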

This site will let you play around with different setups so that you can learn how different combinations of cards/cpu/resolutions affect each other when it comes to bottlenecking:

https://pc-builds.com/bottleneck-calculator
1 Like

I just refunded my 4070 because, as a friend pointed out, they tend to draw a ton of power and melt their connectors.

This isn’t correct.

They melt because users aren’t inserting them properly. The new connector is oddly difficult to get to fully snap/lock in, and if it isn’t seated properly and starts to come loose, it can melt due to poor contact at the pins. But it’s 100% user error. This happened to some people when the 4090s released; Nvidia investigated, as did Gamers Nexus, who did a very thorough investigation and determined it was user error.

I have a 4090, and if I OC and crank settings, I can send up to 520 W, far more than a 4070’s power draw, down the same power cable and connector and not have a single issue.

Sorry to say it, but your friend is misinformed.

Hell, I just went from a 1660 Super to a 3060 that I got today. Give me 3 more years :laughing:

2 Likes

Fair, but I did not want to risk it.
Plus, I was getting a 50% bottleneck at the CPU, and my mobo can’t take anything stronger than the 11th-gen Intel I have.
So I switched to an RTX 3080.

I don’t know if it has all that Raytracing magical goodness, but it’s still stronger than my poor GTX 1070.

Also, there is a Digital Trends story from back in January, so he’s not misinformed.
I can’t link it, as I am not allowed to include links in my posts. -.-
Look for ‘RTX 4090 connectors are melting again, and this time there’s a major change’.

It’s not a widespread problem, so it’s nothing to be concerned about imo. Articles just want clicks and views really.

But regardless, the 3080 is still a good choice; you’ll love it for D4. Even with RT you should get a good framerate, and worst case you can enable DLSS for a performance bump if you need it.

Ohh, it has DLSS? I was just trying to upgrade, heard the melting news, did not want to risk it (I can’t afford to replace my card if it melts after buying it in the first place), and the 3080 was suggested to me.

I’m not tech savvy; the last time I remember hearing about DLSS, some YouTuber was paid to promote it and showed the before/after in Minecraft by toggling a DLSS switch.
It looked like an entirely different game. :grin:

At least, I think it was DLSS? It was a few years ago.

1 Like

An RTX 3080 can run D4 at max settings in 4K without DLSS, no problem.

You won’t need DLSS 2 until they implement ray tracing. DLSS was basically created to compensate for the performance hit that ray tracing brings, but they also market it as a general performance increase even without RT.

Apparently the 30 series has ray tracing, too, if D4 implements it.
I just assumed it was a 40 series thing.

I’m also not going in 4K, I don’t think. My monitors are 1080p.

1 Like

Yup, ray tracing and DLSS were introduced with the 2000 series in 2018. Both technologies have come quite a long way since then. DLSS is quite hard to distinguish from native resolution nowadays. There are some games where it’s more noticeable in fine textures like hair and fur, but for the most part, when you’re actually playing the game and not pixel peeping, you can’t tell DLSS is enabled. It does depend on the DLSS setting and what your output/native resolution is, though. If your native resolution is 1080p and you use the “lowest” quality DLSS setting, it will be rendering the game internally at something like 540p, and at that point it will often look noticeably worse.
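For reference, here’s a rough sketch of what the different DLSS quality modes render at internally. The scale factors below are the commonly cited ones for DLSS 2, so treat the exact numbers as approximate:

```python
# Approximate DLSS 2 per-axis render-scale factors (commonly cited values).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output_w, output_h, mode):
    scale = DLSS_SCALE[mode]
    return round(output_w * scale), round(output_h * scale)

# At a 1080p output, Performance mode renders internally at roughly 960x540.
for mode in DLSS_SCALE:
    w, h = internal_resolution(1920, 1080, mode)
    print(f"{mode:>17}: {w}x{h}")
```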

Thanks for that link! Will use that in conjunction with this other research!

The way I see it, I’ll be spending most of my time playing Diablo 4 at launch and for months beyond it. Ray tracing isn’t going to be in the game during the time that I discover the game and fall in love with all the details the developers have created. I won’t truly know what the CPU/GPU benchmarks will look like with RT, so I am gonna forget about that.

Now, my criteria are playing the game on ultra settings at 1440p with at least 100 FPS. So far, I think we only saw benchmarks for high settings in the last build, which was unoptimized, so I’m taking those benchmarks with a grain of salt. This next beta, I believe, is build 1.0.0, so I can truly see how different CPUs and GPUs perform. Even then, once I find the most cost-effective CPU and GPU that can push those settings, I will probably get something a step above just to build in some contingency. We’ll likely not be seeing the most graphically demanding areas in Act I.

Now, I can use all I’ve learned from this thread to decide how to proceed. I’m quite sure my current CPU will bottleneck performance, which means I am looking at a new CPU, a new motherboard, and, depending on the motherboard, new RAM sticks. My friend told me that Microcenter has good bundle deals, so I’ll probably be monitoring those right after the next beta. In addition, I will definitely need a new GPU; which one I get is still up in the air at this point. I’ll definitely check to make sure my 650W power supply will be adequate.
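On the power supply question, here’s a rough way to sanity-check the wattage. It’s just a sketch with placeholder figures; substitute the actual TDP/board-power numbers of whatever CPU and GPU you end up picking:

```python
# Rough PSU headroom check with placeholder part figures.
# Replace cpu_tdp_w / gpu_tbp_w with the real specs of the parts you choose.
psu_capacity_w = 650
cpu_tdp_w = 125          # e.g. a typical high-end desktop CPU under load
gpu_tbp_w = 320          # e.g. roughly an RTX 3080's total board power
rest_of_system_w = 100   # motherboard, RAM, drives, fans (ballpark)

estimated_load = cpu_tdp_w + gpu_tbp_w + rest_of_system_w
headroom = psu_capacity_w - estimated_load
print(f"Estimated load: {estimated_load} W, headroom: {headroom} W")

# Common rule of thumb: keep sustained load at or under ~80% of the PSU rating
# to leave room for transient spikes.
if estimated_load <= 0.8 * psu_capacity_w:
    print("Probably fine, but transient spikes can still exceed this estimate.")
else:
    print("Cutting it close; consider a bigger PSU.")
```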

My current monitor only has G-Sync technology. Can I get an AMD GPU and make it work without worrying about screen tearing? I’d like to keep my options open here as there are things I like about the AMD 6000 series and the NVIDIA 3000 and 4000 series.

I’ll have to figure out how to change my DLSS setting, once I have everything installed next month.
Man, the last time I installed a video card, it was literally ‘Push it in until you hear a click’.
I did not have to take photos of wires, or worry about overheating, or all this other stuff. :grin:

Granted, the last time I installed a video card was in 2009. I bought it from NewEgg, and it came dead. But the seller insisted it was user error and refused a replacement, claiming I fried the card.
And I knew too little about cards to dispute it. Thankfully, I only spent a whopping $70 on the card. But I never bought from NewEgg again.

1 Like

The 4060 Ti is supposedly going to be on par with a 3070 for $399.

You guys have also inspired me. I’m gonna start a thread when the next beta starts, asking people to post their specs and performance. Aside from YouTube videos, if we can get a community thread going for this, it can be a good reference for players like myself to identify minimum hardware specs.

1 Like