RAM - 32GB or 64GB?

I guess the bigger question is, can you?

My Windows 10 is using 4GB including background applications (Steam, Bnet, Discord, etc.).

Browser 2GB as you mentioned.

So if people are lacking RAM they can close the browser.
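To put rough numbers on that memory budget, here's a quick back-of-the-envelope sketch. The figures are just the ballpark estimates from this thread, not measurements:

```python
# Rough RAM headroom estimate using the figures mentioned above.
# All numbers are ballpark assumptions, not measurements.
total_ram_gb = 16
windows_and_background_gb = 4   # OS + Steam, Bnet, Discord, etc.
browser_gb = 2

game_budget_gb = total_ram_gb - windows_and_background_gb - browser_gb
print(f"Left for the game: {game_budget_gb} GB")                  # 10 GB
print(f"With browser closed: {game_budget_gb + browser_gb} GB")   # 12 GB
```

Closing the browser buys back a couple of gigabytes, which is why 16GB usually works out fine.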

I spent $150 on a second small i5 Gen 6 laptop build for second-screen use, pulling up information and stuff like that. Works great. Runs other older games too, which is wild, since it's Gen 6 CPU graphics only, lol.

16GB is enough. 32GB is already overkill.

Depends on usage, but yeah, 16GB is enough to game with.

Just picked up a nice 3070 Ti on eBay for $383; the seller accepted my offer (they were asking $450), and many listings want $500 or more.

Not bad. No way in hell am I plunking down a grand-plus for a video card. These video card companies need to be taught a lesson: you can't keep gouging your customers. And scalpers don't help the situation either.

I mean, look at how fast they dropped the price of the newly released 4060 Ti: four hours after launch it was already $40 off MSRP. And AMD took only a day to do the same with their new cards.

Anyway: 16GB is good, 32GB is optimal, and more than 32GB is definitely overkill.


Sadly, that's the low end of their lineup. After seeing so many YT vids claiming that gfx card prices were collapsing, a quick look at actual prices reveals otherwise.

During the shortage, scalpers showed nVidia how much people were willing to pay to satisfy their gaming addictions, so now they're pricing accordingly. A 4080 is still about $1200 on the lowish end; the present price of the 4070 ($899) is what an 80-class card would have gone for prior to the shortage.

You can't get the 4K textures with 16 gigs of RAM; it wouldn't even let you press "high detailed textures" :slight_smile:

Well, there's good reason to think that the 50 series will launch at a more consumer-friendly price.

Nvidia is potentially looking at losing market share if they keep the 50 series overpriced, and AMD will only further close the gap with RTX cards in 4K and RT performance. The 40 series isn't popular at all and has sold horribly, and a lot more people with older cards will be looking to upgrade by the time the 50 series comes out.

If it comes out. Not sure Nvidia actually cares enough about graphics cards in general with their focus shifting to AI.

Because they've invested billions into R&D, hold numerous patents, and have a near-monopoly on the consumer gaming GPU market?

It’s safe to say they aren’t giving up their consumer GPU market anytime soon. They’re going to keep focusing on all markets from gaming GPUs, to workstations, to supercomputers.

I see it changing constantly. There was a point where pricing did drop across the board for many cards, but then it surged back up once everyone got to see the actual performance versus cost of the newer cards.

I had bought a 2060 a while back, new, for just over $300, and ended up selling it for less than $200. The price dropped that much after the 40 series launched.

I would probably get more for it now than I did then, so pricing is all over the place. On eBay, though, you take some chances. The card I just bought is used, but supposedly in like-new condition; that can mean a lot of different things, but still. Considering the average used price for a 3070 Ti was between $500 and $700, getting one for under $400 was a good deal, I think.

Probably not, but who knows. They are still overpricing them, though.

Resolution plays a part in the usage, but I was giving a middle ground, since 720p, 1080p, 1440p, and 2160p (4K) have different system RAM requirements. It was more of a general guide to help people figure out how much RAM they need. Most will be fine with 16GB playing at 1080p to 1440p on lowish settings. At upper 1440p and higher, an argument for 32GB can be made, especially with Ray Tracing enabled. VRAM gets taxed harder, but RAM still gets used.
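That general guide can be sketched as a simple lookup. The thresholds below are just the rule-of-thumb figures from this thread, not benchmarks:

```python
# Hypothetical rule-of-thumb lookup based on the guidance above.
# These are ballpark recommendations, not measured requirements.
RAM_GUIDE_GB = {
    "720p": 16,
    "1080p": 16,
    "1440p": 16,   # lowish settings
    "2160p": 32,   # 4K
}

def suggested_ram(resolution: str, ray_tracing: bool = False) -> int:
    base = RAM_GUIDE_GB[resolution]
    # RT taxes VRAM hardest, but system RAM usage climbs too,
    # so bump high resolutions to 32GB when RT is enabled.
    if ray_tracing and resolution in ("1440p", "2160p"):
        return max(base, 32)
    return base

print(suggested_ram("1080p"))                    # 16
print(suggested_ram("1440p", ray_tracing=True))  # 32
```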


One of the very few situations where 64 GB will be an improvement over 32 GB is when a game eats so much VRAM that it spills into system RAM via Resizable BAR (usable with Intel 9th Gen and later or AMD Zen 3 and later CPUs) and the game client itself uses enough to encroach on your RAM ceiling. That shouldn't happen with D4 as long as they fixed the excessive VRAM usage when using High Textures settings.
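As a toy illustration of that spillover scenario (every figure here is hypothetical, chosen only to show the arithmetic):

```python
# Toy illustration of VRAM spillover eating into system RAM.
# All numbers are hypothetical, picked only to show the arithmetic.
vram_gb = 12               # the card's onboard VRAM
game_vram_demand_gb = 15   # e.g. an unoptimized High Textures setting
spill_gb = max(0, game_vram_demand_gb - vram_gb)  # lands in system RAM

game_client_gb = 16        # hypothetical heavy game client
os_and_background_gb = 6   # OS plus background apps

total_system_ram_needed = game_client_gb + os_and_background_gb + spill_gb
print(f"System RAM needed: {total_system_ram_needed} GB")  # 25 GB
```

With these made-up numbers the total still fits inside 32 GB; it takes an unusually heavy client plus a large spill before 64 GB buys you anything.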

They'll need to, since the AI bubble will burst just like the dotcom bubble burst and brought Cisco crashing down. In fact, nVidia's valuation almost exactly matches the trajectory Cisco traced shortly before its stock lost 80% of its value, and other startups took up the slack by offering equal or sometimes even better products than Cisco at lower cost.

nVidia's in trouble if Intel's Arc series keeps getting better at its current pace, as Intel has a lot of room left in driver optimizations to level the playing field, and AMD is closing the gap significantly with virtually equal rasterization performance at the same price points/tiers. The only things nVidia has going for it right now are DLSS 3 and its RT prowess. That won't last forever, and FSR 2.0 is a rather large step up from FSR 1.0.

You will likely hit a VRAM or CPU bottleneck before a RAM bottleneck at just 1440p in this game.

Blizzard at least optimizes its games decently.

Me too, unfortunately. I have a bad skeleton with crumbling bones; don't worry, I live with it well.

It's for the days I can't get out of bed.

Haven't read what others wrote, but I'd assume they too agree that 32GB is good enough, especially for Diablo and especially on a laptop, as you'd run into other bottlenecks before capping your RAM. So I'm just here to further support the idea that 32 is enough. Peace!


Well, that remains to be seen with Diablo IV. Hopefully the launch version is better optimized than the previous beta/Server Slam builds were.

The 4080 is finally starting to come down in price; it seems like it can frequently be had for about $1000-1050 now, which is what I paid a few days ago myself. Of course, that's still an absolutely absurd price.

More reason to stay overpriced, sadly. Lots of people sat out one generation, but will they be able to do it a second time?
AMD just seems to be repeating Nvidia's overpricing, so at least initially, them catching up might not improve things much. Though presumably, if they stay competitive, price wars will happen at some point.

For the low to midrange GPUs it isn’t AMD that’s going to drive nVidia’s prices down, it’s Intel. Their gains have been very, very impressive with the Arc series and those drivers still have a ton of room for improvement whereas AMD’s drivers have matured and their only push against nVidia is if they finally decide to hit the Big Red Button™ and jack up the power to reach nVidia’s rasterization levels at the high end.

For the 4000 series, the higher-end cards won't come down in price until they sit on shelves long enough. That is not true of the 4060 Ti, though. It's such a grossly overpriced card, especially since its current edition has only 8 GB VRAM, that some retailers have proactively slashed pricing on it, even in places like Germany where this never happens. The 4080 is coming down first because of two things: exceptionally slow sales, and the very insulting "4080" 12 GB launch that was cancelled and rebranded as the 4070 Ti, which should have been a 60-class card due to its cut memory bandwidth (and the 4060 Ti is really a 50-class card at best, which is why its price is so absurd).

Another note on the 4060 Ti: it runs with an x8 interface. It is not an x16 card. As such, coupled with its obscenely narrow 128-bit memory bus, it's a card to be avoided. A 3060 Ti is going to be a better value unless you're dead set on having DLSS 3.0, because that is literally the only feature that has any chance of propping up this GPU.
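The bus-width point is easy to quantify: peak memory bandwidth is roughly (bus width in bits / 8) times the effective data rate. The specs below are the commonly cited ones for these cards; treat them as assumptions:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# Card specs below are the commonly cited values, taken as assumptions.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps  # result in GB/s

print(bandwidth_gb_s(128, 18))  # 4060 Ti (128-bit, 18 Gbps): 288.0 GB/s
print(bandwidth_gb_s(256, 14))  # 3060 Ti (256-bit GDDR6, 14 Gbps): 448.0 GB/s
```

Despite being a generation newer, the 4060 Ti's narrower bus leaves it with substantially less raw memory bandwidth than the 3060 Ti.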

Yeah that sounds much more likely. Really hope Intel succeeds with their Arc cards. Imagine having 3 competing companies for a piece of computer tech. Almost unheard of!

Seems like the 4060 Ti is only slightly below MSRP where I live, at $385 for the cheapest.