Nvidia trying to be stingy with VRAM…
Let’s hope more games will tell them to stop it.
They’ve improved the amount of VRAM on their lower-end cards. The 4070ti, for example, has 12GB, 50% more than the old 3070ti.
That said, it costs 50% more too.
Yup, no increase of price/performance this round.
Buyers are voting with their wallets. Hopefully we get price drops.
This is another case of where I am suspicious of the TPU testing. They cited VRAM usage…how did they determine that? By using a card with 16GB of VRAM?
Because that isn’t necessarily how VRAM usage works. The GPU may let a lot of stuff sit in the buffer even when it’s not needed, purely for increased performance, even if it’s flagged to be reused for other textures.
When the VRAM limit is actually reached, you can tell by performance taking a dip as the game has to switch to main system memory or the page file.
The performance story contradicts the VRAM usage graph. If Ultra actually needed that memory, you would see performance fall off harder relative to its class as you increase resolution. But it seems to scale pretty consistently regardless.
Yup, just get an app to check VRAM usage, like MSI Afterburner.
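If you’d rather script it than use Afterburner, on Nvidia cards you can pull the same numbers from `nvidia-smi`. A minimal sketch, assuming `nvidia-smi` is on your PATH (the `parse_vram` helper is my own, not part of any tool):

```python
import subprocess

def parse_vram(line):
    """Parse one 'used, total' row (MiB) from nvidia-smi CSV output."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total, 100.0 * used / total

def query_vram():
    """Ask nvidia-smi for per-GPU memory usage (requires an Nvidia driver)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_vram(row) for row in out.strip().splitlines()]

if __name__ == "__main__":
    try:
        for i, (used, total, pct) in enumerate(query_vram()):
            print(f"GPU {i}: {used} / {total} MiB ({pct:.0f}%)")
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("nvidia-smi not available on this machine")
```

Keep in mind this reports *allocated* VRAM, which is exactly the number the thread is arguing doesn’t equal *needed* VRAM.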
Exactly and that behavior would manifest on a 16GB or higher GPU because…it can.
GPUs with less memory will not show the same usage, because the driver manages memory as needed in real time, and any real shortfall would show up in the performance.
The 3080 10GB and 6800XT in raster are basically equal here, and as you increase resolution they continue to perform the same despite the 3080 having a 6GB memory deficit. If VRAM were truly a limitation, the equally performing card with less VRAM would likely suffer as you go up in resolution.
I’m not saying more VRAM is bad or unnecessary; it can matter in some cases, just not here. And TPU listing VRAM usage is useless information that does more to confuse than educate.
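The scaling argument above can be sanity-checked numerically. With made-up FPS figures (these are illustrative, not TPU’s actual data): if a card were VRAM-limited, its FPS ratio versus an equal peer would collapse at higher resolutions; a roughly constant ratio means no memory wall.

```python
# Hypothetical FPS figures for illustration only, not TPU's measurements.
fps = {
    "3080 10GB": {"1080p": 120, "1440p": 90, "4K": 52},
    "6800 XT 16GB": {"1080p": 122, "1440p": 91, "4K": 53},
}

def relative_ratio(card_a, card_b, res):
    """Ratio of card A's FPS to card B's at a given resolution."""
    return fps[card_a][res] / fps[card_b][res]

for res in ("1080p", "1440p", "4K"):
    r = relative_ratio("3080 10GB", "6800 XT 16GB", res)
    # A roughly constant ratio across resolutions suggests the smaller
    # VRAM pool is not the bottleneck; a sharp drop at 4K would.
    print(f"{res}: {r:.2f}")
```

With these numbers the ratio stays within about 1% across all three resolutions, which is the "scales pretty consistently" pattern described above.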
With consoles rocking 16GB, I would advise more VRAM for 2023 rigs.
It’s gonna be a rough ride for them. Right now, the only Nvidia options are the older 3080ti to 3090 cards, all at $800-900 market price, or the 4070ti at over $800. Then of course the upper end. I wouldn’t advise getting a 3060 right now at all; it falls short in plain raster even with its 12gb, and that won’t save it from simply being too slow in the future. It can’t reliably do ray tracing either, so there are no real reasons to choose it.
The 6700xt is a good option, especially at the price point. It compares favorably to the 3070, and costs less than a 3060. It can’t really do ray tracing either, but hell, for $359 you can’t ask for much imo. There are too few 6800/XT/6900/6950XT below $650; anything more and you are better off spending a little extra for the 7900XT at $879…and at that point you are again wondering if it’s better to spend the money on a $1199 7900XTX or 4080 16GB.
So by saying, “get 12GB in 2023”, you’re mostly saying to spend $900 on a GPU.
6700xt makes sense. The 3080 12gb/ti through 3090/ti are too expensive for old hardware, and you’re better off buying a 4070ti+ or 7900xt+.
6700 XT is fine. As long as you don’t touch RT.
I turn it off most of the time with my 3080.
Been very satisfied with the Radeon 6600 I got last year for $249. Performs great at 1080p, doesn’t need more than 8gb of VRAM since it will only ever be for 1080p, and also is too slow for Ray Tracing anyway.
And as far as consoles go, isn’t it 16gb shared memory? Their systems are all APUs right?
Yeah, 16GB is shared. Still, GDDR though.
Uses and needs are two different things; just because a game can use 12GB of video RAM does not mean it needs it. See Sal’s charts.
two completely different things, it’s apples and oranges. A lot of people make this mistake about console tech and PC tech
it’s a silly mistake made by people who don’t understand tech, similar to “you need X amount of cores to game.” GPUs (and CPUs) are a whole and are judged on the performance of that whole unit. If the GPU chip can’t utilize the video RAM, then adding more RAM does not make the card any faster. Get the GPU that gives you the performance you want in the games you play, regardless of one specific part of it.
Games utilizing more VRAM will increase. Especially at 2560x1440 and higher.
definitely, but better performing cards will still perform better in those games than cards with worse performance but more RAM. The RTX 3060ti is a better card than the RTX 3060 (and I have an RTX 3060 in one of my PCs). Always has been, always will be. Will there be a game five years from now where the RTX 3060ti gives you 22 FPS while the RTX 3060 12GB gives you 23 FPS because of RAM? Sure, and you can take a victory lap about owning a card that gives you one more FPS in a game you waited five years to play, on a card the rest of the world has moved on from years prior.
buying a 3070 is a mistake. The 4000 series is already out and has massive improvements over the 3000 series. Just saying; not sure if you are getting a good deal or something with that card… Good luck!
Yeah, and just like people on 2060s and 1660s with only 6GB of VRAM in those scenarios, they’ll need to reduce textures and resolution and lean on upscaling anyway, because regardless of VRAM the cards are too slow. As in your example, nobody is going to settle for 22 or 23 FPS. They’ll drop settings until they can at least get a locked 30fps experience, at which point texture quality is already turned down and the VRAM issue is solved. Unless the GPUs have an unreasonable amount of memory, or some strange memory configuration (DDR4 1030s lookin at u), 8gb is fine for most cards playing at 1440p and below. And as always, you can reduce texture quality.
And along with massive improvements come massive increases in price. Right now, the minimum buy in for 4000 series cards is over $800. For most people, that’s not a realistic entry point. AMD currently controls the sub $400 market HANDILY. There is virtually no competition to the 6700XT for price-performance. And the vanilla 6600 when on sale dominates the budget segment, which is high performance 1080p gaming at around $249 ($200 on sale, and often with promo game codes). The 3050 is a joke. It costs more than the 6600 and performs much worse.
I’ve seen this down to the low $400s. I know the card gets killed by ray tracing in certain games, but how does it do in WoW? I’ve yet to see any tests, but I haven’t really been looking for them.
Here you go, $369 all day every day. There’s several under $400 regularly across different retailers…and yeah in Ray Tracing these cards aren’t great. But you aren’t getting an Nvidia card in that price segment that is competent at it either.