This game has almost melted my GPU twice!

That might be a maintenance issue and not an optimization issue.

Especially since it is a laptop. A little dust could be blocking airflow, leading to the video card overheating.

Personally I had no overheating problem (the card sat at 40-50 C the entire playtime), although there was a lot more hot air coming out of the case than I thought the graphics would warrant, given the game has no RTX effects. If this were summer and the case couldn’t dump the hot air into a cool room, I’m sure my temps would have been higher.

I have a 1080 and everything runs just fine lol.

I do notice that at times my GPU sounds like it is going nuts (fans). Even if I’m just standing around in the main town - usually if there are weather effects.

Then if I just open the map, the fans ramp down immediately.

GPU didn’t die, no crashes (of the PC/Windows); the beta itself did crash or disconnect on its own a few times, but not related to the GPU as far as I could tell.

Just my observations.

You most likely have a faulty GPU. A game should never ever bring your GPU to a point of thermal shutdown.

People blamed New World for a problem that was entirely on NVIDIA/manufacturers. It’s not game related.

It’s not game related, and you shouldn’t be posting on these forums. You should be contacting NVIDIA and trying to determine if you have a faulty card or not.

I think you are close there.

There are fewer than a dozen TVs that I can find that support 3840x2160 @ 120 Hz, and they are pretty much all 2022 models.
For 4096x2160 I couldn’t find any at all.
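
Rough napkin math (my own numbers for the link rates, not anything from this thread) on why 4K @ 120 Hz only shows up on the newest sets: the raw pixel stream alone already blows past an HDMI 2.0 link, so a TV basically needs HDMI 2.1 to do it.

```python
# Back-of-the-envelope bandwidth check for 3840x2160 @ 120 Hz.
# Raw pixel data only, 8 bits per colour channel, no blanking/audio overhead,
# so the real requirement is even higher than this.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 120, 24

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Raw pixel data: {raw_gbps:.1f} Gbit/s")     # ~23.9 Gbit/s

HDMI_2_0_GBPS = 18.0   # published total link rate for HDMI 2.0
HDMI_2_1_GBPS = 48.0   # published total link rate for HDMI 2.1
print("Fits HDMI 2.0:", raw_gbps < HDMI_2_0_GBPS)   # False
print("Fits HDMI 2.1:", raw_gbps < HDMI_2_1_GBPS)   # True
```

Which lines up with it being mostly the recent models (the ones shipping HDMI 2.1 ports) that advertise it.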

So if you want 4K resolution, a TV is far cheaper per inch of screen, but you’ll be limited to 60 fps unless you pick one of the current top models between 800 and 2000.

Now if you want to play at 1440p or 1080p, you can get a TV that does that at 120 or maybe even 240 Hz for next to nothing.

The response time for most TVs is still sub-10 ms (most have game modes now). So unless you’re playing competitively (where you’d probably be on a 20" monitor anyway), a TV is just fine.

Most consoles won’t run more than 60fps regardless of resolution either, so this is only really a PC problem.

From what I’m hearing around, the problem is specific to the GeForce 3000-series cards from Gigabyte, a manufacturer known for shoddy workmanship. It’s not the Diablo beta.

I kept crashing because of a Diablo 4 out-of-memory issue. I really hope they fix it, as I do not want to damage my computer either. It’s brand new.

Same, 1660 Super. Game ran pretty well. Then again, I never know how games are supposed to look with super high settings. haha

My 3080 Ti didn’t have crashes or extreme temps, just performance below expectations.

For a game that recommends a 970, teehee.

Apparently the game likes older cards more or something. Lots of happy 1000/2000-series owners; lots of unhappy 3000-series owners.

Idk about the 4000 series, I didn’t see much about it.

I wonder what the QA team used while testing?

Maybe NVIDIA will release a special driver with D4 in mind just before release. They do that sometimes for certain games. Knocks on wood.

TL;DW: Just like with New World, the game can’t physically brick your card; it can only expose issues with the card.

And it’s usually shoddy Gigabyte cards.

They did release a new driver for the beta. Mine was literally only a couple of weeks old and I had to install a new one for the beta.

There’ll probably be another one for release too, I don’t doubt that. But not everyone’s issue is software. Some just don’t have their hardware put together well.

I see people saying that their machine is fine because their AMD only runs at 65C… but that’s basically as high as it’ll run before it melts.

Can’t blame manufacturers for an individual going cheap on cooling.
Like, why would you buy a $2000 GPU and put it in a machine with no airflow, no water cooling, and loads of heat-generating items, in a house that doesn’t have A/C?

Gigabyte is just horrible on both ends, quality and customer support. I will never buy a Gigabyte product.

I have an EVGA 3080 Ti. With max settings @ 4K locked to 60 fps, the actual GPU load was extremely low. At no point did the game push my GPU higher than 700 MHz, and it never reached temps over 50 C. System RAM and VRAM usage were ridiculous though, at 29 GB and 11.9 GB the whole time.

If you play without vsync, or with an unlocked frame rate, and your card has internal flaws, any game will eventually discover those flaws if you don’t routinely benchmark or otherwise push your GPU to the limit.

I’m shocked at how many games don’t have vsync enabled by default on first boot to prevent 100% GPU usage, which can cause instability, especially on older hardware. I see it all the time.

I never play with vsync on; cards are meant to be run at or near 100% usage.

True, if your monitor can support that frame rate. But if it can’t, you’re essentially just wasting power, heating your room with an overpriced electric heater. If your monitor can only support 60 Hz, there is really no reason to push your GPU to 100% for frames your monitor can’t even show. Unless of course you enjoy all that screen tearing for some reason.

Although I guess you can make a claim that trying to push 3000 fps will give you some kind of competitive input lag advantage.
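
To put a rough number on the wasted-frames point, a minimal sketch (the 300 fps figure is a made-up illustration, not a measurement from the beta):

```python
# Toy illustration: how many rendered frames a fixed-refresh monitor can actually show
# when there is no frame cap and no VRR (G-Sync/FreeSync).
def displayed_fraction(render_fps: float, refresh_hz: float) -> float:
    """Fraction of rendered frames that can ever make it to the screen."""
    return min(refresh_hz / render_fps, 1.0)

render_fps = 300   # hypothetical uncapped render rate
refresh_hz = 60    # typical TV / entry-level monitor
shown = displayed_fraction(render_fps, refresh_hz)
print(f"Displayed: {shown:.0%}, thrown away: {1 - shown:.0%}")   # 20% shown, 80% discarded
```

Capping at (or just under) the refresh rate keeps every frame you can actually see and skips most of the heat and power that would have gone into the rest.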

Unfortunately EVGA is out of the GPU manufacturing game because of mistreatment by Nvidia.

I run 144 Hz and generally max settings, so most games don’t hit that. Or should I say… most NEW games.

I have one of the last 3070 Tis EVGA ever made lol.

Well, you can still use vsync with G-Sync/FreeSync to reduce tearing, unless you can’t get close to 144 fps on a newer game.

What fps did you cap out at with your 3070 Ti in D4?