Adding to the list. My Gigabyte 3080 Ti bricked exactly as explained above. Monitors went black saying no device was connected and the GPU fans jumped to max. Card is now dead.
Just thought I’d throw my hat in the ring.
Bricked my ASUS Strix RTX 3080 Ti as well.
It was powered by a Corsair RM850x.
Ran the game at 100% power draw in 4K on a 144 Hz monitor at max settings, getting 120-135ish FPS (99th percentile at 70-80 FPS), sitting around 350-380 W. Nothing unusual; it runs all games like that.
It was not a frame-rate issue: even with the frame cap removed, V-Sync or G-Sync would prevent it from going above 144 FPS, and ANYTHING can handle 144 FPS.
My crash was clearly an overcurrent thing. It had happened maybe three times in two years before, all in PoE, which could trigger it from time to time when pressing the skill button to level up and would instantly shut down the system. But it happened maybe once during a league season, completely at random.
My GPU is an RTX 3060 Ti and I had one issue over the weekend. At one point on the first day of the beta, I had been playing for a few hours when suddenly I could feel the heat radiating off my PC tower like it was a fireplace and could smell that familiar smell of dust burning, like a heater that’s been off all summer and is being turned on for the first time in winter.
I quickly shut down the game and let everything chill for a while. The weirdest part was that the fans on my GPU didn’t even seem to be running at the time. Once I exited the game they kicked on and everything cooled back down. After that I was able to play the rest of the weekend with no issues, though I kept religiously checking how much heat my PC was putting out.
PC is only about a year old and I’ve never had anything like that happen with any other game but it scared me half to death.
If anyone remembers, the New World launch had the same issue. It ended up being that the main menu screen had an uncapped frame rate, which led to cards going flat chat. It exposed an issue with the voltage regulators on some 30-series vid cards, IIRC.
This is so odd. I have an Nvidia Quadro GPU in a store-bought rig and I was able to run Diablo 4 on max settings. Aside from latency when entering town from a town portal, I never had any issues and the game ran pretty smoothly. I even played Apex on my other monitor while waiting for my queue to finish on the first day for over an hour. I have also played both D2R and New World on the same PC and never had issues. I hope Nvidia replaces everyone’s parts though.
easy there tiger.
My ASUS Strix 3080 Ti bit the dust as well.
Sadly EVGA has left the GPU market… buuut, I do recall that during the great GPU apocalypse that was the New World launch, it was mainly EVGA FTW cards that were affected.
Might wanna climb down from your high horse before you hurt yourself.
Safe from what? Getting a bad component? There’s nothing you can do to avoid getting a bad component. You’ll only find out when your system is pushed to the limit and the bad component fails. RMA the faulty hardware.
The thing is, OP and others assume it’s the game that hurt their card, when it could also be a poorly made video card.
3080 Tis are already known to be a strained design. Generally, even the worst game behavior should be handled like a stress test; it should not physically fry a video card.
I could be wrong. Maybe there is some strange process a game could kick off that will harm a GPU, but I would guess it’s far more likely that poor cooling, card design, and possibly components cause a higher risk of failure at peak loads.
So you bought a cheap AIB GPU, played an unoptimized game beta with the FPS cap turned off, the game pushed your hardware to max usage (something it should be able to do without issue), and your card broke? And you think that’s Blizzard’s fault? If your poorly made card can’t handle running at max load without breaking, that’s Gigabyte’s fault. Next time buy a card from a legitimate brand. That’s why Gigabyte cards are so cheap.
New World didn’t cause the cards to die; poor soldering did. EVGA themselves even admitted it was their fault and not the game’s. It would take you 3 seconds to Google it, but instead you post misinformation.
Don’t be passive-aggressive and don’t spread lies, please. Gigabyte’s cards certainly aren’t the cheapest, nor are they badly made graphics cards. It’s painfully obvious you don’t understand PCs, so if I were you I would refrain from making these kinds of posts. You are not helping anyone, and telling people they bought a “cheap” 1000+ USD card is beyond ridiculous.
If you remove all FPS-limiting factors in the settings, such as the FPS cap / V-Sync / G-Sync etc., and push the card to its limits without undervolting it, there is a big chance that your RM850x won’t be enough to handle the transient spikes of a 3080 Ti in certain situations within the game (loading screens, cinematics, some parts of the map).
There is nothing faulty about this; it’s just physics. Let’s say you have a “basic” non-overclocked 3080 Ti with a power limit of about 350 W. Your 12 V rail (not the whole PSU) would need to be able to take transient hits of at least 2-2.5x that load, maybe even more; to be really safe, let’s say 3x, so spikes of around 1000 W relatively frequently. Your RM850x has 70 A on its 12 V rail, and you need some of that for things other than the GPU.
So it’s 70 A x 12 V = 840 W-ish; that’s all that rail can do. Now, if you frequently draw around 1000 W for the GPU alone with that PSU over a certain amount of time, overcurrent protection will most likely kick in.
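For anyone who wants to plug in their own numbers, here is a minimal back-of-the-envelope sketch of that headroom math in Python. The 70 A / 12 V rail rating and the 350 W power limit come from the post above; the 150 W figure for everything else on the 12 V rail and the 2-3x spike multipliers are just assumed ballpark values, not measurements:

```python
# Back-of-the-envelope PSU headroom check (illustrative numbers only).
RAIL_CURRENT_A = 70          # RM850x 12 V rail rating, from the post above
RAIL_VOLTAGE_V = 12
OTHER_12V_LOAD_W = 150       # assumed CPU + fans + drives on the same rail

GPU_POWER_LIMIT_W = 350      # stock 3080 Ti power limit, from the post above
SPIKE_MULTIPLIERS = (2.0, 2.5, 3.0)  # rough range of reported transient spikes

rail_capacity_w = RAIL_CURRENT_A * RAIL_VOLTAGE_V   # ~840 W total on the rail
headroom_w = rail_capacity_w - OTHER_12V_LOAD_W     # what's left for the GPU

print(f"12 V rail capacity: {rail_capacity_w:.0f} W, "
      f"left for the GPU: {headroom_w:.0f} W")

for m in SPIKE_MULTIPLIERS:
    spike_w = GPU_POWER_LIMIT_W * m
    verdict = "within headroom" if spike_w <= headroom_w else "may trip OCP"
    print(f"{m:.1f}x spike = {spike_w:.0f} W -> {verdict}")
```

With those assumed numbers, even a 2x spike already lands above what the rail has left for the GPU, which is exactly the kind of situation where OCP can trip; how quickly it actually trips depends on how long the spike lasts and the specific PSU.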