My second PC has a GTX 960 running with an Intel i5 4690 and 16GB RAM. I’m just curious how the beta ran for you guys. If you have any 9-series GTX card, I’d be keen on your input.
I didn’t want to install the thing twice since I’d much rather test it on my current rig.
I could nitpick and say that NVIDIA weren’t exactly being truthful about what the 3090 was capable of in gaming, but that’s neither here nor there.
I picked up a Sapphire Nitro 5700XT a couple months before the current gen cards ‘released’ for about $570 US, new in box from the PC store. My initial regret for not waiting very quickly turned into relief as the GPU market tanked.
That card now sells for close to $800 US used on eBay.
I’m playing on a Dell Inspiron 15 7559 with a GTX 960M and 16GB RAM (upgraded it myself). I had to do a lot of searching online to optimize my settings and get the game running smoothly enough. After the tweaks, I’m getting 50-60 FPS in town, and it can drop to about 34 FPS in crowded Fallen camps in Act 1.
I’m hoping for optimizations though, since it’s a bit unbelievable that my setup, which has no visible problems with other modern titles, suddenly trips over a D2 remaster.
3600 + GTX 1050 at 1080p here.
The difference between low and high settings isn’t big at all (5-10 FPS). I get around 35-45 FPS across all the settings I tested.
I get 60 FPS (vsync on) if I play with the classic graphics, which runs very smoothly.
Wow, and that’s got an i7 chip in it too. What screen resolution / refresh rate are you playing at, out of curiosity?
The 1050 destroys the 9 series, champ. I upgraded a few years ago from a GTX 960 to a GTX 1060 - worlds apart. But cheers for the info all the same - a family member of mine has my old 1060 in their PC.
And that’s the strange part - many owners of lower-end GPUs aren’t reporting high temperatures. It seems to be isolated to 20/30-series NVIDIA cards and RX 5000/6000-series Radeons.
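If anyone wants to check where their card actually sits while playing, the driver’s own tool can log it. Assuming a standard NVIDIA driver install, run this from a command prompt:

nvidia-smi --query-gpu=temperature.gpu,utilization.gpu --format=csv -l 5

That prints temperature and GPU load every 5 seconds, so you can see what a crowded Fallen camp is actually doing to the card.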
Actually, I have the i5 version.
My display doesn’t support more than 60Hz, and I’m playing the game in windowed mode at 1280x768.
The reason for that is twofold: old habit of playing it in a window, and I just find it easier to keep track of my merc/summons without looking all the way to the corner of the screen.
After setting the game’s executable to prefer maximum performance in the NVIDIA Control Panel, I adjusted the in-game settings like so:
I’m hoping the final release will be a bit better on that front. At the moment it reminds me of Rust with the way it works your PC out, which to be honest surprised me. I would have thought D2R would’ve been fairly easygoing on the requirements.
For me on a GTX 1050, which is kinda similar to a GTX 960 performance-wise, the game runs smooth at 1920x1200 with 100% resolution scale on medium settings. I slightly overclocked my GPU in MSI Afterburner and ended up getting a ~10-15% FPS boost. In solo games on roughly medium settings I got around 40-55 FPS in open areas and 70+ FPS in enclosed ones; on low settings, as far as I remember, I had 60+ FPS everywhere.

In public games the performance dipped a little, though I’m not sure if that’s a graphics issue or the fact that I don’t have an SSD, which is practically mandatory for this game. The HDD causes small lag spikes from time to time, and the loading screen times literally make D2R unplayable in the long run. That’s why I ordered an SSD and will hopefully try it out in the second phase of the beta.
Not really, the 1050 is similar to the 960, which isn’t great for an x60 card. The difference between the 1050/960 and the 1060/1070 is pretty significant.
I have a 1070 Ti and I run the game at 60-100 FPS at max settings and 1080p, but the card goes to 80°C+. Turning the internal resolution up to 4K drops it to ~45 FPS.
I hope the game will be more optimized at release.
Very smooth and playable on my 970 and 4690K.
I got around 80 FPS average running around in Blood Moor killing stuff, for example. (Default medium settings)
I believe MrLLama (D2 streamer & speedrunner) also has a 970.