I was curious as to what the possible minimum hardware requirements for Diablo II Resurrected would be. With release not too far off, I’m sure Blizzard and Vicarious Visions already have an idea of what the minimum hardware requirements could be.
Here is my current setup, which runs Diablo III without issues:
OS : Windows 10 Pro 64-bit
CPU: AMD A10-7850K
RAM: 24GB, DDR3
Motherboard: MSI A88X-G43
Graphics : 1024MB ATI AMD Radeon R7 Graphics
System Drive : 465GB SanDisk SSD
Audio : AMD High Definition Audio Device
I’ll not be upgrading my rig until it is time for D4 to be released.
Will it be enough (I’m guessing yes) to run Diablo II Resurrected at 1920x1080?
D2R seemed quite taxing. I got a 3080 and I’m not even sure how well it will run in 4K. If he’s waiting for D4 but wants to play D2R, he should probably upgrade now, because they said they’ll skip at least a year with the new generation of GPUs, with shortages being what they are.
I’m less savvy with AMD’s line and how they hold up, so I trust you, Shadow, when you say he needs to upgrade.
Yeah, I saw it too. I wanted to mention that by the time D4 is out, he’ll probably be able to buy the next generation of GPUs and be past the silicon shortage, but he also wants to play D2R, which means he should probably find a solution sooner rather than later.
Seeing YouTube videos tells me that if they optimize just a bit, I shouldn’t be worried about running it in 4K at all. Still, thinking about how Frozen Orb affected systems back in the day when D2 was new, the lighting effects and the upgraded fire in D2R might still pose a problem.
And a side note: it tells me I couldn’t post with a link because I was quoting you. How are you able to get permission to post links, haha?
Oh that’s right, I saw it live. It was awful, but he was also running OBS in the background. I also have a PC from around the time misscheatah mentioned, but it has a 980 Ti and an i7. I kind of want to test both these machines to see how it works; if you want, I’ll make a video, haha.
I suspect the issue there was that he was streaming in addition to playing the game. I had no stuttering or anything at all. Super smooth. I also had nothing else running in the background besides Firefox and standard windows processes. Some of the other testers found the same thing - the Alpha played a lot better when run solo.
It is below the minimums but I wanted to see if it would run at all. It does…but not in a playable way. Sadness. The laptop is my main computer that I use for everything else. D2R is the first game that has forced me off of it.
And mine is a Lenovo Legion 5 gaming laptop: 15.6" FHD (1920x1080) IPS screen, AMD Ryzen 7 4800H processor, 16GB DDR4, 512GB SSD, NVIDIA GTX 1660 Ti, Windows 10. I think I’m covered, but… things can and do change frequently, when least convenient.
Doesn’t him having fewer issues after getting a different GPU lean more toward him being GPU bound instead?
As I recall, he didn’t even notice anything was wrong for a large portion of the stream, haha. Which means either he’s too used to 24 fps, or the stream quality was affected more than his own gameplay; chat was complaining non-stop while he barely even noticed most of the time.
That’s true, which is why I wasn’t sure if he just didn’t notice or if it was just the stream, haha.
Just having to run those improved graphics would have me believe it would require more processor power either way. I had a processor heat-throttling on me before this laptop, dropping me from 120 fps into the 15-25 range. It was super annoying, which is why I have a 3080 as compensation for my other laptop.
One thing you can do is download a PC performance benchmark and that will give you a percentage score for each component in your system. It will show you relative performance of CPU, GPU, memory, HDs, SSD, etc.
So for example, hypothetically, your graphics card might be a weak link.
Since Shadow mentioned the GeForce GTX 1080 Ti earlier, using that as a comparison for the heck of it: that card sits at +2143% relative performance versus AMD Radeon R7 Graphics on one of the benchmark sites.
Anyway - you could run a benchmark just to get a feel for how your components stack up on a relative basis so you get some bang for the buck.
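For what it’s worth, that relative-performance percentage is just a ratio of two benchmark scores. A quick sketch of the math (the scores below are made up purely to illustrate how the figure is derived, not real benchmark numbers):

```python
def relative_performance(score_new: float, score_old: float) -> float:
    """Percentage gain of score_new over score_old."""
    return (score_new / score_old - 1.0) * 100.0

# Hypothetical benchmark scores, chosen only so the math lands on +2143%.
r7_score = 1000.0           # integrated AMD Radeon R7 (illustrative)
gtx_1080ti_score = 22430.0  # GTX 1080 Ti (illustrative)

print(f"+{relative_performance(gtx_1080ti_score, r7_score):.0f}%")  # +2143%
```

Most benchmark sites do exactly this division for you; the point is just that the percentage is relative to the slower part, which is why the number looks so dramatic.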
The other thing would be to do a quick search on current price points for any components you buy. There’s almost always a sweet spot for good buys on volume components (those currently being mass-produced in quantity), but if you go above that sweet spot, you start paying quite a bit more for the performance gains.
OP you can keep the SSD, but everything else will need to be replaced with more modern specs if you wish to have a smooth experience.
A good system does not need to cost an arm and a leg, some prebuilds have decent specs for the money. Look for something with a 6+ core CPU and don’t go below a GTX 1660 Super.
The only thing that most prebuilds get wrong is that they come with single-stick RAM configs, which choke the CPU in many modern games, so be on the lookout for a dual-channel RAM config if you go the prebuilt route.
I’d be willing to bet that the tech alpha had some debug code running in order to help facilitate data gathering. Drothvader had to essentially shut everything else down on his computer before he could get a stable framerate going. Most of his software is development related, which tells me that there were conflicts going on, as a debug client would definitely slow down when development software was running background daemons.
The other issue is the concurrent dual engine running in the game. It’s very neat from a demonstration perspective, but hardly practical from a final product perspective. I suspect that ultimately they’re going to have to switch to a system that lets you swap graphics engines, but only with a client restart like WoW would when switching between legacy DX11 and its default DX12 hybrid mode. Blizzard needed concurrent engines running during the alpha to showcase the side by side differences in real time, but practicality has to win here and that’s one feature that doesn’t need to be there “just to be there”. Eliminating nonessential systems running in the game is paramount if they want to be able to cast the widest net for playable framerates on various hardware. It shouldn’t take a GTX 980 Ti to do that with this game, but it did.
Your CPU isn’t so much an issue, but the GPU is. However, Blizzard/VV could make use of the hybrid tech that Overwatch uses where the game world is rendered with the dGPU (discrete GPU) and game’s HUD and menu system are rendered with the IGP, in your case the Intel HD 4000. That would make even more hardware viable for D2R.
H.264 has had a hardware encoder on GPUs for years now (x264 itself is the software encoder). It isn’t as efficient as the HEVC encoders, but it shouldn’t be enough to bring down D2R’s framerates that badly. This was just a horribly optimized alpha, which is to be expected given it was a tech alpha and likely running with some debug flags set for data collection. One super interesting thing Drothvader discovered was that the PC UI stuttered like mad, but the controller-based UI was butter smooth. That’s definitely an oddity that needs to be addressed, as the majority of players will be using the PC UI setup.
If you are using the built-in display and it is not capable of >60 Hz refresh, why are you running uncapped? There is literally zero benefit to running higher framerates than your display can show. In fact, all that does is cause your CPU and GPU to needlessly spend cycles rendering data that will never be seen, generating significantly more heat than necessary. Cap your FPS and/or use VSync. “I can run 180 FPS in other games lulz” mindsets are the real issue in many cases. Sure, you can run 180 FPS, but if your display is limited to 60 Hz, you’re doing the equivalent of forcing a single GPU to drive three 60 Hz displays simultaneously. Optimize your settings for the display you’re using and suddenly you stop thermal throttling.
This is an especially aggravating mindset from Mac users that don’t realize their displays can only go to 60 Hz, yet they keep their framerates uncapped, wasting CPU and GPU cycles and generating enough heat to cook their machines (Apple’s firmware defaults will keep the fans at low RPM or even off until the CPU or GPU reaches north of 90° C, which is just freaking stupid for the lifespan of hardware).
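To put rough numbers on the wasted work: if the GPU renders at 180 FPS but the panel refreshes at 60 Hz, two out of every three rendered frames can never reach the screen. A back-of-the-envelope sketch (function name and figures are mine, for illustration only):

```python
def wasted_frame_fraction(render_fps: float, refresh_hz: float) -> float:
    """Fraction of rendered frames the display can never show."""
    if render_fps <= refresh_hz:
        return 0.0  # every rendered frame can be displayed
    return (render_fps - refresh_hz) / render_fps

# 180 FPS into a 60 Hz panel: 120 of every 180 frames are discarded.
print(f"{wasted_frame_fraction(180, 60):.1%}")  # 66.7%
```

That discarded two-thirds is pure heat with no visual benefit, which is the whole argument for capping FPS at (or near) the refresh rate.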
For the OP, I looked at the specs of both CPUs, and it may be that his CPU really is below the minimum, but not so much due to clock rate as it might be due to the game utilizing instruction sets not present in the slightly older CPU he has, such as AVX, etc. Granted, older AMD CPUs have very inefficient cores relative to what they offer with their Ryzen 5000 and later series that are now capable of absolutely curbstomping Intel’s best CPUs in many games. If I didn’t need Intel for hackintosh compatibility, I’d be building a sweet, sweet PCIe 4.0 AMD 5950X system.
I’d hang tight until the second multiplayer tech alpha is released. It should have better optimized code by then. I really do think that making the graphics switchable and not concurrent would free up a ton of CPU/GPU though. There’s no reason other than comparison videos to have such a system running.
Why did you assume my screen only has a 60 Hz range in the first place?
Here is one of my quotes from another thread:
The laptop I was referring to was outputting 120 fps to a 120 Hz native refresh-rate screen. My current laptop has a 300 Hz screen, although I don’t use it; I use a 65-inch 4K TV for gaming. It might not be as good, but I prefer to play games on my main screen instead of my laptop. It is capable of a 60 Hz refresh rate at 4K, which is all I need to expect from a game running something as taxing as 4K.
my current setup is:
AMD Ryzen™ 7 5800H Processor 3.2 GHz (16M Cache, up to 4.4 GHz)
Nvidia GeForce RTX 3080 mobile
32GB DDR4-3200 SO-DIMM (RAM)
1TB M.2 NVMe™ PCIe® 3.0 SSD
I was not planning on flexing with my specs, but you were making a lot of assumptions about why my rig was heat throttling.
Edit: I’ll add that the heat-throttling issue with the previous laptop was down to the way it was designed. The motherboard was replaced twice with the same issue recurring, so it had nothing to do with how I used it, which is why they supplied me with my new laptop as compensation.
What model laptop do you have? I’ve not heard of any with that high a refresh rate. Haven’t even seen desktop displays that go that high.
No, H.265 (HEVC) specifically. nVidia did implement that into NVENC, but it’s supported at the hardware level on all mid-tier and high end GPUs from ~2017 onward. Not sure why he was using software encoding since that eats CPU cycles. That’s really only a viable option on systems with ≥8 cores and games that use no more than four cores, or on HEDT systems that have lots of cores and bandwidth.
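For anyone curious what the software-vs-hardware split looks like in practice, here is roughly how it maps onto ffmpeg’s encoder names (a sketch only: the file names are placeholders, and `hevc_nvenc` requires a supported NVIDIA GPU and driver):

```shell
# Software HEVC encode: runs on the CPU via libx265, eating CPU cycles
# that the game would otherwise use.
ffmpeg -i gameplay.mp4 -c:v libx265 -b:v 8M out_software.mp4

# Hardware HEVC encode: offloads to the GPU's dedicated NVENC block,
# leaving the CPU cores mostly free for the game.
ffmpeg -i gameplay.mp4 -c:v hevc_nvenc -b:v 8M out_hardware.mp4
```

OBS exposes the same choice in its output settings (x264 vs. NVENC), which is why streaming with software encoding on a core-starved system hurts game framerates so much.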