You can extract the game files so the game runs uncompressed and thus loads faster, but be aware that it’s not an officially supported method, and it occasionally causes visual bugs (wrong floor textures, models expanding and contorting wildly, and very rarely rapidly flashing rainbow colors that could be seizure-inducing for people with epilepsy). So yeah, if you’ve got epilepsy I’d recommend against this method.
D2R load times are also tied to framerate for who knows what reason, so the game loads slower at 60 FPS than at, say, 100 or 144 FPS. I personally wouldn’t run the game faster than my monitor’s refresh rate, but if load times are important to you, you can.
Also I think loading times are mostly impacted by the CPU (at least in my case it’s the bottleneck), so if you’ve got an underpowered CPU then upgrading that could help.
Well there are two types of solid-state drives… SATA and NVMe.
SATA will be capped at 540 MB/sec due to the SATA bandwidth limitation, whereas the latest NVMe types can reach up to 7500 MB/sec.
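For a rough sense of what that bandwidth gap means in raw read time, here’s a back-of-the-envelope sketch (the ~27 GB install size is an assumed round number, not an official figure, and real load times depend heavily on CPU decompression too):

```python
# Best-case sequential read time at the quoted speeds. The ~27 GB
# install size is an assumption for illustration, not an official figure.
INSTALL_GB = 27
for name, mb_per_s in [("SATA SSD", 540), ("NVMe Gen4 SSD", 7500)]:
    seconds = INSTALL_GB * 1024 / mb_per_s
    print(f"{name}: ~{seconds:.0f} s to read {INSTALL_GB} GB")
```

So even in the best case the pure disk-read gap is under a minute, which is why the CPU ends up being the bottleneck for most people.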
Saying you have a powerful gaming PC doesn’t say much unless you list the specifications. For all I know, you could have a powerful gaming PC… From 7 years ago…
Extract game files. I did this and the load times feel like single player 99% of the time. I play on an MSI GS75 Stealth laptop (Intel i7-9750H, GeForce RTX 2080 Max-Q).
The largest factor in the load times, as evidenced by people utilizing unpacking techniques, is the CASC encapsulation system. At first, when the game is freshly installed, loading isn’t too bad. It isn’t “fast” by any means, but it isn’t ultra slow either. But as patching occurs and index/data files are appended, things slow down in a pretty big way. Short of unpacking the compressed files, the best way to keep loads to a minimum is to, sadly, do a fresh install every few months.
That’s silly. You’d think there’d be some sort of post-patch or routine maintenance pass to minimize the loading performance lost to patches over time. I know SSD prices are dirt cheap right now, but still, people shouldn’t have to waste SSD write cycles on something that could be maintained automatically.
It’d be nice if during installation, options were given as far as installation size:
Compact: Small installation size, slower loading.
Normal: Medium installation size, normal loading.
Uncompressed: Large installation size, fastest loading.
Here’s the thing: CASC can’t be cleaned up like that without reinstalling. It literally cannot. It can’t even, nor can it odd either. It has to do with how the data is stored and appended. In order to achieve a “fresh” install’s speed, both the data and index files have to be contiguous. That means all invalidated data is removed, and the files rebuilt. Guess what that’s going to do with that shiny SSD of yours. Yep, wear it out.
When CASC files are updated, the data files themselves often (but not always) retain the invalidated data, and the new data is appended. Whenever this is done, the index file has to be appended, both to add in the new node references and to invalidate the old node references. And in a fun twist, this is unique for every installation. Yep, the patching occurs differently for every single person. Do you really want to try and rebuild the CASC database under those circumstances? No, you don’t. That’s one of the reasons why scan and repair fails so often. It can’t repair a lot of problems encountered under CASC. Instead, it will, you guessed it, literally replace those files. In extreme cases where the end user has not done a fresh reinstall for a very long time (i.e., more people than you think), those data and index files are so riddled with changes that there is literally no feasible way to fix them without just flat out replacing them.
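The append-and-invalidate pattern described above can be sketched as a toy model (this is NOT the real CASC format; every name here is invented purely for illustration). The key point: each patch appends new data and new index nodes while the old, invalidated entries stay on disk, so both files only ever grow.

```python
# Toy model of an append-only container + index, in the spirit of the
# CASC behaviour described above (NOT the real format; names invented).
class ToyArchive:
    def __init__(self):
        self.data = bytearray()  # data file: only ever appended to
        self.index = []          # index file: list of node dicts

    def patch(self, key, payload):
        # Invalidate any old node for this key; the old bytes are NOT
        # removed from the data file, they just become dead weight.
        for node in self.index:
            if node["key"] == key:
                node["valid"] = False
        offset = len(self.data)
        self.data += payload     # append new data, never rewrite in place
        self.index.append({"key": key, "offset": offset,
                           "size": len(payload), "valid": True})

    def read(self, key):
        # Lookups must skip past invalidated nodes; the more patches,
        # the more junk there is to wade through.
        for node in reversed(self.index):
            if node["key"] == key and node["valid"]:
                o, s = node["offset"], node["size"]
                return bytes(self.data[o:o + s])
        raise KeyError(key)

arc = ToyArchive()
arc.patch("d2data", b"v1" * 4)
arc.patch("d2data", b"v2" * 4)       # old v1 bytes remain on "disk"
print(len(arc.data), len(arc.index)) # 16 2: file and index both grew
```

Since the sequence of patches differs per machine, the exact layout of dead data and live nodes is unique to each install, which is the repair problem described above.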
Then there is the enumeration issue with CASC. CASC shares an Achilles’ Heel with another filesystem: APFS. Both use a damn near identical enumeration system, wherein as a file is patched, a new b-tree/extents node dataset is added to the file’s index (ironically files are now indexed by files…which is horribly inefficient for gaming, but that’s a gripe for another time). As a data file is patched more and more, the amount of enumerated index nodes increases, oftentimes, dramatically. A single patch cycle, due to how CASC (and APFS) work, can actually create dozens, and even hundreds of new nodes for each and every one of its data files. As you’ve probably surmised by now, having to keep track of that much data is not conducive to speedy operation. Once files are that fragmented, the only way to recover the lost speed is to literally wipe them and do a fresh install, because that is the only way to re-establish a set of fully contiguous, minimally enumerated files.
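To put toy numbers on the enumeration problem: a contiguous file needs one read, while a patch-fragmented one needs an index-node lookup and seek per extent. The latency figure and node counts below are invented examples, not measurements of CASC or APFS.

```python
# Toy illustration of per-extent seek overhead. Both numbers here are
# assumptions for illustration, not real measurements.
SEEK_MS = 0.05  # assumed per-access latency on a decent NVMe drive

def per_file_overhead_ms(extent_count, seek_ms=SEEK_MS):
    # One index-node lookup + seek per extent the file spans.
    return extent_count * seek_ms

print(per_file_overhead_ms(1))    # fresh install: one contiguous extent
print(per_file_overhead_ms(400))  # hundreds of nodes after many patches
```

Multiply that per-file overhead across every data file a loading screen touches and the degradation adds up, which is why only a fresh set of contiguous files recovers the lost speed.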
CASC is far more complex than you’re giving it credit for, and no, it isn’t efficient at all. It allows things such as slipstreaming a patch while a game is live (WoW does this sometimes, as does D3 on rare occasions), but unlike older filesystems, the features it brings come at a cost, and that cost is speed. And that speed degrades over time, which is why a reinstall every few months is recommended.
The good news? Reinstalling couldn’t be easier. You don’t use the Uninstall button in the Battle.net app. In a grand twist of irony, manually deleting the topmost /Data folder in any of Blizzard’s main game folder(s) nukes the game files (e.g. C:\Diablo III\Data). Delete that folder, empty the recycle bin/trash, then launch the Battle.net app and just click the Update button. Yep, when the files are deleted manually, the Play button becomes Update, and it downloads a fresh, contiguous set of files for maximum performance, unpacking/uncompressing notwithstanding. The beauty of doing a reinstall this way is that it preserves your game’s settings, and in the case of WoW, all of the addons and their settings as well.
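The manual-delete step above can be sketched as a tiny script. The game root path is just the example from the post; double-check yours before running anything like this, since it is destructive by design.

```python
# Sketch of the manual "delete the Data folder" reinstall trick described
# above. The example path is an assumption; verify yours first.
import shutil
from pathlib import Path

def nuke_game_data(game_root):
    """Delete the topmost Data folder so the launcher re-downloads a
    fresh, contiguous set of files (settings live elsewhere and survive)."""
    data_dir = Path(game_root) / "Data"
    if data_dir.is_dir():
        shutil.rmtree(data_dir)  # bypasses the recycle bin entirely,
        return True              # so no separate "empty trash" step
    return False

# Example usage (path assumed, per the post):
# nuke_game_data(r"C:\Diablo III")
# ...then launch the Battle.net app and click Update.
```

Note that `shutil.rmtree` skips the recycle bin, unlike deleting in Explorer, so there’s nothing extra to empty afterwards.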
This is also part of why I tell people not to use the beta folders for a base for the launch day or future retail patch cycle rollouts. Performance will be heavily degraded if doing so, with the other part of why I tell people this being that corrupted/invalidated files can and do cause issues when using them as a base.
If you want full speed (sans unpacking) with CASC, the only way to get it is with a fresh/clean install. Thankfully, as I noted above, that’s done very easily with the way modern Blizzard games are structured.
Edit: Oh, and there is in fact a post-patch/install “reclaiming disk space” phase. Want to know what it’s doing? It’s literally writing the files anew while retaining the fragmentation. Yay, extra write cycles for no net gain since, you know, all SSD controllers have built-in garbage collection.
loading screens seem to have a mind of their own.
i take a wp from rogue encampment to dark forest and the load screen is just a flash, but then i take a wp from dark forest to harrogath and it has to load for like 10 seconds. it’s mostly between acts for me; once i’m in an act, the wps there work like the first example i gave.
sometimes if its really bad the loading screen will be like 15-20 seconds.
it hasn’t caused a death for me yet, but yeah i can imagine stepping through a portal and suffering a 10 second load screen could spell doom for your character.
Just wanted to confirm that after struggling from massive load times (up to 1 minute to start a new game for example) I have tried out the uninstall then reinstall from bnet… and it worked!
Load times have been reduced significantly. The game is totally playable now and I don’t even have a good pc.
I’m guessing Blizz is just expecting their playerbase to rage uninstall their crap regularly enough, and then hop right back in, whenever the new hype train gets going.
Uninstall maphack and other 3rd-party hacks. As the game loads, Blizzard scans your computer for these things, and as it discovers them it tries to block their use, increasing load times.
Does this mean that D2R support, and therefore patches, drying up would have a “silver lining” in that the issue doesn’t happen unless they push a patch, so if they don’t patch for say, 2 years, you wouldn’t have to reinstall to retain performance?