The question is how many calculations this is in total, and how it compares to the actual number of calculations coming from uncapped Area Damage.
AD scales exponentially, which means that if you tried to compensate for it with pure raw damage, you would have way too much on single targets or smaller groups, while missing out on huge packs.
It looks more like it would increase the number of calculations rather than reduce it.
At the moment (as far as I know), AD first checks the position of every enemy, then calculates the damage, and after that another position check is made.
If they now also had to make sure the effect is only applied to a certain number of targets within the radius, rather than simply everything within it, that would only make it worse.
Balance needs to be reached, of course. But most agree that nerfing some heavily AD-reliant builds while buffing others for diversity would be welcome. To max out AD, zbarb skills are crucial, and the majority of zbarbs lack those skills outside of the elite zbarb categories.
The AD formula is roughly (1 + AD%) × number of mobs within a 10-yard radius × 20% proc chance.
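As a rough sketch of that formula in Python (the 20% proc chance, 10-yard radius, and (1 + AD%) factor are taken from the quote above; the function name and the input values are just made up for illustration):

```python
# Hypothetical sketch of the approximate AD formula quoted above:
# (1 + AD%) * number of mobs within 10 yards * 20% proc chance.
def ad_multiplier(ad, mobs_in_10yd):
    """Return the average damage multiplier contributed by Area Damage."""
    proc_chance = 0.20  # 20% chance for AD to proc on a hit
    return (1 + ad) * mobs_in_10yd * proc_chance

# 100% AD rolled on gear, 10 mobs standing within 10 yards:
print(ad_multiplier(1.0, 10))  # 4.0
```

Even this simplified version has to be evaluated per hit, per enemy in range, which is why the positional checks described above add up so quickly.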
The calculation strain on the server is too heavy, especially with AD and DoT interactions. Changing DoTs is too complicated; the AD calculation is much simpler to control.
But should a mechanic like Area Damage, which scales up exponentially with more enemies (as opposed to ordinary Area of Effect attacks, which scale up linearly), even be a thing in the first place?
I would argue that removing Area Damage makes it easier to balance skills, items, builds, and also environments and density, because exponentially scaling damage is no longer a factor.
Yes, the only way to know is to actually write the server/client code and test it. Some calculations are less impactful than others, and sorting small arrays is usually not resource-intensive, but no one will ever know without setting it up and running stress tests on it.
Hey, PTR folk! Great post, I'd like it to become a sticky in this forum.
My idea to address this problem this season, in the short term, is to change the meta by reworking some of the fundamental gemstones of the game to make them much stronger game changers. I think most would agree that Path of Exile is an excellent example of how that can affect a meta.
Removing Area Damage very soon? I personally don't think that is great, because some players would be very disappointed.
Instead, I would rather see more people able to push the leaderboard with empowered skills that aren't laggy and don't involve heavy calculations. That is the best I can recommend for dodging this issue until the developers find a deeper, more advanced solution for making the system and network performance stronger.
Also, to improve quality of life in Diablo 3 RoS:
While we are able to test on the PTR server, remove the normal-quality legendaries, meaning we would get a balanced quantity of yellow items to reroll our good ancients and maybe the primals we get.
That would also make the game more enjoyable again, because when a legendary drops you would already know it is a good one.
And maybe consider a new tab for Paragon points, where the Magic Find bonus gets a revival.
I mean, you could easily add a way for us to customize which specific item affinity we want to farm, and then remove Kadala completely from the game. ^^
And finally, when most ranked players are able to use fully augmented Primals, we have less RNG on the ladder, which means the ranked tiers are more comparable to the actual performance the players have shown.
So there is a lot to talk about, and the game can be great again. Thanks for reading. See you in the rifts. ^^
I'd go for a redesign. Maybe it made sense at the time, but there's no reason it should be overly complicated. It's something like AD proc chance, by yards, by enemies minus one, by area damage percentage… a crazy number of calculations.
You could simply tone it down to, say: 5 enemies hit by a spell = xx% damage, and every 5 enemies after that = another xx% damage. That would reduce the calculations a lot and help with lag while not nerfing the concept completely.
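A minimal sketch of that tiered idea, with hypothetical placeholder percentages (the 50% base and 25% per extra group are mine, not proposed game values):

```python
# Hypothetical tiered AD replacement: a flat bonus once 5 enemies are hit,
# plus a smaller bonus per additional full group of 5. The percentages
# here are placeholders purely for illustration.
def tiered_bonus(enemies_hit, base_pct=0.50, per_extra_group_pct=0.25):
    """Return the bonus damage fraction for a hit on `enemies_hit` enemies."""
    if enemies_hit < 5:
        return 0.0
    extra_groups = (enemies_hit - 5) // 5  # full groups of 5 beyond the first
    return base_pct + extra_groups * per_extra_group_pct

print(tiered_bonus(4))   # 0.0
print(tiered_bonus(5))   # 0.5
print(tiered_bonus(17))  # 1.0
```

The key property is that it only needs a count of enemies hit, not a pairwise position check for every enemy around every other enemy.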
I think that would cause even more calculations, assuming I understood your idea correctly.
The size of the damage number is not important. For example, 293 damage can translate into
000111001010101010110001101110000101010111110 and 47,256,286,252,952,473
damage can translate into 110010101100011110001110000111011000100111101
for the CPU (just some random example bit patterns to illustrate the point).
What matters is the number of calculations that have to be made per second, and the only way to reduce that is either to remove Area Damage completely or to redesign it so that it does not cause so many calculations in the first place.
Do you know how the decimal system works? The decimal system is what you usually use (0-9).
It goes 0. 1. 2. 3⌠9. Then for âtenâ, itâs 1Ă(ten)+0. Eleven is 1Ă(ten)+1. For (one hundred), itâs 1Ă(tenĂten)+0+0. And so on.
It works the same way for binary, but instead of going 0, 1, 2⌠9, it just goes 0, 1, âtwoâ (10 => 1Ă"two" + 0). Three (11) is 1Ă(two)+1, and âfourâ is 1Ă(twoĂtwo)+0+0 (100). As you can see the math adds up.
You can represent any number with any number of digits in any base. In base 10, you can represent 24 with 32 digits if you want: 00000000000000000000000000000024. As you can see, most of these digits are unnecessary.
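The point above, that leading zeros and the choice of base do not change the value, is easy to check in a few lines of Python (using 293, one of the numbers mentioned earlier):

```python
# Same value, different representations: base does not change the number.
n = 293
print(bin(n))                # 0b100100101 (293 in binary)
print(int('100100101', 2))   # 293 (back from binary to decimal)

# Leading zeros don't change the value either:
print(int('00000000000000000000000000000024'))  # 24
```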
What it means for a computer to be 32-bit or 64-bit is that its internals (registers) can hold numbers up to 32 or 64 bits wide (or more). So of course a computer, to do 36+24, would have its registers set to
00000000000000000000000000000036
+00000000000000000000000000000024
=00000000000000000000000000000060
Well, of course on a computer the registers are binary, so it would really be the binary representations of 36, 24, and 60, but I can't be bothered to write those out.
It goes without saying that you can calculate values above 2^32 on a 32-bit computer, but it takes more time because the computer can't do it in a single operation.
Imagine having to do 2434 + 6121 mentally. I don't know how you do it, but personally I go 4+1 = 5, 3+2 = 5, 4+1 = 5, 2+6 = 8 => the sum is 8555. You added up each digit on its own but still came up with the right result.
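That digit-by-digit idea, plus carrying, is essentially how a 32-bit machine adds 64-bit numbers: split each value into two 32-bit halves, add the halves, and propagate the carry. A small Python sketch (illustrative only, not how any real game server does it):

```python
# Adding two 64-bit values using only 32-bit chunks, like adding
# a long number digit by digit with a carry.
MASK32 = 0xFFFFFFFF  # lowest 32 bits

def add64_on_32bit(a, b):
    """Add two 64-bit integers using only 32-bit-wide additions."""
    lo = (a & MASK32) + (b & MASK32)    # add the low halves
    carry = lo >> 32                    # did the low half overflow?
    hi = (a >> 32) + (b >> 32) + carry  # add the high halves plus the carry
    return ((hi & MASK32) << 32) | (lo & MASK32)

x, y = 2**33 + 5, 2**33 + 7
print(add64_on_32bit(x, y) == x + y)  # True
```

Two 32-bit additions instead of one, which is exactly why oversized values cost extra operations on narrower hardware.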
Yes, I do, but I do not know how computers perform these calculations.
What I can tell you is that I now have conflicting information on how it is done.
Either every number, regardless of how small it is, gets translated into a 32- or 64-digit number, or smaller numbers produce shorter digit strings.
Or, in the context where it was mentioned, the implication was that reducing the damage would not matter, because the number would still be so high that it would translate into a 32- or 64-digit number anyway, and therefore not have much of an effect.
Well, going from my example: for a computer, calculating
00001000000000000000000000000000
+00000100000000000000000000000000
=00001100000000000000000000000000
takes the same amount of time as
00000000000000000000000000010000
+00000000000000000000000010000000
=00000000000000000000000010010000
even though the first value is much larger.
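You can verify both fixed-width sums above in a couple of lines; either way it is a single 32-bit addition for the hardware, so the magnitude of the operands makes no difference:

```python
# Both additions from the example above, printed as 32-bit binary strings.
a1 = int('00001000' + '0' * 24, 2)  # the large first operand
b1 = int('00000100' + '0' * 24, 2)
print(format(a1 + b1, '032b'))  # 00001100000000000000000000000000

a2, b2 = 16, 128                    # the small operands (10000 and 10000000)
print(format(a2 + b2, '032b'))  # 00000000000000000000000010010000
```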
D3 doesn't use ints but floats for its calculations, afaik. Their representation is a bit different, but the idea is still the same.