Remove or redesign Area Damage

Hey Darth!

Sort the qualifying enemies in the clip array by distance from the center (if that data point is available) and keep only the first 5 elements.

In my world, keeping the first 5 looks like this: array_slice($enemies, 0, 5);
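
In case it helps, here is a slightly fuller PHP sketch of that idea (assuming the server can hand over each enemy's distance from the center of the hit; the field names are made up for illustration):

<?php
// Hypothetical enemy records; only 'distance' matters for this sketch.
$enemies = [
    ['id' => 1, 'distance' => 4.2],
    ['id' => 2, 'distance' => 1.1],
    ['id' => 3, 'distance' => 9.8],
    ['id' => 4, 'distance' => 0.5],
    ['id' => 5, 'distance' => 6.3],
    ['id' => 6, 'distance' => 2.7],
];

// Sort by distance from the center of the hit, closest first...
usort($enemies, fn($a, $b) => $a['distance'] <=> $b['distance']);

// ...and keep only the first 5 as Area Damage targets.
$adTargets = array_slice($enemies, 0, 5);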

Only they would know what is in the realm of possible.

2 Likes

hmmm that's definitely an interesting way to do it.

1 Like

The question is: how many calculations is this in total, especially compared to the actual number of calculations coming from uncapped Area Damage?

Nope
Not even close.

AD scales exponentially, which means if you tried to compensate for it with pure raw damage you would have waaaaaay too much on single targets or smaller groups, while missing out on huge packs.

It looks more like it would increase the calculations than reduce them.

At the moment (as far as I know) AD checks the position of every enemy first, then it calculates the damage, and after that another position check is made.
If they now also had to make sure that the effect is only applied to a certain number of targets within the radius, instead of just everything within it, it would only make things worse.
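
Very roughly, the flow you are describing might look something like this in PHP; this is purely a guess at the claimed steps, the helper names are invented, and none of it is actual game code:

<?php
// Purely a guess at the claimed Area Damage flow described above, NOT actual game code.

// Flat 2D distance; the real game geometry is unknown here.
function distanceBetween(array $a, array $b): float {
    return sqrt(($a['x'] - $b['x']) ** 2 + ($a['y'] - $b['y']) ** 2);
}

// Returns a map of enemy id => Area Damage dealt by one hit landing at $hitPos.
function areaDamageProc(array $enemies, array $hitPos, float $hit, float $adPercent): array {
    // 1) First position check: which enemies are within the 10 yard radius?
    $inRadius = array_filter(
        $enemies,
        fn($e) => distanceBetween($e['pos'], $hitPos) <= 10.0
    );

    // A target cap would add an extra step right here: sort $inRadius by
    // distance and array_slice() it down to N entries before the loop.

    $applied = [];
    foreach ($inRadius as $enemy) {
        // 2) Damage calculation for each qualifying enemy.
        $damage = $hit * $adPercent;

        // 3) Second position check before the damage is actually applied.
        if (distanceBetween($enemy['pos'], $hitPos) <= 10.0) {
            $applied[$enemy['id']] = $damage;
        }
    }
    return $applied;
}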

Balance needs to be reached ofc. But most agree that nerfing some heavily AD-reliant builds while buffing others for diversity is welcome. To max AD, zbarb skills are crucial. The majority of zbarbs lack those skills outside of the elite zbarb categories.

The AD formula is similar to (1 + AD%) * number of mobs (10 yard radius) * 20% proc chance.
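
If I read that right, a back-of-the-envelope version of it could look like this; the 20% proc chance and the 10 yard radius come from the formula just above, everything else is just illustration:

<?php
// Rough expected Area Damage multiplier for one hit, per the formula quoted above.
// $adPercent is the character's Area Damage as a fraction, e.g. 1.5 for 150%.
function expectedAdMultiplier(float $adPercent, int $mobsInRadius): float {
    $procChance = 0.20; // AD procs on roughly 20% of hits
    // Each proc is worth $adPercent of the hit against the mobs within 10 yards.
    return (1 + $adPercent) * $mobsInRadius * $procChance;
}

echo expectedAdMultiplier(1.5, 30); // 15, i.e. ~15x extra expected damage under this reading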

The calculation strain on the server is too heavy, especially with AD and DoT interactions. A DoT change would be too complicated; the AD calculation is much simpler to control.

1 Like

But should a mechanic like Area Damage, which scales up exponentially with more enemies (as opposed to ordinary Area of Effect attacks, which just scale up linearly), even be a thing in the first place?

I would argue that removing Area Damage makes it easier to balance skills, items, and builds, as well as environments and density, because exponentially scaling damage is no longer a factor.
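
To put toy numbers on that comparison, here is what the damage-event counts per cast could look like, using the 20% proc chance quoted earlier; the function and its output are an illustration, not data-mined behavior:

<?php
// Toy count of damage events per cast (illustration only, not data-mined behavior).
// Plain AoE: one damage event per enemy hit.
// Area Damage: each hit enemy can proc an extra event onto every *other* nearby enemy.
function eventCounts(int $enemies, float $procChance = 0.20): array {
    return [
        'aoe' => $enemies,
        'ad'  => $enemies * ($enemies - 1) * $procChance, // expected extra AD events
    ];
}

print_r(eventCounts(10));  // aoe: 10,  ad: 18
print_r(eventCounts(50));  // aoe: 50,  ad: 490
print_r(eventCounts(100)); // aoe: 100, ad: 1980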

1 Like

Yes – the only way to know is to actually write the server/client code and test it. Some calcs are less impactful than others, and sorting small arrays is usually not resource intensive, but no one will ever know without setting it up and running stress tests on it.

1 Like

Hi, PTR folk. ~ !!! ~ Great post, I would like it to be a sticky note in this forum ~ !!! ~

So my idea for addressing this problem this season, in the short term, is to change the meta by reworking some of the fundamental gemstones of the game to make them much stronger game changers. All would agree with me that Path of Exile is an excellent example of how this would affect the meta.

Removing Area Damage very soon??? I personally don't think that is great, because some guys would be very disappointed. !?
:wink:
Rather than that, I would like to see more people able to push the leaderboard with empowered skills that aren't laggy and don't need heavy calculations. That is the best I can recommend to dodge this issue with the system until the developers get a deeper, more advanced solution for how to make the system and network performance stronger. ~ !!! ~

Hi, PTR folk. ~ !!! ~ Me again. ~ :sunny: ~

Also, to make a quality of life improvement in Dia3RoS:
For me, while we are able to test on the PTR server, remove the “normal quality Legendary”, which means we will get a balanced quantity of yellow items to reroll our good!!! ancients and maybe the primals we got ~ :sunny: ~
It would also make the game more enjoyable again, because when a legendary drops you already know that it is a good one ~ :sunny: ~

And maybe consider a new tab for “Paragon” points where the “Magic Find” bonus gets a new revival.
I mean, you could easily make code where, once we are granted the skill, we customize what specific item affinity we want to farm, and then remove “Kadala” completely from the game ^^ :stuck_out_tongue_winking_eye: ^^

And finally, when most ranked players are able to use fully augmented primals, we have less RNG in the ladder, which could mean the ranked tiers are more comparable to the actual performance the players have shown ~ :sun_with_face: ~

So there are lots of things you can talk about, and the game can be great again. Thanks for reading. See you in the rifts. ^^
:love_letter:

JanWar

The only ones that are pixels in this list are stomp necro and twister with bracers only for lesser enemies …

honestly though every class “can” pixel …

someone tell me why.

Sounds good. I’ve never once considered area damage on anything.

/signed if it will help others

I go for redesign; maybe it made sense at the time, but there's no reason why it should be overly complicated. It's like AD proc by yards by enemies minus a single one by area damage… a crazy amount of calcs.

You could simply tone it down to say 5 enemies xx spell hits = xx damage % and every 5 enemies after that = xx damage %. That would tone down calculations a lot and help with lag while not nerfing the concept completely.
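
If I follow that, the bucketed version being proposed might look roughly like this; the 5-enemy step and the per-bucket percentage are placeholder values, not suggestions:

<?php
// Sketch of the bucketed idea: instead of an AD proc per pair of enemies,
// count the enemies once and grant a flat bonus per full bucket of 5.
function bucketedAdBonus(int $enemiesInRadius, float $bonusPerBucket = 0.20): float {
    $buckets = intdiv($enemiesInRadius, 5); // every 5 enemies = one bucket
    return $buckets * $bonusPerBucket;      // e.g. 0.20 = +20% damage per bucket
}

echo bucketedAdBonus(17); // 0.6, i.e. +60% damage, from a single calculation per cast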

I think that would even cause more calculations, assuming I understood your idea correctly.

The size of the damage number is not important; for example, 293 damage can translate into
000111001010101010110001101110000101010111110 and 47.256.286.252.952.473
damage can translate into 110010101100011110001110000111011000100111101
for the CPU (just some random examples to illustrate the point).

What is important is the number of calculations that have to be made per second, and the only way to reduce that is either to remove Area Damage completely or to redesign it somehow so that it does not cause so many calculations to begin with.

Erm…

293 decimal = 100100101 binary

000111001010101010110001101110000101010111110 binary = 3,939,931,458,238 decimal

I thought they were all turned into either 32- or 64-digit numbers, depending on the system you use. At least that was how it was explained to me.

Do you know how the decimal system works? The decimal system is what you usually use (0-9).
It goes 0, 1, 2, 3… 9. Then for “ten”, it's 1×(ten)+0. Eleven is 1×(ten)+1. For “one hundred”, it's 1×(ten×ten)+0+0. And so on.

It works the same way for binary, but instead of going 0, 1, 2… 9, it just goes 0, 1, “two” (10 => 1×“two” + 0). Three (11) is 1×(two)+1, and “four” is 1×(two×two)+0+0 (100). As you can see, the math adds up.

You can represent any number with any number of digits in any base. In base 10, you can represent 24 with 32 digits if you want: 00000000000000000000000000000024. As you can see, a few of those digits are unnecessary.

What it means for a computer to be 32-bit or 64-bit is that its insides (registers) can hold up to 32 or 64 bits (or more). So ofc, a computer, to do 36+24, would have its registers set to
00000000000000000000000000000036
+00000000000000000000000000000024
=00000000000000000000000000000060
Well, of course on a computer the registers are binary, so it would be the binary representation of 36, 24 and 60 but I can’t be bothered.
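
For what it's worth, here is a quick way to print those three registers as 32-bit binary, using nothing but standard PHP decbin() and str_pad():

<?php
// Print 36, 24 and 60 as 32-bit binary strings, leading zeros included.
foreach ([36, 24, 60] as $n) {
    echo str_pad(decbin($n), 32, '0', STR_PAD_LEFT), PHP_EOL;
}
// 00000000000000000000000000100100   (36)
// 00000000000000000000000000011000   (24)
// 00000000000000000000000000111100   (60)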

It goes without saying you can calculate values above 2^32 on a 32-bit computer, but it takes more time because the computer can't do it in a single operation.
Imagine having to do 2434 + 6121 mentally. I don't know how you do it, but personally I go 4+1 = 5, 3+2 = 5, 4+1 = 5, 2+6 = 8 => the sum is 8555. You added up each digit on its own but still came up with the right result.
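
That digit-by-digit method is essentially how a machine that is too narrow for a number adds it anyway; here is a small PHP sketch of the same idea, with carries included (the 2434 + 6121 example happens not to need any):

<?php
// Schoolbook addition of two non-negative integers given as decimal strings,
// one digit at a time, right to left, carrying when a column exceeds 9.
function addByDigits(string $a, string $b): string {
    $a = str_pad($a, max(strlen($a), strlen($b)), '0', STR_PAD_LEFT);
    $b = str_pad($b, strlen($a), '0', STR_PAD_LEFT);
    $result = '';
    $carry = 0;
    for ($i = strlen($a) - 1; $i >= 0; $i--) {
        $sum = (int)$a[$i] + (int)$b[$i] + $carry;
        $result = ($sum % 10) . $result;
        $carry = intdiv($sum, 10);
    }
    return $carry ? $carry . $result : $result;
}

echo addByDigits('2434', '6121'); // 8555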

Yes, I do, but I do not know how computers calculate this information.

What I can tell you is that I now have conflicting information on how it is done.
Either every number, regardless of how small it is, gets translated into a 32- or 64-digit number, or smaller numbers produce shorter digit strings.

Or, in the context in which it was mentioned, the implication was that reducing the damage would not matter, because the number would still be so high that it would still translate into a 32- or 64-digit number anyway, and therefore it would not have much of an effect.

Well, going from my example, for a computer, calculating
00001000000000000000000000000000
+00000100000000000000000000000000
=00001100000000000000000000000000
takes the same amount of time as
00000000000000000000000000010000
+00000000000000000000000010000000
=00000000000000000000000010010000
even though the values in the first sum are much larger.

D3 doesn't use ints but floats for the calculations afaik; their representation is a bit different, but the idea is still the same.

1 Like

Yes, that is what I mean.
afaik it doesn't matter if it is 300 billion damage that needs to be calculated or just 300 damage.

What matters is the number of calculations, not the size of the number. At least that is how it was explained to me.

1 Like

Awesome Windforce, too bad it will never happen…