they are mathematically the same overall damage increase; the illusion of choice.
is this intentional?
Wait, I'm not sure I am following your question. A bonus to critical strike damage would be a bonus to the critical strike damage, while a bonus to land a critical strike only increases the chance to crit, not the actual damage the crit would cause.
They are not the same mathematical damage increase; you did the math incorrectly.
I would like to see the math on it, but I can maintain like 80%+ Crit most of the time, so no real reason for me to push it anymore when almost all my spells will crit so the damage is better.
I am guessing the math weighs the % chance to crit against the increase in damage done. The roughly 3-4% increase in crit chance amounts to more than the 4% increase to actual crit damage, assuming that is what you are implying with whatever math you are doing.
how do you even come to this conclusion when at a glance you can tell that crit has a cap and is useless after 100% lol
Getting over 100%+ crit during combat through passives is not increasing your damage at all
To be fair, some classes can benefit from crit above the cap.
Assuming he's talking about a Mage and is Fire, he could be going the crit Mage build. Combust gains Mastery = half Crit rating, so it could add combat increases depending on the spec. I am doubtful that is what OP is going to mention, though.
Yea, I've been wanting to try the 100% crit Fire build, but even after you get to 100% you would then want Masterful because of the Blaster Master/Combust interactions as well; stacking more crit after 100% would not be very good.
if you had a 10% chance to crit, increasing your “crit from all sources” by 10% would give you 1 more percentage point of crit. If you did 100 dps, 1 more percentage point of crit would increase your dmg by 1 dps (1%), if you had no other class/azerite/etc factors. right?
if you had a 10% chance to crit, increasing your crit dmg by 10% would make 10% of your attacks do 110% extra dmg instead of 100% extra. 10% extra damage on 10% of your strikes is 1% extra dps overall.
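The two posts above can be sketched with a toy model (made-up numbers; this deliberately ignores procs, caps, and class mechanics, which the later replies address):

```python
# Expected DPS under a simple crit model: non-crits deal base damage,
# crits multiply it by crit_multiplier (2.0 = "100% extra damage").
def expected_dps(base_dps, crit_chance, crit_multiplier):
    return base_dps * (1 - crit_chance) + base_dps * crit_chance * crit_multiplier

base = expected_dps(100, 0.10, 2.0)         # 10% crit, crits hit for double
more_chance = expected_dps(100, 0.11, 2.0)  # +1 percentage point of crit chance
more_damage = expected_dps(100, 0.10, 2.1)  # +10% crit damage (2.0 -> 2.1)

print(base, more_chance, more_damage)       # both buffs land at ~111 DPS
```

Under this vacuum assumption both buffs add the same ~1% DPS, which is exactly the equivalence being argued, and exactly what breaks once crit is capped or procs key off crit chance.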
This game isn’t a vacuum where people have flat stats and are only able to pick between 2 corruptions
Toons can easily reach 100% crit during combat when their spec stacks it through trinkets/passives. 12% more chance to crit will do nothing in cases where you have capped it, but 4% more crit dmg will.
Strikethrough becomes better when you have Severes; they don't stay the same.
I agree, but on a classless character with no buffs/azerite, using some sequence of attacks, with no other stat increases, the choice between “crit from all sources” and “+crit damage” is pointless because the increase to damage is the same while under 100% crit.
but why would you argue for it then?
the situation you are describing doesn’t exist. in a real scenario +crit dmg is better once you are reaching the crit chance cap.
It seems kind of pointless to try and say that they are both the same increase when they are not the same in actual, real situations.
One of them increases your “chance on crit” procs, the other doesn’t.
Depends on the class and spec, like for Feral I need only 52% crit for Incarnation and Shred spam as it will always crit on whatever I hit. At that point I would want the crit damage corruption.
No, they are not mathematically the same overall damage increase because there are many specs that interact very, very differently with crit than others. Your math sucks.
yeah I can kinda see what he means. He’s just saying that “+crit dmg” and “+crit from all sources” would produce the same increase in damage, if it weren’t for class procs and effects.
Did you engage in PvP recently? Because crits do less damage in PvP than in PvE.
If you had a baseline of 10% crit chance and 100% crit damage, 10% crit DMG would be of equivalent EP value (excluding any nuanced class/spec mechanics that depend upon one or the other unequally). However, adding 1% crit RATE onto a 10% crit rate is NOT a net increase of 1% DPS; for 1% additional crit RATE to increase DPS by 1% (relatively speaking, not absolutely), you would need to have previously had 0% crit rate.
Regardless of spell power used (we will use 1,000 spell damage/intellect and a hypothetical spell with a coefficient of 100%), the base non-crit damage would obviously be 1,000; factoring in the base 30% crit RATE brings the average up to 1,300; adding the additional 1% crit rate brings the average up to 1,310. Thus, we do 1,310 / 1,300 ≈ 1.0077, i.e. only about a 0.77% relative DPS increase, not 1%.
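A quick check of that figure, using the same hypothetical 1,000 base hit and crits for double damage as in the post above:

```python
# Average damage per cast at a given crit rate, crits dealing +100% damage.
base_hit = 1000
avg_30 = base_hit * (1 + 0.30)   # 30% crit rate -> 1,300 average
avg_31 = base_hit * (1 + 0.31)   # +1% crit rate -> 1,310 average

relative_gain = avg_31 / avg_30 - 1
print(f"{relative_gain:.2%}")    # ~0.77%, not a full 1%
```

So the higher your existing crit rate, the less each additional point of crit rate is worth in relative terms.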
ProTip: one of the best ways I've learned to grasp trends when comparing two or more scaling variables, and the simplest way to wrap your head around what's going on to the point where you can make predictions, is to INCREASE THE CONTRAST. This is especially true when trying to assign a value to two items you are comparing in the game, or any other time you must decide between relatively minor differences. Someone once asked me, "item 1 has 5 mp5, and item 2 has 10 spell power. Which should I go with?"
IMPORTANT: when using this method, you must be fully aware of any thresholds that would fundamentally affect the value of any of the variables you are comparing. In this case, mp5 would obviously be meaningless if you were already regenerating the maximum amount of mana you could use in 5 seconds. You must also consider the rate of diminishing returns (or magnifying, aka potentiating, returns, as is the case with mp5 here). Most people would think the more mp5 you have, the lower its EP value, because with almost every other stat that is true.

When assigning value to mp5 or any regeneration stat, how much of it you actually have is really kind of irrelevant; what is relevant is the net loss: mana used per 5 minus mp5. Say in some scenario you used on average 1,000 mana per 5 seconds, your regen was 980, and you had an item that could give you an extra 10, putting it at 990. This absolute increase of 1% (from 98% to 99% effective mitigation of average mana used) would have the result of DOUBLING how long you could effectively cast before being limited by a lack of mana!

This concept applies to TANK MITIGATION as well. The absolute increase of 1% from 98% to 99% mitigation is a relative damage reduction of 50%: taking 1 out of a 100-damage hit (99% mitigation) instead of 2 (98% mitigation). This means the step from 98% to 99% mitigation has the same relative survival benefit as the entire first 0-50% (that is, doubling the time it takes to kill you, assuming no heals or regen).
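The regen-threshold claim is easy to verify with the numbers above (the 10,000 mana pool is an assumption for illustration):

```python
# Time until out of mana, given average mana used and regenerated per 5 seconds.
def seconds_until_oom(mana_pool, used_per_5, regen_per_5):
    net_drain = used_per_5 - regen_per_5   # net loss per 5-second window
    if net_drain <= 0:
        return float("inf")                # regen keeps up: never OOM
    return mana_pool / net_drain * 5

before = seconds_until_oom(10_000, 1000, 980)  # 20 net drain per 5s
after = seconds_until_oom(10_000, 1000, 990)   # 10 net drain per 5s
print(before, after, after / before)           # casting duration doubles
```

The same arithmetic underlies the mitigation example: halving the net damage (or mana) that gets through doubles your effective duration.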