On feats at 1st level



Mitchellnotes
2015-01-03, 09:42 AM
I love feats. Always have. I love the mechanics of them and how they add to a character. I loved them in 3.x and even more in 5e.

That being said, there is one thing about how feats are set up in 5e that makes me sad. Conventional wisdom is that you take your primary stat to 20 before taking a feat (unless you're a variant human). This means that most characters won't take their first feat until level 12. While this reduces the mechanical flexibility of classes a little bit, I think what is worse is that it reduces how individualized each character can be as well. I always, always find myself wishing I had just a little bit more space for a feat.

Clearly having a feat at first level isn't overpowered, because variant humans can get one, though they give up a bit to get it. What it also does is make variant human the "go-to" race a lot of the time.

Giving everyone a blanket feat at level 1 doesn't quite work either. It creates a "goal" starting stat of 18 rather than 16, which throws off balance and reduces the appeal of fighters and rogues a little, because an extra feat reduces the impact of the extra ASIs they get.

So the goal is to somehow get a feat online earlier without disrupting the balance of the game too much. Here is my proposal: give everyone a bonus feat at character creation, with the caveat that if using point buy, no ability score can be over 16 after racial and feat modifications. Variant human would likely need to be adapted or removed. In exchange, remove the level 19 feat from all classes. This keeps the total number of ASIs and feats at level 20 the same and keeps stat growth consistent throughout as well. It does reduce the impact of multiclassing a little, but missing a class's capstone/highest-level spells should still be a large deterrent to dipping (note that the bonus feat would only be available at character creation, so it wouldn't be available with a new level 1 when multiclassing).

The one instance where this would have an overall net negative impact would be a character who wanted to use every ASI to increase ability scores by 2 each time, and even then they would only miss out on one increase.

Kurald Galain
2015-01-03, 09:51 AM
Conventional wisdom is that you take your primary stat to 20 before taking a feat

Conventional wisdom says nothing of the sort. We've had frequent threads about this subject, and many players agree that getting a character-defining feat is both more flavorful and mechanically better than becoming 5% better at things you could already do. A common bit of advice is to not take any stat increases until you run out of good feats for your character; how fast that happens depends on your class.

Amnoriath
2015-01-03, 09:57 AM
I don't think it is the conventional wisdom to max your stat at 20 before taking a feat. Getting your primary stat to 20 is indeed a goal at some point, but picking up options and strategies should come first, especially under bounded accuracy. Your rule isn't bad, actually, and in a way people could simply choose to recreate a variant human by sacrificing 2 bumps for the feat and 1 for a skill proficiency.

AstralFire
2015-01-03, 10:11 AM
I gave everyone a feat, and gave standard humans an extra skill and background. Seems to be working out, so far.

Mitchellnotes
2015-01-03, 10:34 AM
Conventional wisdom says nothing of the sort.

"Conventional wisdom" is probably too strong a phrase, fair enough.

Z3ro
2015-01-03, 10:38 AM
Alternatively, you could try this instead:

Drop variant human, everyone can go regular human if they want. Then, instead of getting rid of the ASI at 19 (a small penalty the party may never reach) and capping stats at 16 during creation, get rid of each race's +2 stat boost. It equally affects everyone, no stat starts higher than 16, and it's balanced out by getting the feat. Depending on your players (and how much they like maxing stats) you could even make it optional for every race; trade out your +2 stat for a feat.

Rogue Shadows
2015-01-03, 10:39 AM
I just recommend giving Ritual Caster to everyone at 1st level for free. That's about all I have to say about that.

Selkirk
2015-01-03, 12:04 PM
Conventional wisdom says nothing of the sort. We've had frequent threads about this subject, and many players agree that getting a character-defining feat is both more flavorful and mechanically better than becoming 5% better at things you could already do. A common bit of advice is to not take any stat increases until you run out of good feats for your character; how fast that happens depends on your class.

this...it's a trap with ability scores. the appeal of a maxed stat is strong, but the benefit is actually very small, and some feats give a +1 to a stat, so it's even less of an either/or.

mephnick
2015-01-03, 12:34 PM
I just give everyone a bonus feat when picking a subclass at 3 and eliminate variant human. *shrug*

I might end up limiting the bonus feat to certain choices that might not be taken otherwise, but right now I'm not seeing a problem.

Mechaviking
2015-01-03, 01:47 PM
My group is playing the rules as written and they're great; we even have a non-variant human or two who are awesome.

Personally this is a non-issue. I agree that picking feats at 4 and 8 is a good choice, since it expands versatility; bounded accuracy takes care of the rest.

If you need a feat at 1st level, pick variant human (or tell players to do so). If you want more stat points, darkvision, and a bunch of other useful abilities, pick another race or refluff a race.

Selkirk
2015-01-03, 01:57 PM
the more i think about it, the more i'm inclined to like the idea of giving everyone a feat at first level (could be 2nd level, which might make sense flavor-wise; they have progressed a little bit and learned something).

the problem i have found is that characters are relatively weak due to bounded accuracy. i could accept that...it's that without feats most characters simply don't have that much to do in a single turn. and it seems like even low-level creatures have 'feats' (goblins with attack and hide...numerous creatures with multiattacks), but pcs are basically one swing or one spell/cantrip and turn over. it just gets boring and strings out every fight needlessly. giving players feats or actions would speed things up and make things much more interesting without losing any balance.

note: character weakness reaches points of near absurdity...would you rather play 4 first-level characters or 4 goblins?

Shadow
2015-01-03, 02:17 PM
In our game we do something similar, but it isn't a 1st level Feat.
Our game uses standard array. Variant Human isn't an option, but human gets a single +2 and three +1s, along with one extra skill or tool or saving throw proficiency.
We have also split the feat list into Major feats and Minor feats. All minor feats get a +1 to any ability score of your choice (possibly instead of the specific score listed). So the crappy feats that are really flavorful but no one likes are actually pretty desirable and competitive with the feats that people seem to drool over.

At 1st level, every character gets one Ability Score Increase, but the +2 to one ability choice is not an option. Players can use that 1st level ASI to take two +1s, or a Minor feat and a +1, or a Major feat.

odigity
2015-01-03, 03:21 PM
Why does every thread here have to start with contrarians? Sure, not everyone agrees, but many people feel (possibly the majority) that there's no benefit greater than boosting your primary stat, except when a feat is core to your basic mechanic (like Polearm Master). So it's a valid point to bring up.

As for the OP's concern, there's a simple solution: leave Variant Human as is. Give everyone one free feat at level 1 (so Variant Human gets two), but as with Variant Human's feat, it must actually be a feat, not an ASI. That encourages people to accessorize their character rather than just bump a number. (People can still get a half-feat with a +1 ASI, but I don't see that as a big deal.)

Feldarove
2015-01-03, 04:41 PM
I think you can just give everyone a bonus feat at first level. If everyone gets it, it should be pretty balanced. If you tell people this beforehand, they will hopefully optimize around this option and make characters accordingly. If someone wants to play a class and feels there really isn't a good feat for them, refer them to these forums.

Kurald Galain
2015-01-03, 05:39 PM
Why does every thread here have to start with contrarians?

Because when people start a thread by claiming that everyone agrees on X, or, as you just did, assume (based on no data whatsoever) that a silent majority will surely agree with their opinion, they deserve to get called out if they're actually mistaken about that.

odigity
2015-01-03, 06:40 PM
Because when people start a thread by claiming that everyone agrees on X, or, as you just did, assume (based on no data whatsoever) that a silent majority will surely agree with their opinion, they deserve to get called out if they're actually mistaken about that.

You're not accurately representing what I actually said:


Sure, not everyone agrees, but many people feel (possibly the majority) that there's no benefit greater than boosting your primary stat, except when a feat is core to your basic mechanic (like Polearm Master). So it's a valid point to bring up.

Kurald Galain
2015-01-03, 06:42 PM
You're not accurately representing what I actually said:

Yes, I am. You're assuming that the majority agrees with you, and you have provided zero data to actually support that claim.

Shadow
2015-01-03, 06:51 PM
Yes, I am. You're assuming that the majority agrees with you, and you have provided zero data to actually support that claim.

That isn't what he said. He didn't make an assumption that the majority agreed. He made an observation that many agreed, and a claim that it was possible that it was a majority.
So yes, you misrepresented what he said.

Person_Man
2015-01-03, 08:48 PM
It's my opinion that between 1st level and 11th-ish level, at-will damage (as opposed to stuff that requires limited resources or specific circumstances, like Action Surge, Assassinate, Metamagic, etc.) is pretty carefully balanced between all of the classes. Feats can throw this careful balance greatly out of whack. So I'm guessing that the designers intentionally created a system that discourages players from taking Feats until higher levels (when it all starts to break down anyway, because spell uses become so numerous). They even explicitly made Feats optional and made the Human that gets a Feat at first level a variant, so that DMs who prefer balance can keep a tighter rein on it.

Anywho, I also like Feats a lot and have no problem with lots of different options. I'm just guessing at the designer's intent.

Mitchellnotes
2015-01-03, 09:52 PM
It's my opinion that between 1st level and 11th-ish level, at-will damage (as opposed to stuff that requires limited resources or specific circumstances, like Action Surge, Assassinate, Metamagic, etc.) is pretty carefully balanced between all of the classes. Feats can throw this careful balance greatly out of whack.

Thanks for your insight on this. I think this is why I'm most hesitant to just grant a bonus feat. With a +2 racial and a +1 from a feat, a character could start with an 18 stat with point buy, which would throw balance off a lot.

In terms of other feats, do you see most of the other damage coming from bonus action attacks/reactions (Polearm Master and Crossbow Expert, with a favorable ruling)? Those would certainly make dual wielding less desirable. It seems like at lower levels the accuracy penalty from Great Weapon Master and Sharpshooter would lead to swingy damage at best, especially before extra attacks kick in.

ghost_warlock
2015-01-03, 10:25 PM
Thanks for your insight on this. I think this is why I'm most hesitant to just grant a bonus feat. With a +2 racial and a +1 from a feat, a character could start with an 18 stat with point buy, which would throw balance off a lot.

Perhaps anecdotal, but in the group I'm DMing, I gave everyone a feat at 1st and one of the players used that to get an 18 to start. It hasn't really been noticeable at all. The player was even complaining a bit last session that he kept missing with his attacks because of bad luck with the dice.

If a player could find a way to consistently get and keep advantage, that would have a noticeable effect, but a simple +1 doesn't in my experience.

odigity
2015-01-03, 11:24 PM
Thanks for your insight on this. I think this is why I'm most hesitant to just grant a bonus feat. With a +2 racial and a +1 from a feat, a character could start with an 18 stat with point buy, which would throw balance off a lot.

I don't see why this is bad. If you roll, you can start with a 20 (roll 18, pick race with +2). The game is designed to handle that already.

As long as all players have the same options/benefits, the balance is usually fine.

Kurald Galain
2015-01-04, 06:14 AM
If a player could find a way to consistently get and keep advantage, that would have a noticeable effect, but a simple +1 doesn't in my experience.

This bears repeating. A +1 is the very smallest bonus that can exist in the game; so getting a +1 or -1 to anything is really not a big deal. Some people feel it should be, but realistically or mathematically it's clearly not.

odigity
2015-01-04, 09:58 AM
This bears repeating. A +1 is the very smallest bonus that can exist in the game; so getting a +1 or -1 to anything is really not a big deal. Some people feel it should be, but realistically or mathematically it's clearly not.

That's technically a non sequitur. Just because it's the smallest bonus doesn't mean it's not a big deal.

Rogue Shadows
2015-01-04, 10:01 AM
That's technically a non sequitur. Just because it's the smallest bonus doesn't mean it's not a big deal.

It is a 5% increase. It is not nothing, but no, it really isn't a big deal.

Feldarove
2015-01-04, 10:29 AM
I know this is maybe not the best way to look at it, but it is actually the way that a lot of us view it (without doing the math).

What you are all saying is: "+1 is a small thing. 1 out of 20 is 5%. Who cares?"

Now, whether or not someone cares from that perspective isn't what I am getting at, but I will address it.

But it's kinda like saying "I only got a 2,000 dollar raise this year." That might not seem like a lot now, but its effects stack later on. If you work for 5 years getting a 2,000 dollar raise each year, well, now you make 10,000 dollars more. Say you started at 30,000 a year and are now at 40,000 a year; that looks almost like a big promotion.

D&D isn't as stringent as work raises, but it can feel like it at times. How long are you going to be at your company? How long are you going to be playing that character? What other benefits are offered at your company? What magical items are you going to receive? What are some perks of the job that offset your salary? What are some feats/class features that offset low stats?

But what I am really trying to get at is....

When you look at raising your ability score by 1, you aren't thinking of it as a 5% increase.

Let's say you're a wizard, you used an array, you're playing a human, and you have a 16 Intelligence. You want to get to 20. You need 4 points to get there. A stat increase of 1 feels like a 25% boost, because you're a quarter of the way to 20. That isn't how the 1-point increase affects your statistical rolling; it's how you perceive the progression of your character. As someone said, you might have a 20 Intelligence and miss on all of your fire bolts for an entire session. But the assurance that the ability increases for your character are complete is a warm, comfy feeling, and a lot of us like to get there as quickly as possible.

odigity
2015-01-04, 10:41 AM
It is a 5% increase. It is not nothing, but no, it really isn't a big deal.

Everything Feldarove said, plus:

It's often discussed on these forums that 5% is not the only way to look at it. See, the game is balanced at the margins. Each +1 is not a linear improvement, because it's not about having a 5% better chance -- you are far more likely to be off by 1 than off by 20. That +1, when you're in the middle of the expected zone, is going to have the practical effect of improving your chances of hitting by 10-20% relative to not having the +1.

I don't know the exact math, but I know it's not linear. To illustrate with an extreme example, if you already had a +20 to hit somehow, the +1 would literally add 0% chance most of the time.

Selkirk
2015-01-04, 12:04 PM
in theory over 10 combats/encounters the +1 would help but in practice it ends up 'helping' against weak mobs and encounters that the party is going to win with or without the +1 (long battles with mobs give you lots of swings). which would still be fine if you got the +1 for free but the stat increase comes at the cost of a feat...which as many have noted will give you more utility/moves than the stat increase.

stat increases are a trap and an easy one to fall into. logic and frankly ego almost demand the stat increase (who doesn't want the 20 str fighter?) and also there is a sense of closure-now i can close that 'chapter' of my char and focus on feats. but it's the backwards way of doing things-you end up with a weaker build, the +1's simply are not equivalent to bonus actions and skills...max feats first then go stat increase.

odigity
2015-01-04, 12:12 PM
in theory over 10 combats/encounters the +1 would help but in practice it ends up 'helping' against weak mobs and encounters that the party is going to win with or without the +1 (long battles with mobs give you lots of swings). which would still be fine if you got the +1 for free but the stat increase comes at the cost of a feat...which as many have noted will give you more utility/moves than the stat increase.

stat increases are a trap and an easy one to fall into. logic and frankly ego almost demand the stat increase (who doesn't want the 20 str fighter?) and also there is a sense of closure-now i can close that 'chapter' of my char and focus on feats. but it's the backwards way of doing things-you end up with a weaker build, the +1's simply are not equivalent to bonus actions and skills...max feats first then go stat increase.

I believe this is a minority opinion, though it would take a survey to prove it.

While it can be argued that feats that are core to your char's build (like Polearm Master) should have priority over stat bumps, I think you're undervaluing just how powerful a stat bump is. Especially if you're a Dex-based martial with light armor, there's almost nothing more valuable than a Dex bump (attacks, damage, ability checks, saves, AC, init...).

Kurald Galain
2015-01-04, 12:14 PM
I believe this is a minority opinion, though it would take a survey to prove it.

It is, however, mathematically sound. The fun thing about math is that, regardless of what the majority might think, math still wins. Reality is fun like that :smallamused:

Selkirk
2015-01-04, 01:03 PM
I believe this is a minority opinion, though it would take a survey to prove it.

While it can be argued that feats that are core to your char's build (like Polearm Master) should have priority over stat bumps, I think you're undervaluing just how powerful a stat bump is. Especially if you're a Dex-based martial with light armor, there's almost nothing more valuable than a Dex bump (attacks, damage, ability checks, saves, AC, init...).

the problem with the +1 mentality is that it makes sense (that is, until you start looking at it :D). a new player looking at the champion, for instance, will see improved critical (crit on a 19 or a 20) and think they are critting twice as often...which is technically true, but it's actually 5% more often. and of course it has the downside of making nat 20's rather disappointing-i was going to crit on that roll anyway.

and again not arguing against dex increases..but if one is concerned with init(which you should be)-alert feat gives a +5 to init :D.

odigity
2015-01-04, 01:07 PM
It is, however, mathematically sound. The fun thing about math is that, regardless of what the majority might think, math still wins. Reality is fun like that :smallamused:

You cannot possibly show math proving that an interesting feat is more valuable than a stat boost as a general case.

Mitchellnotes
2015-01-04, 01:28 PM
It is, however, mathematically sound. The fun thing about math is that, regardless of what the majority might think, math still wins. Reality is fun like that :smallamused:

You should make sure you are using the right math. A +1 increase is going to be more than a 5% chance increase of success. For instance, the difference between being successful on a 16-20 and a 15-20 is only the difference of a +1, but it increases your chance of a successful roll by 20%. Now, you would be correct that the chance of rolling that 15 is only a 5% chance, but to say it only increases your chances of success by 5% isn't exactly true.

Granted, as your range of success increases, you will experience diminishing returns. The difference between an 11-20 successful roll and a 10-20 successful roll is an increase of 10% success. Again, rolling that 10 is only a 5% chance, but to say that a +1 to rolls only increases your chance of success by 5% is disingenuous.

Seeing a potential increase of success by 10-20% (at the least) across several different rolls is not a negligible increase.

If you want to say your claim is supported by math, please actually use a more appropriate percentage of success before dismissing the point entirely, especially since at low levels characters see a greater portion of their total modifier from stats than from proficiency.
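
For anyone who wants the arithmetic spelled out, here's a quick Python sketch of the two examples above (the function name and layout are just mine; treat it as illustrative):

    # success means rolling `low` through 20 on a d20
    def chance(low):
        return (21 - low) / 20

    for before, after in [(16, 15), (11, 10)]:
        p0, p1 = chance(before), chance(after)
        print(f"{before}-20: {p0:.0%} -> {after}-20: {p1:.0%} "
              f"(+{p1 - p0:.0%} absolute, +{(p1 - p0) / p0:.0%} relative)")
    # 16-20: 25% -> 15-20: 30% (+5% absolute, +20% relative)
    # 11-20: 50% -> 10-20: 55% (+5% absolute, +10% relative)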

Selkirk
2015-01-04, 01:32 PM
you're right, i can't, but i would also argue you can't 'prove' that asi are better than feats. i mean, playstyles and of course encounter design dictate results (and there is no accounting for dumb luck...hey, the 20 str champion might crit twice :D). still, that doesn't mean we can't use logic and probability and our own play experiences to tell us which is actually better. common sense tells you bonus actions and skills are better than 5% gains.

and on a separate point it's just plain more fun and interesting to play feat based characters. asi's are one time adjustments on a stat sheet...lost in a blur of rolls. feats give you the chance to actually make choices and impact combats in ways a +1 to dex does not. 5e has the illusion of 'balance' only because characters are weak and taking the +1's reinforces this weakness(truly is the mmo model of character development). giving your characters 'agency' with feats is one of the few ways you can fight bounded accuracy and give your dm hell :D.

Selkirk
2015-01-04, 01:35 PM
You should make sure you are using the right math. A +1 increase is going to be more than a 5% chance increase of success. For instance, the difference between being successful on a 16-20 and a 15-20 is only the difference of a +1, but it increases your chance of a successful roll by 20%. Now, you would be correct that the chance of rolling that 15 is only a 5% chance, but to say it only increases your chances of success by 5% isn't exactly true.

Granted, as your range of success increases, you will experience diminishing returns. The difference between an 11-20 successful roll and a 10-20 successful roll is an increase of 10% success. Again, rolling that 10 is only a 5% chance, but to say that a +1 to rolls only increases your chance of success by 5% is disingenuous.

Seeing a potential increase of success by 10-20% (at the least) across several different rolls is not a negligible increase.


false. 11-20 is a 45% success rate. 10-20 is a 50% success rate...ergo 5% increase.

odigity
2015-01-04, 01:35 PM
and on a separate point it's just plain more fun and interesting to play feat based characters. asi's are one time adjustments on a stat sheet...lost in a blur of rolls. feats give you the chance to actually make choices and impact combats in ways a +1 to dex does not. 5e has the illusion of 'balance' only because characters are weak and taking the +1's reinforces this weakness(truly is the mmo model of character development). giving your characters 'agency' with feats is one of the few ways you can fight bounded accuracy and give your dm hell :D.

Sure. But let's not confuse preference and subjective value with optimization and objective measures.

odigity
2015-01-04, 01:37 PM
it's false. 11-20 is a 45% success rate. 10-20 is a 50% success rate...ergo 5% increase.

A 5% increase on a 45% chance of success is actually an 11% increase in success rate. The percentage context is ambiguous because there are two valid contexts to use it in, but the 11% context is more relevant, and hence the implied default. You're using the other one -- hence the communication gap.

EDIT: Another way to look at it is:

If you have a 45% hit chance, you have a 55% miss chance. If you get a +1, you've just converted 9% of your misses into hits.
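
A tiny Python sketch of the same idea, taking the 45% hit chance above at face value (the variable names are just mine):

    p_hit = 0.45
    p_miss = 1 - p_hit          # 0.55
    print(0.05 / p_hit)         # ~0.111: the +1 as a relative increase in hits
    print(0.05 / p_miss)        # ~0.091: share of former misses turned into hits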

Selkirk
2015-01-04, 01:39 PM
yeah, i agree, that's why i said this point was a side note and addressed the first point with the qualifier that i couldn't 'prove' it was better. i mean, it's technically unprovable, given the dynamics of a dm and a table of 4 players, how anything is going to work out. but we must use experience and knowledge and realize that 5% increases at the cost of skills and utility are a bad choice.

odigity
2015-01-04, 01:40 PM
yeah, i agree, that's why i said this point was a side note and addressed the first point with the qualifier that i couldn't 'prove' it was better. i mean, it's technically unprovable, given the dynamics of a dm and a table of 4 players, how anything is going to work out. but we must use experience and knowledge and realize that 5% increases at the cost of skills and utility are a bad choice.

As long as you continue to look at it as a 5% increase instead of 11%, we are not going to be able to have a productive conversation...

Selkirk
2015-01-04, 01:46 PM
As long as you continue to look at it as a 5% increase instead of 11%, we are not going to be able to have a productive conversation...
probably not...as it really is the only way i can look at it :D..my chance of rolling a 10 isn't 10% higher; my chance of success is higher. my chance of rolling that 10 is the same. i think we are confusing the benefits of rolling a 10 with the chance of rolling a 10.

or maybe im just wrong...:D. could be possible as well. let me read some articles. but also the 10-11 is a skew (where it's mathematically highest and mathematically least likely). we aren't arguing 10's or 11's, we are arguing to-hit rolls and spell dc's, which aren't 10 or 11, etc. but enough editing-post saved.

Mitchellnotes
2015-01-04, 01:51 PM
false. 11-20 is a 45% success rate. 10-20 is a 50% success rate...ergo 5% increase.

The 10% isn't a 10% increase overall, it's an increase of 10% compared to the 11-20 roll. (Someone upthread said 9%; their math is most likely more correct than mine, but the point remains the same.)

Regardless, the original intent of the thread was how to, in a balanced fashion, gain a feat earlier in a character's career without compromising stat growth. Clearly gaining a feat is an increase in power, as is increasing ability scores (I would argue it has a greater impact than many here are allowing).

I want my cake and to eat it too. The intent of this thread was to figure out how much time I need to spend on the treadmill to burn off the calories, not to argue about which piece is more caloric. Granted, it's an important consideration, but I feel we've lost sight of the original goal here.

Dalebert
2015-01-04, 01:51 PM
In another thread, someone pointed out an interesting perspective. A +1 will affect the game on 1 out of every 20 related rolls, e.g. saving throws and to-hits. That should roughly work out to about 5% more damage from attacks. It's 2.5% more damage from spells that do half damage on a successful save. That might be just the right amount that makes the difference sometimes. Just something to put it in perspective. Every decision represents an opportunity cost. EDIT: Oh yeah, and 1 in 20 times when a save-or-lose effect works vs. not.

That doesn't sound like much, right? Just for an example, I'll consider two of my dilemmas.

Case 1: A warlock who primarily hangs back and casts a lot of spells and EBs. I plan to take Resilient (Con) at some point, but when? I've played about 4 games with this character and have yet to make a concentration check to maintain a spell or a CON save to resist some bad effect. My strategy generally involves hiding behind the martials and blasting. All that blasting and spell-casting from behind the ranks means the 1-in-20 benefit applies to my attacks and save-inducing spells, which come up far more often. I'm definitely maxing my CHA first.

Case 2: I've only played my Moon Circle druid once, but it's clear that I will be in melee a lot. Also, a common tactic will continue to be to cast some spell like faerie fire that requires concentration and then shift and fight. I anticipate a lot of concentration checks. Plus the extra HP will just be handy, since I pointedly have an odd CON in anticipation of this feat. I'll be taking Resilient (Con) at 4th level.
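
For what it's worth, here's a rough back-of-the-envelope check of the 5% / 2.5% figures from the first paragraph, in Python (D is just an illustrative damage number, not anything official):

    D = 10.0                        # assumed average damage on a hit / failed save

    # attack roll: a +1 turns one extra face in 20 from a miss into a hit
    attack_gain = (1 / 20) * D      # +0.5 expected damage, i.e. 5% of D

    # save-for-half spell: a +1 to the DC turns one extra face in 20 from
    # "save (half damage)" into "fail (full damage)", gaining D/2 on that face
    save_gain = (1 / 20) * (D / 2)  # +0.25 expected damage, i.e. 2.5% of D

    print(attack_gain / D, save_gain / D)   # 0.05 0.025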

KhorashIronfist
2015-01-04, 01:59 PM
A 5% increase on a 45% chance of success is actually an 11% increase in success rate. The percentage context is ambiguous because there are two valid contexts to use it in, but the 11% context is more relevant, and hence the implied default. You're using the other one -- hence the communication gap.


I don't see how the percentile increase relative to your previous chances of success is relevant. If I need to roll at least an 11 to hit my enemy, and I receive an additional +1 and now only need a 10 to hit, then my chances of success have increased from 45% to 50%. As was said previously, this is a 5% increase, since each face of a twenty-sided die statistically has a 1/20 chance to land face-up. Since we have now expanded our successful rolls to encompass one additional side of the die, this represents a 5% increase in the chances of success.

Correct me if I am wrong, but what you are saying is that because that 5% represents an increase of 11% relative to the previous chances of success...I'm sorry I actually have no idea what point you are making. Could you elaborate?

I can only assume your math is as follows - 45/5 = 9, as such this means a 5% increase is an increase of 1/9th of the value of 45% which as a percentage would be rounded off to 11%. How is that relevant? How does one experience diminishing returns in this system? Whether I have a 25% chance of success or a 75% chance of success initially on a d20 roll, a +1 modifier changes my odds by one die-face, which for a d20 equates to 5%.


You should make sure you are using the right math. A +1 increase is going to be more than a 5% chance increase of success. For instance, the difference between being successful on a 16-20 and a 15-20 is only the difference of a +1, but it increases your chance of a successful roll by 20%. Now, you would be correct that the chance of rolling that 15 is only a 5% chance, but to say it only increases your chances of success by 5% isn't exactly true.

Granted, as your range of success increases, you will experience diminishing returns. The difference between an 11-20 successful roll and a 10-20 successful roll is an increase of 10% success. Again, rolling that 10 is only a 5% chance, but to say that a +1 to rolls only increases your chance of success by 5% is disingenuous.

Seeing a potential increase of success by 10-20% (at the least) across several different rolls is not a negligible increase.


I'm not arguing whether it is better to take feats than stat upgrades (though personally I would favor feats for the flavor and utility), I'm just quite baffled by your claims of statistical relevance.

Mitchellnotes
2015-01-04, 02:18 PM
I'm not arguing whether it is better to take feats than stat upgrades (though personally I would favor feats for the flavor and utility), I'm just quite baffled by your claims of statistical relevance.

Honestly, it doesn't, nor was the intent of this thread to delve into mechanical utility. Someone upthread stated that since a +1 was only a 5% increase in success (which, while true on any given d20, is comparatively a much higher increase over not having the +1), mathematics supported taking feats. This was to demonstrate that they were undervaluing the impact of ability score increases.

Kurald Galain
2015-01-04, 02:20 PM
Case 1: A warlock who primarily hangs back and casts a lot of spells and EBs. I plan to take Resilient (Con) at some point, but when? I've played about 4 games with this character and have yet to make a concentration check to maintain a spell or a CON save to resist some bad effect. My strategy generally involves hiding behind the martials and blasting. All that blasting and spell-casting from behind the ranks means the 1-in-20 benefit applies to my attacks and save-inducing spells, which come up far more often. I'm definitely maxing my CHA first.

Sure, boosting your cha is going to be better for this character than the Resilient feat... but is it really going to be better than every single other feat in the game? I seriously doubt that.

Stella
2015-01-04, 02:31 PM
That's technically a non sequitur. Just because it's the smallest bonus doesn't mean it's not a big deal.

It is a 5% increase. It is not nothing, but no, it really isn't a big deal.
And the actual answer is: It depends on the situation.


I don't know the exact math, but I know it's not linear. To illustrate with an extreme example, if you already had a +20 to hit somehow, the +1 would literally add 0% chance most of the time.
The math is fairly simple to calculate.
Need a 20? +1 doubles your chances, so a 100% increase.
Need a 19? +1 increases your chances by 50%.
Need an 18? +1 increases your chances by 33%.

Etc.

When you're hitting/saving/whatever on a 5, a +1 doesn't give you much improvement.
But when you've got a ~25-30% chance or less, a +1 is indeed fairly significant.


the problem with the +1 mentality is that it makes sense (that is until you start looking at it :D). a new player looking at the champion for instance will see improved critical (crit on a 19 or a 20) and think they are critting twice as often...which is technically true but it's actually 5% more often.
This is incorrect. They will be critting 100% more often.

Dalebert
2015-01-04, 02:34 PM
Sure, boosting your cha is going to be better for this character than the Resilient feat... but is it really going to be better than every single other feat in the game? I seriously doubt that.

Considering that Resilience is at the top of my list of desired feats for this particular character, that's a definite yes. You seem to be taking a subjective decision that depends highly on context--the type of character, your character concept, the type of situations your DM puts you in, etc. and applying a sweeping judgment. Are you suggesting that feats are always better than the stat boost, and for every character?

Kurald Galain
2015-01-04, 02:46 PM
Considering that Resilience is at the top of my list of desired feats for this particular character, that's a definite yes. You seem to be taking a subjective decision that depends highly on context--the type of character, your character concept, the type of situations your DM puts you in, etc. and applying a sweeping judgment. Are you suggesting that feats are always better than the stat boost, and for every character?

Wait, so are you claiming that there is only one good feat for warlocks (i.e. Resilience), or that there are in fact numerous good feats but you just don't like those?

Selkirk
2015-01-04, 02:48 PM
This is incorrect. They will be critting 100% more often.

i was wrong :smallfurious:...:D http://theiddm.wordpress.com/2012/08/07/guest-post-the-power-of-1/. but i still don't get it and this is what i don't get/disagree with. they actually aren't critting 100% more(or even double). they are critting 5% more. the 20 is a crit regardless of whether they took improved crit or not. baking in given hit rolls ignores what would happen without the increase. but i'm not good at math :D.

or perhaps better question-doesn't this make feats and bonus actions even more powerful. polearm master gives bonus action (d4 admittedly :smalleek:) but also aoo. would i rather have a +1 to str or a bonus action and aoo. the utility of movement and action is highly underrated. wouldn't 2 attacks be better than 1 (even at a 10% bonus or 8.33% or whatever). what's the math on this?

KhorashIronfist
2015-01-04, 02:59 PM
The math is fairly simple to calculate.
Need a 20? +1 doubles your chances, so a 100% increase.
Need a 19? +1 increases your chances by 50%.
Need an 18? +1 increases your chances by 33%.


This is true in the specific case of critical hits, yes. This doesn't represent diminishing returns, just one way of interpreting the numbers. As your chances of a critical hit grow larger, the addition of another +1 to your crit range is a lesser improvement relative to your newly increased odds of a crit - but how is that relevant? Each expansion is still another face of the die (each face 5% likely to be the result) that represents a critical hit, and therefore a 5% increase in the chances of a crit.

You could make the same argument for hit dice. As your max hp grows larger, the new hp you receive each level is an exponentially smaller percentage of your total hp.

When you reach level two, your hit dice go from 1d10 to 2d10. Why, that's a 100% increase! But what's this, at level three my number of hit dice only increases by 50%, from 2d10 to 3d10? And at level four it only increases by 33%! Are you therefore receiving less hp with each succeeding level? Of course not, you're still getting d10 + Con mod at each new level. The percentage that is of your total hp is completely irrelevant; otherwise, to 'optimize' a character, the size of your hit die would be completely irrelevant, because it would quickly become an insignificant increase relative to what you had.

If you happened to find a hundred dollar bill on the ground one day, would that make you less inclined to pick up a five dollar bill you found later? I should think not.



When you're hitting/saving/whatever on a 5, a +1 doesn't give you much improvement.
But when you've got a ~25-30% chance or less, a +1 is indeed fairly significant.

But if you did roll a 4, you'd be glad you had that +1, wouldn't you? A +1 is equally significant at any point in the game, since the random nature of dice could at any point make that +1 the difference between failure and success.


This is incorrect. They will be critting 100% more often.

Both are true. It is 5% more likely that you will crit on 19-20 than just 20, but since the original chance of critting on 20 alone was only 5%, this represents an increase of 100% but only relative to the previous chances of it happening. It just happens to be double.

This is an oft-had problem in this thread, it seems. Confusing different percentages derived from the same base numbers.

Dalebert
2015-01-04, 03:09 PM
Wait, so are you claiming that there is only one good feat for warlocks (i.e. Resilience), or that there are in fact numerous good feats but you just don't like those?

How could you conclude such a straw man if you read my rather short post? I specifically referred to a "list of desired feats". What I clearly said is none of those feats is more valuable to me than maxing my CHA first based on this particular character and his general circumstances. CHA is used for my attacks, the save DC of my spells, and many of my skills for a character who is a charlatan. It's almost constantly relevant and therefore very valuable to me. Once my CHA is maxed, at level 9 since I dipped, I suspect the rest of my choices will be feats, starting with resilience in CON.


...i still don't get it and this is what i don't get/disagree with. they actually aren't critting 100% more(or even double). they are critting 5% more. the 20 is a crit regardless of whether they took improved crit or not. baking in given hit rolls ignores what would happen without the increase. but i'm not good at math :D.

Depends on how you look at it. Yes, you're critting (almost) twice as often as you were critting before, because you were critting 5% of the time and now you're critting (almost) 10% of your attacks. It's a weird game of semantics really.


or perhaps better question-doesn't this make feats and bonus actions even more powerful. polearm master gives bonus action (d4 admittedly :smalleek:) but also aoo. would i rather have a +1 to str or a bonus action and aoo. the utility of movement and action is highly underrated. wouldn't 2 attacks be better than 1 (even at a 10% bonus or 8.33% or whatever). what's the math on this?

Well, firstly, you might encounter something where you miss with a natural 19. Not often, I hope! Disregarding that, your chance of critting with one attack or the other with Polearm Master is 1 - (.95 x .95) = .0975, or almost 10% of the time, without improved crit. If you crit on a 19 or 20 WITH Polearm Master, the odds are 1 - (.9 x .9) = .19, or almost 20% of the time.

Basically, your odds of critting with one attack OR the other is the inverse of the odds of failing to crit with both. So about 20% of the time, you would crit with one or the other, assuming you hit at all with the 19s, of course.

Keep in mind though, that half of those crits are just an extra d4. They're not of equal value to doubling your chance of critting with your stronger attacks.
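
If anyone wants to check those numbers, here's a minimal Python version (it assumes two independent attack rolls per turn; the function name is mine):

    # chance of at least one crit across two independent attacks
    # (e.g. the main attack plus the Polearm Master bonus attack)
    def at_least_one_crit(crit_faces, attacks=2):
        p_no_crit = (20 - crit_faces) / 20
        return 1 - p_no_crit ** attacks

    print(at_least_one_crit(1))   # crit on 20 only: 0.0975
    print(at_least_one_crit(2))   # crit on 19-20:   0.19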

Selkirk
2015-01-04, 03:12 PM
for those that think improved crit is a 100% increase...i don't know what to say. this is mmo character building without mmo movement. and i think it misses the ability that feats provide to give action and determine whether or not you even get a chance to hit. winning init/having aoo's/or reactions gives the player will and action.

again curious as to the math on the ability to attack twice vs asi increase. or i guess case specific what's the math on +5 to init from alert? never being surprised isn't quantifiable of course.



Depends on how you look at it. Yes, you're critting (almost) twice as often as you were critting before, because you were critting 5% of the time and now you're critting (almost) 10% of your attacks. It's a weird game of semantics really.

sorry, got ninja'ed on the polearm master math (still looking it over), but the question is either/or: if you took polearm master you wouldn't take the asi. and the crit 'increase' ignores the fact that you were going to crit on a 20 regardless of whether you took improved crit or not. a kobold crits on a 20 and a 4th level champion crits on a 20.

Kurald Galain
2015-01-04, 03:17 PM
How could you conclude such a straw man if you read my rather short post? I specifically referred to a "list of desired feats". What I clearly said is none of those feats is more valuable to me than maxing my CHA first based on this particular character and his general circumstances. CHA is used for my attacks, the save DC of my spells, and many of my skills for a character who is a charlatan. It's almost constantly relevant and therefore very valuable to me. Once my CHA is maxed, at level 9 since I dipped, I suspect the rest of my choices will be feats, starting with resilience in CON.

Yes, so are you talking about (1) feats that are more flavorful than an attribute boost, (2) feats that are mechanically better than an attribute boost, or (3) feats that you personally prefer to an attribute boost? The first is trivially true, since feats have flavor and attribute boosts do not. The second is what the thread has been discussing so far. Yet you appear to be talking about the third, and there's no real point in arguing about personal preferences (chocolate!)

As to the second, yes, I do claim that for practically every build in the game, there are 2-3 feats (or more!) that are mechanically better than an attribute boost, and of course this number will likely go up as splatbooks are printed. And by definition, feats are more flavorful than attribute boosts anyway. Therefore, I do claim that conventional wisdom should be that if feats are allowed, you should only take an attribute boost after you've got a couple of feats first.

KhorashIronfist
2015-01-04, 03:20 PM
Well, firstly, you might encounter something where you miss with a natural 19. Not often, I hope! Disregarding that, your chance of critting with one attack or the other with Polearm Master is 1 - (.95 x .95) = .0975, or almost 10% of the time, without improved crit. If you crit on a 19 or 20 WITH Polearm Master, the odds are 1 - (.9 x .9) = .19, or almost 20% of the time.

Basically, your odds of critting with one attack OR the other is the inverse of the odds of failing to crit with both. So about 20% of the time, you would crit with one or the other, assuming you hit at all with the 19s, of course.


Not to be rude, but this is all nonsense; your chance of critting on any given attack is 5% without improved crit and 10% with it, and that's the end of it. Each roll is made independently and does not affect any other roll.

What you're saying is true within the field of statistics, I'm sure, but that's not really relevant to D&D. You could take improved crit and just happen to never roll a 19, and it would be a wasted feat, regardless of the "objective" value of the ability.

As an example the other day I was playing a battlemaster fighter, and I employed a superiority die to use a feint attack, granting me advantage. I rolled my first die and it came up twenty, so I didn't even bother rolling the second die. That was a wasted superiority die.

(Not to say I think improved crit is bad. Just the opposite, in fact.)

Dalebert
2015-01-04, 03:21 PM
for those that think improved crit is a 100% increase...i don't know what to say. this is mmo character building without mmo movement. and i think it misses the ability that feats provide to give action and determine whether or not you even get a chance to hit. winning init/having aoo's/or reactions gives the player will and action.

again curious as to the math on the ability to attack twice vs asi increase. or i guess case specific what's the math on +5 to init from alert? never being surprised isn't quantifiable of course.

Again, I see this trend of trying to consider feats in a sweeping manner without looking at context. These choices are all highly subjective depending on the particulars.

A caster who's typically blasting from the back ranks isn't going to get many AoO. He's not going to care about extra melee attacks, because he hardly ever melees unless he must.

A rogue with a high DEX has probably dumped STR to fight with finesse weapons and won't want polearm master. He may not value extra attacks that don't stack as well with his cunning actions feature. A support character like a bard or cleric may hardly ever be making attacks at all and be primarily using save-based effects on the rare cases when they do.

Reserve judgment on other people's choices until you hear their particular reasoning.

Stella
2015-01-04, 03:21 PM
I don't see how the percentile increase relative to your previous chances of success is relevant. If I need to roll at least an 11 to hit my enemy, and I receive an additional +1 and now only need a 10 to hit, then my chances of success have increased from 45% to 50%. As was said previously, this is a 5% increase, since each face of a twenty-sided die statistically has a 1/20 chance to land face-up. Since we have now expanded our successful rolls to encompass one additional side of the die, this represents a 5% increase in the chances of success.

The percentile increase relative to your previous chances of success is the only thing which is relevant.

Allow me to try to explain.
Say your character hits every monster everywhere on a 11+. As you have correctly pointed out, each pip on the d20 represents a 5% chance for that specific number to come up. This is not a direct increase of your chance to succeed of 5%, it is only one more number on the die which you might roll. And that is the key difference.

Your chance to succeed is (5% * (the range of success)). In this case your chance to succeed is 5% * 10 = 50%

You have a 50% chance to hit a monster. Now imagine you make 100 swings and roll exactly average. You will have hit the monster 50 times out of 100 swings, or 50/100. Which again shows your chance to succeed as being 50%.

Then you get a +1 somehow. A feat, a magic weapon, a bump in your stat, the cause is not important. Now your chance to succeed is 55%. Which may look like it improved your odds by 5%, and it did, but it did not improve your chance to succeed by 5%. Instead your chance to succeed went up by 100/((old range)/(new range - old range)), or 100/(10/(11-10)), which is 10%

20 -> 19-20 = 100/(1/(2-1)) = 100%
19-20 -> 18-20 = 100/(2/(3-2)) = 50%
18-20 -> 17-20 = 100/(3/(4-3)) = 33%
.
.
.
11-20 -> 10-20 = 100/(10/(11-10)) = 10%
.
.
.
2-20 -> 1-20 = 100/(19/(20-19)) = 5%

As you can see, the only time you have an actual 5% increase in your chance to succeed is when you're improving from a 2-20 to a 1-20.

The formula may be a little clumsy, but it will work for greater increases as well. So if you win a +1 weapon at the same time that you level up and choose to bump your relevant stat by 2, you'll have a +2 over your old bonus. If you needed a 20, then you have improved your needed rolls to 18-20, which is:

20 -> 18-20 = 100/(1/(3-1)) = 200%
.
.
.
11-20 -> 9-20 = 100/(10/(12-10)) = 20%
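
Here are the same numbers in a few lines of Python, in case anyone wants to plug in other ranges (the function name is mine; a "range" is just the count of die faces that succeed):

    def relative_increase(old, new):
        return 100 * (new - old) / old

    print(relative_increase(1, 2))    # 20 -> 19-20:    100.0
    print(relative_increase(2, 3))    # 19-20 -> 18-20:  50.0
    print(relative_increase(3, 4))    # 18-20 -> 17-20:  33.3...
    print(relative_increase(10, 11))  # 11-20 -> 10-20:  10.0
    print(relative_increase(1, 3))    # 20 -> 18-20:    200.0
    print(relative_increase(10, 12))  # 11-20 -> 9-20:   20.0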

Xetheral
2015-01-04, 03:33 PM
Everything Feldarove said, plus:

It's often discussed on these forums that 5% is not the only way to look at it. See, the game is balanced at the margins. Each +1 is not a linear improvement, because it's not about having a 5% better chance -- you are far more likely to be off by 1 than off by 20. That +1, when you're in the middle of the expected zone, is going to have the practical effect of improving your chances of hitting by 10-20% relative to not having the +1.

I don't know the exact math, but I know it's not linear. To illustrate with an extreme example, if you already had a +20 to hit somehow, the +1 would literally add 0% chance most of the time.

Without advantage or disadvantage, with a DC that is both hittable and missable with and without the bonus, then a +1 bonus will always provide an increase of 5 percentage points. Thus, under those circumstances, a +1 will change a failure into a success on average one in every 20 rolls. Another way to think about it: take any string of 14 d20 rolls, and there is about a 50% chance (51.23% to be exact) that a +1 would turn one or more of them from failure to success.
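
The one-liner behind that figure, in Python, for anyone who wants to check it:

    # chance that at least one of 14 d20 rolls lands on the single face where
    # a +1 flips failure into success
    print(1 - (19 / 20) ** 14)   # ~0.5123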

Selkirk
2015-01-04, 03:33 PM
Again, I see this trend of trying to consider feats in a sweeping manner without looking at context. These choices are all highly subjective depending on the particulars.

A caster who's typically blasting from the back ranks isn't going to get many AoO. He's not going to care about extra melee attacks, because he hardly ever melees unless he must.

A rogue with a high DEX has probably dumped STR to fight with finesse weapons and won't want polearm master. He may not value extra attacks that don't stack as well with his cunning actions feature. A support character like a bard or cleric may hardly ever be making attacks at all and be primarily using save-based effects on the rare cases when they do.

Reserve judgment on other people's choices until you hear their particular reasoning.

and i agree...but the game does support mechanically bad choices. the plate fighter using melee and thrown weapons shouldn't have any need for dex..and yet would be ill advised to make dex a dump stat. in the same way casters should be looking for anything that gives them a bonus/reaction move as opposed to small increases in dc's. we all want to be gandalf but the reality of play is the caster is flinging cantrips mostly and usually reacting to a situation not dictating it.

anecdotes be damned but for the last three encounters zombies(i don't think they took asi :smallredface:..:D) have been beating my spell dc's (wis 18 ...yeah i took asi-and i'm hating every minute of it :smallfrown:).

Kurald Galain
2015-01-04, 03:34 PM
The percentile increase relative to your previous chances of success is the only thing which is relevant.

I would say that the percentile increase is completely irrelevant, actually.

If you want to compare feats, especially numerical bonuses vs. extra options, a much more useful benchmark is "how many times per day would the feat make a difference to my character".

For example, the Lucky feat makes a difference three times per day. A feat that gives you a +1 bonus to melee attacks gives you a difference of (the average number of melee attacks you make in a day) divided by 20. If your character doesn't make melee attacks (e.g. because you're a wizard) then obviously that feat is not going to be useful to you. Is Lucky going to be better than +1 to melee attacks? The answer is yes, unless you make 60 or more attacks per day. You can refine this method some more if you need, or calculate it by gaming session instead of day.

The point is that this method allows you to actually compare diverse feats, whereas the "percentile increase" method only allows you to argue over whether a +1 bonus does or does not equal 5%.
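
A quick sketch of that benchmark in Python (the 3-uses-per-day figure is the Lucky example above; attacks per day is whatever fits your table):

    lucky_uses_per_day = 3                     # the Lucky feat, as above

    for attacks_per_day in (20, 40, 60, 100):
        plus_one_uses = attacks_per_day / 20   # a +1 changes roughly 1 roll in 20
        print(attacks_per_day, plus_one_uses)
    # Lucky's 3 uses per day beats the +1 until around 60 attacks per day, as noted above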

Stella
2015-01-04, 03:38 PM
This is the true in the specific case of critical hits, yes. This doesn't represent diminishing returns, just one way of interpreting the numbers. As your chances of a critical hit grow larger, the addition of another +1 to your crit range is a lesser improvement relative to your newly increased odds of a crit - but how is that relevant?
No, this has absolutely nothing to do with critical hits, except in the specific case where you're trying to figure out what the improvement was to your chance to get a critical hit from one range to another.
And it does indeed represent diminishing returns. It is relevant because that is how you calculate the improvement to your chance of success. You don't add 5% just because each number on a d20 has a 5% chance of being the face number in the same way you aren't guaranteed to roll a 6 on a d6 if you roll it 6 times, even though the chance for any particular number to be the face number is indeed 16 2/3%.


Each expansion is still another face of the die (each face 5% likely to be the result) that represents a critical hit, and therefore a 5% increase in the chances of a crit.
I explained the math in precise detail above. You posted this while I was composing my own post. I suggest you read that post carefully, because you are quite simply wrong when you claim that going from a 20 to a 19-20 is only an increase in critical chances by 5%. It is an increase in critical chances by 100%. From 1 chance in 20 to 2 chances in 20.


You could make the same argument for hit dice. As your max hp grows larger, the new hp you receive each level is an exponentially smaller percentage of your total hp.
This statement, while accurate, has nothing to do with the discussion.



But if you did roll a 4, you'd be glad you had that +1, wouldn't you? A +1 is equally significant at any point in the game, since the random nature of dice could at any point make that +1 the difference between failure and success.
This is again simply incorrect. As I detailed above, a +1 has a different significance at all steps along the die range needed for success.


Both are true. It is 5% more likely that you will crit on 19-20 than just 20, but since the original chance of critting on 20 alone was only 5%, this represents an increase of 100% but only relative to the previous chances of it happening. It just happens to be double.
Both are not true. Only the statement that a roll of 20 which has been improved to a roll of 19-20 is an improvement of 100% is accurate. The chance for any particular number to be the face number on the die (which is (100/faces) expressed as a percentage) is not relevant. Not at all.


This is an oft-had problem in this thread, it seems. Confusing different percentages derived from the same base numbers.

It may be an oft-had issue in this thread, but the simple fact of the matter is that it is you who are confused about the math involved.

Stella
2015-01-04, 03:48 PM
I would say that the percentile increase is completely irrelevant, actually.
You can claim that, but mathematically you'd be wrong. And if your math is wrong, then you've wandered off of the only road which is going to allow you to make any kind of apples to apples comparison.


If you want to compare feats, especially numerical bonuses vs. extra options, a much more useful benchmark is "how many times per day would the feat make a difference to my character".

That's an interesting statement, given that by using the correct math you can easily answer this question. Using the incorrect math there is absolutely no way to answer the question.

If you need a 20 to crit and you make 100 attacks, then statistically you'll get 5 critical hits.
If you need a 19-20 to crit and you make 100 attacks, then statistically you'll get 10 critical hits.

And what did using the correct math tell us about moving from a 20 -> 19-20? That it was a 100% improvement. And 10 is a 100% increase over 5.

If you use the wrong math and insist that you've only gained 5% then statistically you'd expect to only get 5.25 (5 * 1.05) critical hits, and you'd probably conclude that the improved critical wasn't worth it.
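
If anyone wants to check both figures side by side, a minimal sketch of that arithmetic:

    # Expected critical hits over 100 attacks, and the two ways of describing the change.
    attacks = 100
    crits_on_20 = attacks * (1 / 20)       # 5.0 expected crits
    crits_on_19_20 = attacks * (2 / 20)    # 10.0 expected crits

    absolute_gain = (2 / 20) - (1 / 20)                           # 0.05, i.e. 5 percentage points per attack
    relative_gain = (crits_on_19_20 - crits_on_20) / crits_on_20  # 1.0, i.e. a 100% increase

    print(crits_on_20, crits_on_19_20, absolute_gain, relative_gain)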

Selkirk
2015-01-04, 03:53 PM
It is a given that I don't understand this math... my head is spinning :D. But take an example: if you were statting an enemy boss, say a 4th-level fighter, and one version had an 18 Str (took the ASI, all other stats equal) while the other had 16 Str but took Polearm Master, would the CRs differ?

KhorashIronfist
2015-01-04, 03:57 PM
I'm not going to go through all this business of dividing up a bunch of quotes in my post and responding to each, as this is completely off-topic anyway.

The fundamental point is thus:

You are confusing calculating chances of success with calculating your percentage of improvement (which is irrelevant). By your logic, the best choice in any given situation is that which gives you the largest percentage of increase over your previous odds of success. So, going back to my analogy of finding money on the ground, consider the following two scenarios:

A in which you have $100 in your wallet and find $20 laying on the ground. Here you improve your total funds by 20%.

or

B in which you have $1 in your wallet and find another $1 laying on the ground. Here you improve your funds by a staggering 100%!

Between these two, which would you rather find on the ground? Would you consider how much money you had already before deciding which is the superior option? Of course not, you would look only at the amount of money being picked up, not the percentage of increase it offered.

The percentage by which you improve what you already have is irrelevant. What matters is only the objective value of what you are gaining - in this case, a +1, or a 5% increase in the chances of a successful roll. It does not matter whether you already have many plusses to this roll, unless it is truly impossible for you to fail (which is never the case in D&D, but I digress), in which case yes, a +1 is irrelevant. But if there is any chance of failure, and failure is something you want to avoid, then a +1 is always good, and always represents a 5% increase in the likelihood of success, since the nature of modifiers and DCs reduces all randomized decision making (i.e. dice rolls) to a window where the result is somewhere between 1 and 20, and THEREFORE the chances of success, expressed, as you said, as (5% * (range of success)), can only increase in multiples of 5.

If you need to roll a 19 or 20 to succeed, your range of success is 2 and so the odds of succeeding are 10%. If you add another +1 to your roll, making 18, 19, or 20 a successful roll, your range of success is 3 and so the odds of succeeding are 15%. It is COMPLETELY IRRELEVANT what other plusses are involved, since the very process of making a d20 roll in D&D reduces everything to the result of that die, where each face has a 5% chance of being shown, and so any odds of success calculated from a d20 roll will be some multiple of five.


This will be the last I say on the matter, since it seems clear your opinion is not that of the majority, and that you are engaged in some serious misunderstandings of how statistics work. This is not an MMO.

Xetheral
2015-01-04, 04:04 PM
Stella, the proportional increase in the chance to succeed on a d20 roll isn't necessarily the most useful gauge of the worth of a +1. For example, (again assuming no adv/dis and a hittable/missable DC) a +1 to hit always represents a fixed increase in applied damage (specifically 5% times average damage) on a normal attack. Yes, against hard-to-hit enemies that static value could represent up to a 100% increase, but that may be far less relevant than the (modest) absolute increase in damage.

I'll second Kurald (edit: and others) in claiming that the proportional increase is usually irrelevant. It is important when trying to maximize applied damage through a tradeoff of bonuses and penalties to hit and damage (e.g. optimizing burst damage or evaluating GWM/sharpshooter), but otherwise I'd argue that thinking of a +1 as (an equally mathematically valid) 5 percentage point increase is usually the more useful metric.
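
A quick sketch of that point, with the average damage per hit picked arbitrarily:

    # Expected damage per attack before/after a +1 to hit (no advantage, crits ignored).
    avg_damage = 10.0   # arbitrary assumption

    def expected_damage(hit_chance):
        return hit_chance * avg_damage

    for hit_chance in (0.25, 0.50, 0.75):
        before = expected_damage(hit_chance)
        after = expected_damage(hit_chance + 0.05)
        print(f"hit {hit_chance:.0%}: {before:.2f} -> {after:.2f} damage "
              f"(+{after - before:.2f} flat, +{(after - before) / before:.0%} proportional)")

The flat gain is the same 0.50 damage in every row; only the proportional figure moves.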

Kurald Galain
2015-01-04, 04:05 PM
That's an interesting statement, given that by using the correct math you can easily answer this question. Using the incorrect math there is absolutely no way to answer the question.
Indeed. It is ironic, then, that you proceed by using the incorrect math, and indeed end up with no answer to the question.

See, I wasn't asking "is +1 to crit chance a good feat", because that cannot be answered (as there is no definition of "good"). I was specifically asking "is +1 to hit a better feat than Lucky" by the explicit metric of "how often per day does it make a difference".


If you need a 20 to crit and you make 100 attacks, then statistically you'll get 5 critical hits.
Yes. So the correct math tells us that if you make 100 attacks per day (which appears to be hyperbole) then that improved crit feat makes a difference 5 times per day, and in this particular case it's better than Lucky. Since 100 attacks per day is a silly overstatement and 20 attacks per day would be much more accurate in any campaign that I've seen, in those campaigns, the Lucky feat would be better.

Now this is actually a useful method, since it tells you which feat to prioritize based on the kind of campaign you're in; whereas the other method only allows you to argue over whether a +1 bonus does or does not equal 5%.

Mitchellnotes
2015-01-04, 04:15 PM
We've had frequent threads about this subject...

Given your contributions to this thread so far, and the derailment of how best to balance giving a bonus feat earlier in a character's career into an ASIs-vs-feats debate, I would suggest you are a cause of, and not a solution to, this phenomenon.

Rather than making blanket statements about "math," perhaps you could actually respond to my original post about how best to do this?

Alcino
2015-01-04, 04:22 PM
As has been explained most thoroughly, +1 on a d20 should not be interpreted as a blanket +5%. If your to-hit goes from 19-20 to 18-20, you hit 50% more! In other words, you deal 50% more damage and defeat enemies in 1/3 less time.

However, that only applies "in a vacuum". If you're really hitting an enemy only once or twice in ten attacks, are you sure this is the way to go? Should you not change your attack plan or run away?

And even if you're always fighting heavily-armored enemies and can only hope for hit ranges around 16-20 at most... how is the rest of your party doing, on average? I mean: if, for every hit of yours, the fighter gets 5 hits, your 50% increase to hit chance has a negligible impact.

What I'm arguing, really, is that "5%" is wrong mathematically, but the much higher, mathematically-correct percentages do not apply well to real in-game situations. I interpret each +1 on a d20 as about 10%.
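
To illustrate with made-up numbers (a 30 hp enemy and 10 damage per hit, both assumptions):

    # Average rounds needed to drop an enemy at different hit chances (sketch).
    enemy_hp = 30
    damage_per_hit = 10

    def expected_rounds(hit_chance):
        return enemy_hp / (hit_chance * damage_per_hit)

    for needed in (19, 18):   # lowest d20 roll that hits
        chance = (21 - needed) / 20
        print(f"hit on {needed}+: {chance:.0%} to hit, ~{expected_rounds(chance):.0f} rounds on average")

Going from 19+ to 18+ drops the average from 30 rounds to 20, which is where the "one third less time" comes from.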


Since 100 attacks per day is a silly overstatement and 20 attacks per day would be much more accurate in any campaign that I've seen, in those campaigns, the Lucky feat would be better.

Better than you'd think, and worse than you'd think. On one hand, one more successful attack is rarely as important as what Lucky will help you with, such as a critical save or Stealth check. On the other hand, the +1 to attack would have an effect once a day (maybe more at high levels, but at that point, each individual attack is "weaker"), but Lucky has about a 50-60% chance of changing a roll's result, thus making a difference only 1 or 2 times a day, in practice.

Selkirk
2015-01-04, 04:38 PM
the answer is in original post in regards to asi vs feat



Clearly having a feat at first level isnt overpowered bc variant humans can get them, though they give up a bit to get one. What it also does is make Variant human the "go-to" race often as well.


and we also know that the designers consider asi to be merely +5%. why else would they have thrown out +2 asi around to every race but reserved the feat for variant humans? obviously feats(certain ones :D) are better than asi by designer's intent.

further we know this based on cr values...do we care if the orc chief has a str of 16 compared to say str 14? of course not and chances are we won't even notice it.
but if the orc chief has polearm master we know it immediately and fear him :D.

Stella
2015-01-04, 04:40 PM
I'm not going to go through all this business of dividing up a bunch of quotes in my post and responding to each, as this is completely off-topic anyway.
It isn't off topic at all; the mathematical basis by which you show improvement allows people to decide whether they want that % improvement or whether they want a feat which gives them other options.


You are confusing calculating chances of success with calculating your percentage of improvement (which is irrelevant).

No, I am not. I am quite aware what the chances of success are for any given spread on any kind of die. And I am also quite aware of how to calculate the improvement to those odds of success for any spread on any kind of die.
It is you who are confusing the odds of any number on a die coming up as the face number with any linkage to the improvement in odds for success. Read the blog posted by Selkirk if you'd like to see an article which uses the same math to arrive at the same conclusion.

As I explained before, your way of looking at a +1 on a d20 as a +5% improvement is just as flawed as thinking that, because every face on a d20 has a 5% chance to be the face number, 20 rolls give you a 100% chance to roll any particular number. They do not, and you can test this for yourself if you don't believe me. The easiest test would be flipping a coin. Each side of a coin has a 50% chance to be the face side after a flip. So do this 10 times: Flip a coin twice looking for heads, and write down the results. You are very unlikely to find that all 10 tests gave you a heads result.

And hey, if your 10 tests manage to all give you a heads result then we need to get together and flip coins or roll dice for money. I could use some extra income, and finding people who will draw to an inside straight is becoming more difficult.
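
For what it's worth, the coin test can also be checked on paper rather than flipped by hand:

    # Chance that every one of 10 two-flip tests produces at least one head.
    p_heads_in_two_flips = 1 - 0.5 ** 2            # 0.75
    p_all_ten_tests = p_heads_in_two_flips ** 10   # about 0.056

    print(p_heads_in_two_flips, round(p_all_ten_tests, 3))

So "very unlikely" is roughly a 1-in-18 chance, not an impossibility, but the point stands.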


By your logic, the best choice in any given situation is that which gives you the largest percentage of increase over your previous odds of success.
For your D&D character, yes, and really only to understand how different choices might impact how that character plays.

Your "money on the ground" analogy is quite irrelevant to the math involved. If your character has a choice between a +1 or a +3 the player doesn't even need to understand the math, it's clear to most people that a +3 is better than a +1.

But to decide which will improve your character's average damage output: Improved Critical or a +1 to hit? There the math is quite useful. More than useful, as without the math you'll never know which choice was better. As long as you use the right math, of course.


The percentage by which you improve what you already have is irrelevant.

And yet once again, it is the only thing which is relevant. The only way to make an apples to apples comparison between two choices is to understand how the mathematical mechanics behind your character's chances for success works. And if you fail to understand how to correctly apply that math you'll never have a true or accurate understanding of the relative values between your choices.


What matters is only the objective value of what you are gaining - in this case, a +1, or a 5% increase in the chances of a successful roll. It does not matter whether you already have many plusses to this roll, unless it is truly impossible for you to fail (which is never the case in D&D, but I digress), in which case yes, a +1 is irrelevant.
And yet again, you are quite simply wrong. Your amount of improvement for a +1 (or any given bonus) can be easily seen to be different at every step of the way along a 1-20 progression. And the amount of improvement you gain by adding a +1 to a 8-20 success rate is much smaller than that gained by adding +1 to a 15-20 success rate. Basic math completely refutes your statement that "It does not matter whether you already have many plusses to this roll." It absolutely does matter, and the math bears out the fact that it matters. The fact that you still do not understand this after I've taken great pains to explain how the math works makes me sad.

If you want to make another rebuttal of how math works, I'm happy to give you the last word. I don't need to repeat myself yet again to be right, the math supports me and that's all that matters.

archaeo
2015-01-04, 04:43 PM
However, that only applies "in a vacuum".

This entire argument is happening in a vacuum.

In reality, at an actual table, a player would have to make seriously suboptimal choices to be rendered useless in play. I imagine that, in a game where one player has taken every ASI, another player has taken a feat at every available opportunity, and the final player has balanced ASIs and feats, those three players would all feel able to contribute meaningfully to the game, without significant loss of effectiveness given a sufficiently long sample of play, even if the players are otherwise all playing the exact same character. One might have a higher to-hit bonus, and another might have more wide-ranging utility, but it would likely end up being negligible in effect, if not in length-of-character-sheet or something.

Mitchellnotes
2015-01-04, 05:04 PM
the answer is in original post in regards to asi vs feat



and we also know that the designers consider asi to be merely +5%. why else would they have thrown out +2 asi around to every race but reserved the feat for variant humans? obviously feats(certain ones :D) are better than asi by designer's intent.

further we know this based on cr values...do we care if the orc chief has a str of 16 compared to say str 14? of course not and chances are we won't even notice it.
but if the orc chief has polearm master we know it immediately and fear him :D.

The appeal of the variant human is both the feat and the ability to still start with a 16 in any ability with point buy since they get two +1's. As a thought experiment, take those ability increases away and see if it lessens the appeal of the variant human. It's the ability to gain a feat with little opportunity cost.

Also, we keep using polearm mastery as the example feat. Not every feat is polearm mastery. How does it compare to something like healer?

Stella
2015-01-04, 05:06 PM
Indeed. It is ironic, then, that you proceed by using the incorrect math, and indeed end up with no answer to the question.
Please demonstrate how my math is incorrect, or retract that statement.


See, I wasn't asking "is +1 to crit chance a good feat", because that cannot be answered (as there is no definition of "good"). I was specifically asking "is +1 to hit a better feat than Lucky" by the explicit metric of "how often per day does it make a difference".
And I ignored your question, while still rebutting your assertion that the percentile increase is what is irrelevant.

It's very difficult if not impossible to compare two things with different mechanics and arrive at an answer which a majority will accept. Even if you can work out the math, people are going to, as you did below, nitpick things which must be assumed in order to make the comparison, such as the number of attacks per day.


Yes. So the correct math tells us that if you make 100 attacks per day (which appears to be hyperbole) then that improved crit feat makes a difference 5 times per day, and in this particular case it's better than Lucky. Since 100 attacks per day is a silly overstatement and 20 attacks per day would be much more accurate in any campaign that I've seen, in those campaigns, the Lucky feat would be better.

Look, you can try to find fault where there is none, but it seems as though you've just contradicted yourself when you claimed above that I used incorrect math. I made no comparison with Lucky, and so you can't call my figures incorrect based on your assumption that I somehow did, when my post was very clear about what I was demonstrating. The value of improved critical is a 100% increase in the number of criticals per swing, assuming your weapon has a crit range of 20 to begin with. And since we've devolved to nitpicking instead of discussing things cordially, I'll also take this opportunity to point out that the 20 attacks per day you've decided is reasonable becomes 80 attacks per day, not counting action surges etc., at 20th level. 80 is a lot closer to 100 than your figure of 20, isn't it? Look how easy it is to attempt to make a person look wrong just because they throw out an example!

When calculating statistics it is generally valuable to use a larger number, especially one which converts nicely to a percentage; it is helpful in allowing others to understand the results. So you can call my example hyperbole or "a silly overstatement" in some kind of an attempt to discredit the results through the use of ad hominem attacks, but if your character lives they will be making those 100 rolls at some point, right? And since the answer to the hypothetical question is "yes", my example is just fine for demonstrating the mathematical results.

Kurald Galain
2015-01-04, 05:15 PM
It's very difficult if not impossible to compare two things with different mechanics and arrive at an answer which a majority will accept.
No, it's really not.

All earlier editions of the game have well-known and consensually accepted comparisons for "things with different mechanics", and the only reason why 5E doesn't have one yet is because the game is fairly new. Handbooks are already springing up that list which feats are better or worse picks for any particular class.

In all cases, the relevant question for building a character is "should I pick feat X or feat Y or ability boost Z". The relevant question is not, and has never been, "is a +1 bonus more or less than 5%". One is practical advice, the other is theoretical math. Now I appreciate that you've tried to answer the latter question, but what we're really looking for is the former.

It really doesn't matter whether Improved Critical increases your crit range by 1%, 6%, or 42%; it matters whether the feat is more effective than some other feat my character might take. And it should be possible to compare those feats even if the second feat does something wildly different from increasing your crit range.


So you can call my example hyperbole or "a silly overstatement" in some kind of an attempt to discredit the results through the use of ad hominem attacks,
Calling your example hyperbole is the exact opposite of an ad hominem attack :smallbiggrin:

Stella
2015-01-04, 05:47 PM
No, it's really not.

All earlier editions of the game have well-known and consensually accepted comparisons for "things with different mechanics", and the only reason why 5E doesn't have one yet is because the game is fairly new. Handbooks are already springing up that list which feats are better or worse picks for any particular class.
And if the people who arrive at those lists fail to use what you're calling "theoretical math" (it isn't theoretical, it is quite practical), then they will probably be publishing an inaccurate list because their conclusions won't be based on the correct math.

Perhaps you're just used to accepting lists others publish as gospel rather than understanding how the assessment of value was arrived at. I am not.


In all cases, the relevant question for building a character is "should I pick feat X or feat Y or ability boost Z". The relevant question is not, and has never been, "is a +1 bonus more or less than 5%". One is practical advice, the other is theoretical math.
And again, if the persons who are giving this supposed "practical advice" are not able to crunch the numbers to arrive at a quantitative value which supports the advice they are giving, then they are very likely to be giving bad advice.

At the risk of opening myself up to your proclivity to nit pick irrelevant details of examples, I'll offer an example.

If I were to crunch the numbers in order to rank two things in order of value: Taking a +2 to STR or taking the Improved Critical feat at any given level in which a feat was earned, I'd need to be able to do the following:

1) Make an assumption on the average AC of the opponents faced, which isn't going to be exact but which a stroll through the MM can provide a rough average for.

2) This will allow me to calculate an estimated "current hit %" for a given number of swings. I'd use 100 swings because I know you love that figure.

3) Use the +1 "to hit" to calculate the improvement in hit %, you know, the exact thing you're claiming has only "theoretical value", in order to calculate the new number of hits per 100 swings.

4) Calculate the average damage in step 2) and then use the % improvement learned in step 3) to calculate the new expected damage, which is based on both the new "to hit" % and the new STR.

5) Calculate the % damage increased using the figures arrived at in step 4).

6) Calculate the average number of critical hits for the same 100 swings based on a 20 and a 19-20.

7) Calculate the % improvement in damage due to the Improved Critical feat.

8) Compare the % improvement in damage if the player takes the +2 STR compared to the % increase in damage if the player takes the Improved Critical feat.

9) Make a determination of which choice is better.

10) Repeat steps 1-9 for each point at which a player might select between these two options.

11) Arrive at a conclusion on which is better, and publish that conclusion and all backing assumptions (AC of foes, STR before/after, etc.) and ask for peer review.

Assuming the peer review didn't point out any required changes, I could then park these calculations in a spreadsheet so that I could easily plug in values to compare things like bumping the assumed AC up or down, 2WF, 2HF, sword and board, weapons with different threat ranges, etc.

But unless I know how to calculate improvement % the best I can hope to say is "This one will do more damage than that one." Which might have value as a definitive statement but which is sadly lacking in providing how much improvement the player can expect.
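
To make the procedure concrete, here is a stripped-down version of steps 1-9 in code. Every input (target AC 15, a d8 weapon, +3 proficiency, Str 18 versus 20) is an assumption chosen purely for illustration, and the clamp only roughly stands in for natural 1s and 20s:

    # Rough spreadsheet-in-code: +2 STR vs Improved Critical over 100 swings (sketch).
    DIE_AVG = 4.5   # average of a d8
    PROF = 3

    def damage_per_swing(str_mod, crit_range, target_ac=15):
        to_hit = str_mod + PROF
        hit_chance = max(0.05, min(0.95, (21 - (target_ac - to_hit)) / 20))
        crit_chance = crit_range / 20
        # 5e crits double the weapon dice, not the modifier.
        return hit_chance * (DIE_AVG + str_mod) + crit_chance * DIE_AVG

    swings = 100
    asi = damage_per_swing(str_mod=5, crit_range=1) * swings    # Str 20, crits on 20 only
    crit = damage_per_swing(str_mod=4, crit_range=2) * swings   # Str 18, crits on 19-20

    print(f"+2 STR:            {asi:.0f} damage per {swings} swings")
    print(f"Improved Critical: {crit:.0f} damage per {swings} swings")

Change the assumed AC, weapon dice, or number of attacks and the numbers move, which is exactly why the assumptions have to be published alongside the conclusion.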


It really doesn't matter whether Improved Critical increases your crit range by 1%, 6%, or 42%; it matters whether the feat is more effective than some other feat my character might take.
And there is no way to compare the effectiveness of two different feats unless you use the correct mathematics to do so.



Calling your example hyperbole is the exact opposite of an ad hominem attack :smallbiggrin:

Wrong again.


hy·per·bo·le
hīˈpərbəlē/
noun
noun: hyperbole; plural noun: hyperboles

exaggerated statements or claims not meant to be taken literally.

My example was neither exaggerated nor not meant to be taken literally. Using ad hominem language against me as an attempt to discredit findings which I backed up with simple mathematics is discourteous at best, and most likely simply rude.

Dalebert
2015-01-04, 05:51 PM
Stella, I don't think anyone is questioning your math, so there's no need to keep repeating it. The statements you are making are technically correct, e.g. 5% crits becoming 10% crits is a doubling of previous ability to current. That's not what they're disputing. I don't care enough to argue about it given that I've already spent too long on this thread. (I wrote the long post before adding this part.) Just pointing out there's a communication failure about what exactly is being claimed and disputed.


If you want to compare feats, especially numerical bonuses vs. extra options, a much more useful benchmark is "how many times per day would the feat make a difference to my character".

For example, the Lucky feat makes a difference three times per day.

These are very good points but it's still not mechanically, mathematically, and objectively better for everyone beyond just preferences. That's just an outrageous claim. For starters, there are some things lucky won't affect.

* It won't affect whether a target makes their saving throw. In particular, AoE attacks come to mind where multiple creatures are all making saves in one turn. Also, because you're blowing a spell slot, this is a time when you presumably really want to succeed and really chisel down the HP of a lot of mobs. It won't affect whether you land Bane, Faerie Fire, do full damage with a Shatter, etc. Even a single-target non-cantrip with a save represents a sacrifice of resources and therefore a greater loss if it fails than a regular cantrip or melee attack, and Lucky won't help these. Picture a cleric whose primary standard attack is Sacred Flame. That's a save-or-nothing effect. Lucky is useless for landing it.
* My warlock is taking Agonizing Blast at level 5, when I can get my first ASI. That's also when I start getting two blasts per attack. That's extra damage on every hit at a time when my standard attack doubles. So if my number of rolls just on my common attacks was only 20 a day up to now, it's roughly 40 now, twice as many opportunities to hit that all have a better chance and do more damage due to the ASI. Meanwhile, I still only have 3 Lucky rolls. They don't scale with my increase in attacks.

But it gets more complicated, because if I take lucky, I'm going to use it strategically. For instance, I'd be very inclined to save one of them for a very important roll that won't happen most days, like that one time I looked the medusa in the eye and failed. So many days will pass when I end the day with a wasted lucky roll. Also, even though it CAN be used for basic skill checks, I would tend to feel like that's a waste of it. I'd be inclined to save them for a tough save, a key concentration check, or an important attack that has a chance to down a particularly problematic enemy, for instance. It's a limited resource that I have to manage and those decisions will stress me regularly. Meanwhile, my ASI will boost lots of things practically all the time, often in ways that stack with each other like better to-hit combined with consistently higher dmg, without a concern for extra resource management.

So it's simply not clearly mechanically better. There's all kinds of judgment involved that calls for more than math and comparing numbers, times of usage per day, etc.

Kurald Galain
2015-01-04, 06:06 PM
These are very good points but it's still not mechanically, mathematically, and objectively better for everyone beyond just preferences. That's just an outrageous claim. For starters, there are some things lucky won't affect.
You are correct. I'm not claiming that my benchmark is perfect, merely that it's a useful basis for practical advice.

For example, suppose one feat gives you +1 to hit and another feat gives you +1 to AC; then the choice should be pretty straightforward: if your character is attacking more often than he's being attacked (e.g. he's an archer) then he should pick the former. If your character is being attacked more often than he attacks himself (e.g. he's a frontline defender) then he should pick the latter. If he's making more bluff checks than either of those (e.g. because it's a social campaign) then he should look for a feat that boosts bluff checks. The basic advice remains: if you want a mechanically good feat, then start by looking at the ones that help you most often.

Xetheral
2015-01-04, 06:22 PM
At the risk of opening myself up to your proclivity to nit pick irrelevant details of examples, I'll offer an example...

...And there is no way to compare the effectiveness of two different feats unless you use the correct mathematics to do so.

Stella, you're overlooking the fact that there isn't just one mathematically-correct metric for gauging the effectiveness of a bonus. You're looking at the proportional increase, which, as I mentioned above, is useful for some optimization tasks. But you're ignoring other equally-valid metrics such as, for example, absolute (static) increase and frequency-of-benefit. You might prefer a given metric for a given analysis task, and that's fine, but your preference doesn't make everyone else's math wrong.

Without Advantage or Disadvantage, with a DC hittable and missable with or without the bonus, a +1 represents ALL of the following:

- A variable proportional increase in chance to hit, depending on the DC
- A static .05 (i.e. 5 percentage point) increase in chance to hit, regardless of the DC
- A benefit realized, on average, once in every 20 d20 rolls, regardless of the DC
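
A few sample target numbers make the contrast between the first two readings visible (and the third is the same 0.05 expressed per roll):

    # The same +1 read two ways, at a few target numbers (sketch).
    for needed in (16, 11, 6):   # lowest d20 roll that succeeds, before the +1
        before = (21 - needed) / 20
        after = (21 - (needed - 1)) / 20
        print(f"needed {needed}+: {before:.0%} -> {after:.0%} "
              f"(absolute +{after - before:.0%}, proportional +{(after - before) / before:.0%})")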

Stella
2015-01-04, 06:43 PM
You might prefer a given metric for a given analysis task, and that's fine, but your preference doesn't make everyone else's math wrong.

Math isn't a preference or a matter of opinion. How often you should bathe is a preference and a matter of opinion. Anyone who claims that a +1 is a 5% increase in effect is simply wrong. And those who claim that where you are in regards to success chances does not matter to that +1, in other words that it's a flat 5% increase regardless of your current odds of success, are twice as wrong.

There's no "different metric" by which misunderstood math suddenly becomes good science...


Without Advantage or Disadvantage, with a DC hittable and missable with or without the bonus, a +1 represents ALL of the following:

- A variable proportional increase in chance to hit, depending on the DC
- A static .05 (i.e. 5 percentage point) increase in chance to hit, regardless of the DC
- A benefit realized, on average, once in every 20 d20 rolls, regardless of the DC

These statements are all correct. However they are also irrelevant if you want to understand exactly how much that +1 has improved your character. Or compare that +1 to some other effect. To do that you need to measure impact, not static values.

If you're looking at improving your critical threat range by +1 and see that it is the same +1 number as the same feat invested in STR, you might conclude that they are exactly equal. But they aren't, except perhaps under one very specific set of input values where their lines may cross. Looking at the +1 as simply being any of the above examples you give is very misleading as to the actual value you are getting from that +1.

Justin Sane
2015-01-04, 07:14 PM
Math isn't a preference or a matter of opinion.
The interpretation of said math, however...

Xetheral
2015-01-04, 07:17 PM
Math isn't a preference or a matter of opinion. How often you should bathe is a preference and a matter of opinion. Anyone who claims that a +1 is a 5% increase in effect is simply wrong. And those who claim that where you are in regards to success chances does not matter to that +1, in other words that it's a flat 5% increase regardless of your current odds of success, are twice as wrong.

There's no "different metric" by which misunderstood math suddenly becomes good science...

So long as the calculations are correct, how one uses and interprets the results is absolutely a matter of opinion.

No one here is claiming that a +1 is a 5% proportional improvement. The "5% increase" people refer to is a reference to the 5 percentage point improvement that +1 represents. You have at least some background in math, so I suspect you're familiar with the fact that colloquial use of the word "percent" can refer to either the proportional increase or the absolute increase, and which one is meant can usually be discerned from context. (This ambiguity isn't a good thing, but it's how people use language).


These statements are all correct. However they are also irrelevant if you want to understand exactly how much that +1 has improved your character. Or compare that +1 to some other effect. To do that you need to measure impact, not static values.

(Emphasis added.) The notion of "improvement" is always subjective until there is an agreed-upon metric by which to judge progress. You seem to be emphasizing that damage-maximization is the ultimate metric, but mathematically that metric isn't inherently superior to any other.

And static values are certainly relevant to judging improvement by a wide range of metrics, and indeed are essential even to certain damage-focused metrics such as "damage dealt as damage reduction", "rounds until victory", and "resources expended to achieve victory".

Keko
2015-01-04, 08:57 PM
Premise 1: math is not my strong suit
Premise 2: English is not my first language, so I may be a bit confusing

I agree 100%, totally agree (:smallamused:), with Kurald Galain, KhorashIronfist and Xetheral (and probably some others I missed).

Stella, from what I see, what you fail to understand is that the increased percentage of success is different from the percentage increase of success.

A percentage is relative to its original value, which is a powerful tool but not always the right one: +100% of a little is often still a little.

Let's say I'm a 19th-level fighter with 8 Cha; should I take +2 Cha? Spell save DCs should be around 19 by now, so I save only on a roll of 20 (so I have a 5% chance of resisting). If I choose +2 Cha I save on a 19-20 (so I have a 10% chance of resisting). Of course 10% is double 5% (that is, 10% is +100% [or 200%] of 5%), and this is what you correctly claim, but it is not the relevant thing: a 5% chance to resist is not much, nor is 10%.
I doubt anyone will say that this is an appreciable increase even if it is 100% more, while a 10% elsewhere may be appreciable.

And, as a side note I agree that most builds have at least a few feats worth more than an ASI.

Back on topic: as a DM I gave my players both an ASI and a feat at level 4, but only because I thought of it late; probably level 2 or 3 would have been better. Anyway, it didn't break the game, and it gave the players some nice options and more flavour :smallsmile:.

AbyssStalker
2015-01-04, 09:17 PM
My opinion is that feats are (typically) much better than the stat increase; however, randomness dictates that this entire argument on this subject is ridiculous, as many are arguing that one choice is objectively better. Your math only gives us the likelihood of the increase, instead of how the die will actually roll when you're playing; a 5% increase may help one player more than another, purely due to how a die rolls each time he uses it. Not because you guys try to throw pseudo-objective math conjecture at one another.

Now excuse me while I sacrifice many a young lamb in the name of the RNG gods.

Stella
2015-01-05, 01:21 AM
My opinion is that feats are (typically) much better than the stat increase; however, randomness dictates that this entire argument on this subject is ridiculous, as many are arguing that one choice is objectively better. Your math only gives us the likelihood of the increase, instead of how the die will actually roll when you're playing; a 5% increase may help one player more than another, purely due to how a die rolls each time he uses it.

Randomness is exactly why we need to apply math to determine the average benefit of a feat or a stat increase. Otherwise you'll be telling people something equivalent to "Hey, this feat sucks for the most part, but if you get really lucky it will be great for you!"

Math tells us that buying a lottery ticket is a foolish choice, because on average you lose much more than you win. No one sets up a game of chance involving money and expects to lose money, after all, and the lotteries in the US which have the greatest payoffs are all state-run affairs. Often multi-state affairs. They will never lose money, guaranteed. Because they use math to ensure that it will never happen. But people still manage to win the lottery, and no one ever seems to care that the lottery organizers still make money on net no matter how much they pay out to a big winner. If you look at what the winner of a multimillion-dollar lottery made and weigh it only against what she wagered, the only conclusion you can logically reach is that buying a lottery ticket is a smart thing to do. But that ignores the evidence which a proper application of math provides us with: Statistically you will lose much more than you will win if you play the lottery.

Stating that the random die rolls will turn up differently between one player and another is an accurate statement which is also irrelevant to the discussion. We know that across a few million players of the lottery that there may well be one lucky winner. That single case does not make the rule. The rule is that most people who play the lottery will lose money. And most lotteries state the odds right on the ticket.

This is similar to the anecdotal "evidence" which has been presented previously in this thread. When presented with the fact that a character who has a 5+ hit chance is not helped nearly as much as the player who has a 15+ hit chance, it has been argued
But if you did roll a 4, you'd be glad you had that +1, wouldn't you? A +1 is equally significant at any point in the game, since the random nature of dice could at any point make that +1 the difference between failure and success.
All the while ignoring the fact that the character who needs a 5+ to hit only gains a fraction of the benefit which a character who needs a 15+ to hit gains when they apply that exact same +1 to their die roll. In other words, not simply ignoring but actually denying that the +1 is quite simply not "equally significant" at all.

"The random nature" of lottery tickets means that someone is going to win, eventually. Unfortunately that luck does not apply to all people who play the lotto, most of them simply lose and discard their tickets as worthless. As I said above, this equates to: "Hey, buying a lottery ticket sucks for the most part, but if you get really lucky it will be great for you!"

Again, the only way to make a fair judgement of the benefits of a bonus is to weigh it against the expected results derived from the law of averages. That result is what most players will see. Some will be poorer due to bad rolls, some will be better due to good rolls, but on average you can exactly define the result which will be seen across the board. You simply cannot make a fair judgement by pointing to the extreme case, lucky or unlucky, and claiming that this case has an equal value for all other cases.

Xetheral
2015-01-05, 06:05 AM
But if you did roll a 4, you'd be glad you had that +1, wouldn't you? A +1 is equally significant at any point in the game, since the random nature of dice could at any point make that +1 the difference between failure and success.
All the while ignoring the fact that the character who needs a 5+ to hit only gains a fraction of the benefit which a character who needs a 15+ to hit gains when they apply that exact same +1 to their die roll. In other words, not simply ignoring but actually denying that the +1 is quite simply not "equally significant" at all.

It *is* equally significant (in the colloquial sense) when considering the absolute increase (.05) in chance to hit. Absolute increases are just as valid a method of comparison as proportional increases. Each method has advantages and disadvantages in different contexts, but mathematically, both are useful metrics. Neither is inherently superior to the other.

That being said, in this particular context of comparing ASIs to Feats, I'll still argue that considering the absolute increase is advantageous over considering the proportional increase. We clearly disagree on this point, so let me try explaining my perspective in a different way:

Consider a raffle with 20 total tickets available for sale. In a raffle, the expected value of each ticket depends only on the price per ticket and the prize amount. Notably, the expected value of each ticket does NOT depend on the number of tickets already purchased. Imagine one player is rich and has already purchased 10 tickets. Another player is poor, and has only purchased one ticket. Even though an extra ticket would double the poor player's chance of winning, yet only increase the rich player's chance of winning by 10%, the expected value to each player of purchasing an extra ticket is identical.

Note that mechanically, the raffle and a d20 roll are themselves identical. In each case one has a certain number of equally-likely possible outcomes that are a "win" and a certain number that are a "loss". For exactly the same reason that the expected value of each raffle ticket is identical, the expected value of each +1 bonus on a d20 roll is also identical. (As always, ignoring advantage/disadvantage, and only dealing with DCs that are hittable and missable with and without the extra bonus).

When comparing an ASI to a Feat, I'm interested in the expected value of each. As in the raffle example, the expected value of the +1 bonus from the ASI does not depend on the proportional increase in chance to hit. I therefore find the proportional increase basically irrelevant in this context. Furthermore, because the +1 will be used across a variety of DCs, evaluating the impact of the proportional increase is much more complicated. (Either we'd need to arbitrarily pick a "sample" DC for optimization purposes, or we have to consider the distribution of the expected DCs.)

So, the bottom line is that the absolute increase from the +1 is both more useful for gauging the expected value of the bonus, AND easier to apply. To me, that makes it the superior metric for weighing ASIs vs Feats.
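
For anyone who prefers to see the raffle worked out, here it is with an arbitrary $100 prize (the ticket price cancels out, since it is the same for both players):

    # Expected winnings in the 20-ticket raffle, before and after buying one more ticket (sketch).
    PRIZE = 100.0
    TOTAL_TICKETS = 20

    def expected_winnings(tickets_held):
        return PRIZE * tickets_held / TOTAL_TICKETS

    for held in (1, 10):
        gain = expected_winnings(held + 1) - expected_winnings(held)
        print(f"holding {held:>2} tickets: one more adds {gain:.2f} in expected winnings, "
              f"a {gain / expected_winnings(held):.0%} proportional jump")

The extra ticket is worth $5.00 in expectation to both players, even though it doubles one player's odds and barely nudges the other's.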

Knaight
2015-01-05, 06:24 AM
That's technically a non sequitur. Just because it's the smallest bonus doesn't mean it's not a big deal.


It is a 5% increase. It is not nothing, but no, it really isn't a big deal.

Whether or not it actually is a big deal has absolutely nothing to do with whether or not it is a non sequitur. The claim was that it isn't a big deal because it is the smallest bonus; if it isn't a big deal because of some other reason, the claim is still wrong. In this case, it isn't a big deal because it is a 5% absolute increase. If the smallest bonus were +1/2, it would still not be a big deal. If the smallest bonus were larger (or, more realistically, the system used dice where a +/-1 represented a larger change), it would be.

Logosloki
2015-01-05, 07:05 AM
The appeal of the variant human is both the feat and the ability to still start with a 16 in any ability with point buy since they get two +1's. As a thought experiment, take those ability increases away and see if it lessens the appeal of the variant human. It's the ability to gain a feat with little opportunity cost.

Also, we keep using polearm mastery as the example feat. Not every feat is polearm mastery. How does it compare to something like healer?

If you are point buying then normal human is a competitive option. Sure you lose out on a skill and a feat but you can gain 16, 14, 14, 12, 12, 11 as starting stats to put where you want.
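
That array does check out against the 27-point budget, for anyone who wants to verify it:

    # Verifying the standard-human array above against the PHB point-buy costs.
    COST = {8: 0, 9: 1, 10: 2, 11: 3, 12: 4, 13: 5, 14: 7, 15: 9}

    final_array = [16, 14, 14, 12, 12, 11]
    base_array = [score - 1 for score in final_array]   # standard human adds +1 to every score

    print(sum(COST[score] for score in base_array))     # 27, exactly the point-buy budget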

Kurald Galain
2015-01-05, 07:10 AM
If you are point buying then normal human is a competitive option. Sure you lose out on a skill and a feat but you can gain 16, 14, 14, 12, 12, 11 as starting stats to put where you want.

If you are point buying then normal human is completely not worth it. It loses out on a skill and a feat, and it doesn't get a higher primary ability score than anyone else. Tertiary ability boosts don't matter even nearly enough to make up for that.

Stella
2015-01-05, 07:26 AM
When comparing an ASI to a Feat, I'm interested in the expected value of each. As in the raffle example, the expected value of the +1 bonus from the ASI does not depend on the proportional increase in chance to hit. I therefore find the proportional increase basically irrelevant in this context.

If your example stipulates that the value of the +1 bonus "does not depend on the proportional increase in chance to hit" then your example is flawed, because all analysis of the value of a given +1 depends entirely upon the values of the chance to hit both before and after the +1 is applied.


Furthermore, because the +1 will be used across a variety of DCs, evaluating the impact of the proportional increase is much more complicated. (Either we'd need to arbitrarily pick a "sample" DC for optimization purposes, or we have to consider the distribution of the expected DCs.)

This is far easier than you propose. I have shown the math for making these calculations, and it isn't anything harder than the average 6th-grade student should be able to handle. The only issue is with dissociating people's perceptions (e.g. "+1 on a d20 is a 5% increase!") from the actual impact that +1 provides.

And yet again, Selkirk provided a link to an independent blog which demonstrates the exact mathematical truisms which I am attempting to demonstrate here. Please go read that blog if you feel that I have the math wrong.


So, the bottom line is that the absolute increase from the +1 is both more useful for gauging the expected value of the bonus, AND easier to apply. To me, that makes it the superior metric for weighing ASIs vs Feats.
The fact that you find it easier to use incorrect math to weigh the relative values of one feat against another (to include the stat boost a feat may provide) is irrelevant. Math does not rely upon your ease of use in order to be accurate. There is a long history of ignorant legislative bodies assigning the value of PI to be 3 because they found it easier. Their efforts did not change the actual value of PI.

Kurald Galain
2015-01-05, 08:06 AM
There is a long history of ignorant legislative bodies assigning the value of PI to be 3 because they found it easier. Their efforts did not change the actual value of PI.

No, there isn't (http://www.snopes.com/religion/pi.asp). Not a "long history", just a single one, and that one is an urban legend. Please get your facts straight; this is hardly the only factual mistake you've been making in this thread.

MarkTriumphant
2015-01-05, 10:33 AM
Wrong again.


My example was neither exaggerated nor not meant to be taken literally. Using ad hominem language against me as an attempt to discredit findings which I backed up with simple mathematics is discourteous at best, and most likely simply rude.

I note that while you gave a definition of "hyperbole", you did not provide one for "ad hominem", which is where you are incorrect.

From Wikipedia: "An ad hominem (Latin for "to the man" or "to the person"[1]), short for argumentum ad hominem, means responding to arguments by attacking a person's character, rather than to the content of their arguments."

The "attack" was on your example, not directed at you, and therefore it was not ad hominem.

Person_Man
2015-01-05, 11:48 AM
RE: Is a +2 ability score increase (+1 bonus/5%ish success rate) really worth it?

In my opinion, a lot depends on how often you roll using that ability score. For example, my primary character is a Rogue. Dexterity affects his Initiative (rolled every combat), attack rolls (which he makes 1-3 times almost every round in combat), AC (which is targeted by enemies 1-4 times per round), Stealth (which he uses at least once between every combat), Dex saves (the most often targeted Save), Acrobatics (which is occasionally targeted or used), and occasionally other stuff as well. In a 4-6 hour sitting, there are probably 50-100ish rolls that are affected by my Dexterity. So having a 20 Dex as opposed to an 18 Dex makes my character succeed (or forces my enemy's attack to fail) around 2-5ish+ times every game, which is slightly better than the Lucky Feat (which I also have). It's probably even more effective/important to have that additional Dex when we don't get to Rest very often and the party's spells and other Rest-related resources start running low, which happens from time to time.

On the flip side, if your primary ability score is Intelligence and it's modifying 5-10ish rolls per game session (and thus potentially changing the outcome of maybe 1 roll per game session), then having 20 Int over 18 Int really isn't that big of a deal.
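
The rough math behind those estimates, for anyone who wants to adjust the roll counts to their own table:

    # Expected number of rolls per session where a +1 modifier flips the outcome (sketch).
    def outcomes_changed(rolls_affected):
        return rolls_affected / 20   # a +1 matters on roughly 1 roll in 20

    for rolls in (5, 10, 50, 100):
        print(f"{rolls:>3} affected rolls/session: ~{outcomes_changed(rolls):.1f} changed outcomes")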

In 5E, many many things depend on the context of your individual game and character.

Kurald Galain
2015-01-05, 12:46 PM
In 5E, many many things depend on the context of your individual game and character.
Just like in every other RPG, of course :smalltongue:

That said, while you make a good point about dexterity, what you just wrote can basically only apply to dexterity, because it applies to both your to-hit roll and your armor class. It's the attacks and counterattacks that add up; all the others (init, skills, ref saves) aren't nearly as common. Dexterity is basically the god stat in any game that has a dexterity stat, so that's not exactly surprising.

Is this automatically better than any feat in the book, for every or even for most rogue builds? ... Probably not, but boosting dex is going to be very high on the list for any dex-primary melee character.

Mitchellnotes
2015-01-05, 01:20 PM
Just like in every other RPG, of course :smalltongue:

That said, while you make a good point about dexterity, what you just wrote can basically only apply to dexterity, because it applies to both your to-hit roll and your armor class. It's the attacks and counterattacks that add up; all the others (init, skills, ref saves) aren't nearly as common. Dexterity is basically the god stat in any game that has a dexterity stat, so that's not exactly surprising.

Is this automatically better than any feat in the book, for every or even for most rogue builds? ... Probably not, but boosting dex is going to be very high on the list for any dex-primary melee character.

Nowhere has anyone else been talking about ASIs being better than feats in an absolute sense or vice versa. Stella, and others, have been trying to point out that taking a feat over an ASI should not be an absolute either. Only you have really been saying anything about an absolute mindset of ASIs vs feats. In addition, you made the claim that "math" supported this decision, which again, people have been trying to point out is a bit more complex than +1=5%. While correct, it isn't the whole story.

As I mentioned before, if you feel you've had this conversation before, it's likely because of how you choose to frame your argument.

Person_Man
2015-01-05, 03:07 PM
Just like in every other RPG, of course :smalltongue:

That said, while you make a good point about dexterity, what you just wrote can basically only apply to dexterity, because it applies to both your to-hit roll and your armor class. It's the attacks and counterattacks that add up; all the others (init, skills, ref saves) aren't nearly as common. Dexterity is basically the god stat in any game that has a dexterity stat, so that's not exactly surprising.

Is this automatically better than any feat in the book, for every or even for most rogue builds? ... Probably not, but boosting dex is going to be very high on the list for any dex-primary melee character.

I think you make a fair point here. This could have been addressed by the game designers, but they preferred to place a higher priority on tradition than on game balance, which is understandable given their post-4E market position.

In my ideal version of 5E:

* You could only generate your ability scores with a Point Buy or Standard Array, not randomly. The DM can adjust the numbers up or down as preferred. But a player is never screwed or given a major advantage forever because of a small number of rolls.
* There would be no racial modifiers to ability scores, which strongly encourage certain race/class combinations and discourage others.
* Your ability score bonus would equal your Ability Score - 10. So having an 18 Strength would give you a +8 bonus (see the short sketch at the end of this post).
* No one would start with an ability score higher than 15 (+5) at 1st level.
* Each of the six ability scores would be more balanced in terms of their uses. You could move Paralyze/Hold saves to Strength, mental Saves to Intelligence, Initiative to Wisdom, bonus Inspiration points for Charisma, whatever.
* There would be no X to Y abilities. If you want to be good at X, you must invest in the ability score that modifies X.
* There would be no magic items that grant ability score increases.
* You would still gain +2 to one ability score (or 1 Feat, or +1 to two ability scores), up to a maximum of 18.
* Feats would be designed to be worth roughly +2 to any ability score.


I'm hoping that 5.5 or 6E implements these changes in the next 4-6ish years.
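
Here is the modifier scale I have in mind next to the standard 5e one, purely as a comparison sketch:

    # Proposed modifier (score - 10) next to the standard 5e modifier ((score - 10) // 2).
    for score in (8, 10, 12, 15, 18, 20):
        proposed = score - 10
        standard = (score - 10) // 2
        print(f"score {score:>2}: proposed {proposed:+d}, standard 5e {standard:+d}")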

archaeo
2015-01-05, 03:19 PM
snip

Those all sound kind of like things you could trivially implement today, except for the feat part, if you feel that most feats are worth more or less than a +2.

As an aside, Mearls talked in a podcast interview I listened to about the fact that the design team mulled over doing your ability score modifier paradigm in 5e. I forget why he said they decided against it, but I feel like it was more or less "it's bounded accuracy." It basically gives you a more granular modifier in return for inflating dice rolls.

I also think 4-6 years seems unlikely; if 5e is doing so poorly in four years that WotC has to goose the market with a new edition, I don't know why they'd bother. Let's wait and see how Mearls' evergreen edition goes.

Kurald Galain
2015-01-05, 04:07 PM
Let's wait and see how Mearls' evergreen edition goes.

Let's hope it does better than his first evergreen edition :smallbiggrin:

Person_Man
2015-01-05, 04:45 PM
Those all sound kind of like things you could trivially implement today, except for the feat part, if you feel that most feats are worth more or less than a +2.

As an aside, Mearls talked in a podcast interview I listened to about the fact that the design team mulled over doing your ability score modifier paradigm in 5e. I forget why he said they decided against it, but I feel like it was more or less "it's bounded accuracy." It basically gives you a more granular modifier in return for inflating dice rolls.

I also think 4-6 years seems unlikely; if 5e is doing so poorly in four years that WotC has to goose the market with a new edition, I don't know why they'd bother. Let's wait and see how Mearls' evergreen edition goes.


Here is a post that I made on this forum in September 2009 (http://www.giantitp.com/forums/archive/index.php/t-125970.html):


Editions of Dungeons & Dragons (http://en.wikipedia.org/wiki/Editions_of_Dungeons_%26_Dragons)

1974: White box edition
1977: 1st edition
1989: 2nd edition
2000: 3rd edition
2003: 3.5 edition
2008: 4th edition
2011: 4.5 edition (projected)
2016: 5th edition (projected)

So, within about a year or two of releasing 3rd edition, WotC started working on 3.5 (even while releasing a ton of 3.0 material). Within a year or two after that, they started working on 4th ed (again, while releasing a ton of 3.5 material, and not hinting about 4e, lest they screw their profits for 3.5 stuff).

Over a year has passed since 4E was released. So I'm thinking that some sort of 4.5 edition is currently in the works. They've denied it, and definitely won't call it 4.5 for fear of more fan outrage. But it's inevitable that they'll release something in the next couple of years that modifies 4E rules based on what WotC has learned from playtesting, market research, and what supplements sold well. There'll be a free online update, but the core books will also be re-released in some new form.

Not long after that, someone within WotC will begin work on 5E. They would be idiots not to, because eventually 4E sales will decline, and they'll need to do something new and big in order to jump start them again.

So assuming that 4.5 will occur and will stay true to the core mechanics of 4E (similar to how 3.5 was basically 3.0 with some fixes) what changes do you think 4E needs?

And more importantly, what do you think the big reboot for 5E will look like? I'm guessing that it will have to be a big departure from the previous rules set, and not just another set of tweaks (which wouldn't sell well). Will they go back to the past and re-create a previous edition? Are there any 4E supplements that created new and interesting game mechanics that 5E could be based off of, as 4E was based off of Tome of Battle? Or perhaps another company that they could buy (or rob the good ideas from) and then re-cast as the next edition of D&D?


You'll note that D&D Essentials (4.5E) came out in 2010 (I projected 2011), and 5E came out in 2014 (I projected 2016). So I think I'm very safe in projecting 5.5 and/or 6E within the next 4-6ish years. Although it is not determinative, the most reliable indicator of future behavior is past behavior.

archaeo
2015-01-05, 05:20 PM
You'll note that D&D Essentials (4.5E) came out in 2010 (I projected 2011), and 5E came out in 2014 (I projected 2016).

Yes, truly, you are Nostradamus.


So I think I'm very safe in projecting 5.5 and/or 6E within the next 4-6ish years. Although it is not determinative, the most reliable indicator of future behavior is past behavior.

Except that we've been told, repeatedly, both explicitly and implicitly, that the business plan has changed. Mike Mearls isn't just a goofy looking guy who writes elf games, he's ostensibly the head of the department. He sits at the meetings where they make decisions about this junk. As far as he's concerned, which means as far as Hasbro is concerned, they're going to try and avoid the boom-to-bust cycle of previous editions by changing the business plan altogether. As far as "5.5" goes, the plan, as I understand it, is to just keep that revision process moving continually, so that errata and clarification is happening annually instead of as some halfway refresher.

But who knows. We'll see in 4-6ish years, when I'll cheerfully eat crow if I'm super wrong. I just won't be too surprised if they break the cycle, even if "breaking the cycle" herein means "D&D crashes and burns and it stops being an actively developed thing."

Xetheral
2015-01-05, 05:40 PM
If your example stipulates that the value of the +1 bonus "does not depend on the proportional increase in chance to hit" then your example is flawed...

There was no stipulation... the example was intended to demonstrate an analogous situation where the proportional change in chance of winning doesn't matter.


...all analysis of the value of a given +1 depends entirely upon the values of the chance to hit both before and after the +1 is applied.

I agree that the before-and-after values are what matter. Where we disagree is that you're focusing only on the proportional change, whereas I'm claiming the absolute change is equally valid, and also a more useful measure in this context.



Furthermore, because the +1 will be used across a variety of DCs, evaluating the impact of the proportional increase is much more complicated. (Either we'd need to arbitrarily pick a "sample" DC for optimization purposes, or we have to consider the distribution of the expected DCs.)

This is far easier than you propose. I have shown the math for making these calculations, and it isn't any harder than what the average 6th-grade student should be able to calculate.

A sixth-grade student could indeed pick a sample DC, and a talented one might even be able to do a weighted average. However, there are still arbitrary choices to be made regarding what data to use to perform the weighted average. Even with good modeling choices (which I would not expect of a sixth-grader), a weighted average would still lose the distribution data itself. The idea of keeping the distribution data intact and using that set to perform the calculations is well beyond sixth grade math, although a sixth grader might have sufficient spreadsheet skills to get a computer to do similar calculations (assuming the chosen distribution set itself has only a single dimension). By way of contrast, because the absolute increase of .05 is static, one doesn't need a spreadsheet to consider its impact.
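(For what it's worth, here is a rough Python sketch of the kind of calculation I have in mind. The DC weights and the +5 modifier are made-up numbers purely for illustration; the point is only that the absolute change from the +1 stays a flat .05 wherever the roll isn't already capped, while the proportional change depends on the DC mix.)

# Rough sketch: effect of a +1 bonus across an assumed spread of DCs.
# The DC weights and the +5 modifier below are hypothetical, for illustration only.
dc_weights = {12: 0.2, 15: 0.5, 18: 0.3}   # assumed distribution of DCs
bonus = 5                                   # total modifier before the +1

def hit_chance(total_bonus, dc):
    # Chance that d20 + total_bonus >= dc, clamped to 5%-95%
    # (treating a natural 1 as a miss and a natural 20 as a hit).
    needed_roll = dc - total_bonus
    return max(0.05, min(0.95, (21 - needed_roll) / 20))

before = sum(w * hit_chance(bonus, dc) for dc, w in dc_weights.items())
after = sum(w * hit_chance(bonus + 1, dc) for dc, w in dc_weights.items())

print(f"absolute change:     {after - before:+.3f}")              # a flat +0.05 here
print(f"proportional change: {(after - before) / before:+.1%}")   # depends on the DC mix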


The only issue is with disassociating people's perceptions (e.g. "+1 on a d20 is a 5% increase!") from the actual impact that +1 provides.

I don't believe anyone in this thread is actually arguing that a +1 on a d20 is a 5% increase. From context it seems clear to me that all such references are using casual terminology to refer to a 5 percentage point increase.


And yet again, Selkirk provided a link to an independent blog which demonstrates the exact mathematical truisms which I am attempting to demonstrate here. Please go read that blog if you feel that I have the math wrong.

Your math isn't wrong. What's incorrect is your belief that only the proportional increase has any analytical value. (And yes, I've read the blog post. It doesn't say that treating a +1 as an increase of 5 percentage points is wrong... quite the contrary, it refers (regrettably using imprecise language) to this as "absolutely true". The blog merely emphasizes not to assume that the absolute and proportional changes are identical, refuting a claim that no one here is making.)


The fact that you find it easier to use incorrect math to weigh the relative values of one feat against another (to include the stat boost a feat may provide) is irrelevant. Math does not rely upon your ease of use in order to be accurate.

My math is not incorrect: you agreed with me earlier that a +1 bonus is an absolute increase of .05 in one's chance to hit. You're welcome to disagree with me about the relative analytical value of considering proportional vs absolute bonuses, but when considering analytical value, ease-of-use is normally of utmost importance.
Stella, my point-by-point reply to your last post is in the spoiler above. I spoilered it for brevity, and because it seems clear to me that the current tack of the discussion is leading nowhere. So let me try a different approach.

As I understand it, you believe that the proportional increase in chance to hit from a +1 bonus has analytical value, but that the absolute increase in chance to hit from a +1 bonus does not. Could you please explain why?

Mitchellnotes
2015-01-05, 09:13 PM
As I understand it, you believe that the proportional increase in chance to hit from a +1 bonus has analytical value, but that the absolute increase in chance to hit from a +1 bonus does not. Could you please explain why?

Reading what many people are posting, it seems like the opposite is happening. Lots of people are minimizing the proportional increase due to using the absolute increase. This has been to the degree of saying that the proportional increase is incorrect. I don't think Stella or anyone is saying that the absolute increase is wrong, just that the +1 is more complicated than just saying 5%. While not wrong, it's not the whole picture.
Furthermore, the argument has been spurred on both by claims that the proportional math is incorrect and that "math" supported not taking an increase, due to the minimization of impact by focusing only on the absolute and not the proportional increase. (Thanks for those terms btw, I'm not really a math person and was struggling to find descriptors.)

Clearly stat increases are important. I'm going to guess that even people who would choose to seek increases after feats would still encourage starting with 16. There are also a lot of people who would favor increasing stats before getting a feat, which is sad given the way feats flesh out a character both mechanically and in terms of roleplaying. The intent here was to see how best to make that decision easier by providing a bonus feat in a balanced way without having to resort to variant human. There were even a few great suggestions on the first page before we took a contentious tangent.

In terms of absolute vs. proportional, cue the Futurama "technically correct" meme for both sides.

MeeposFire
2015-01-06, 02:17 AM
Just as an example of how proportional increases can be misleading: say you had a penny ($0.01) and then you got a 100% increase. Well, now you have 2 cents; is that a lot of money? Now, if you have $1,000,000 and you increase it by 5%, you increase your wealth by $50,000. In this case the 5% is better than the 100%. Context is important when discussing proportional increases, because if the initial value is small, then even a large increase may not mean much.

So the question is whether the value of that +1 and its possible large proportional increase gives you more benefit than anything else you can have.

Kurald Galain
2015-01-06, 02:52 AM
So the question is whether the value of that +1 and its possible large proportional increase gives you more benefit than anything else you can have.

Precisely.

People who are arguing that +1 may represent a huge percent increase are missing the point, which is that we're trying to compare feats and ASIs. Taking one of them in isolation and proclaiming "this has a benefit" doesn't say anything about whether something else may have a more important benefit.

Person_Man
2015-01-06, 11:10 AM
Except that we've been told, repeatedly, both explicitly and implicitly, that the business plan has changed. Mike Mearls isn't just a goofy looking guy who writes elf games, he's ostensibly the head of the department. He sits at the meetings where they make decisions about this junk. As far as he's concerned, which means as far as Hasbro is concerned, they're going to try and avoid the boom-to-bust cycle of previous editions by changing the business plan altogether. As far as "5.5" goes, the plan, as I understand it, is to just keep that revision process moving continually, so that errata and clarification is happening annually instead of as some halfway refresher.

But who knows. We'll see in 4-6ish years, when I'll cheerfully eat crow if I'm super wrong. I just won't be too surprised if they break the cycle, even if "breaking the cycle" herein means "D&D crashes and burns and it stops being an actively developed thing."

So just to provide a little context, when 3E first came out Ryan Dancey and WotC made the exact same claims about it being an "evergreen" edition. And I have no doubt that they very strongly believed it at the time. But profits declined, people got fired, and eventually we got 4E.

When 5E profits decline (and eventually, they will, as they do for every luxury product in a capitalist system) Mike Mearls will be asked to leave by Hasbro, some new guy is going to come in, and that new guy is going to start working on a new edition to boost profits. Or if D&D isn't profitable enough for Hasbro it'll be sold off. That doesn't make Mike a liar. It's just the way the business works.

Stella
2015-01-06, 12:45 PM
Just as an example of how proportional increases can be misleading: say you had a penny ($0.01) and then you got a 100% increase. Well, now you have 2 cents; is that a lot of money? Now, if you have $1,000,000 and you increase it by 5%, you increase your wealth by $50,000. In this case the 5% is better than the 100%. Context is important when discussing proportional increases, because if the initial value is small, then even a large increase may not mean much.
Context is important, such as the context of providing an example which is actually relevant to the discussion.

I've already explained how this example is not a relevant one. Most people understand that +3 is better than +1, after all. That is covered in first grade math. Even before first grade if you ask a child if they would prefer 3 cookies or 1 cookie, just showing them each pile in a hand and allowing the child to select the hand, you can be fairly confident that they will select the greater number of cookies.

However, it is slightly more difficult to understand which gives you the most improvement in expected damage: +1 to hit or a feat such as Improved Critical. To understand which is better you must first be capable of understanding how to calculate the proportional increase each option gives you.

archaeo
2015-01-06, 01:36 PM
So just to provide a little context, when 3E first came out Ryan Dancey and WotC made the exact same claims about it being an "evergreen" edition. And I have no doubt that they very strongly believed it at the time. But profits declined, people got fired, and eventually we got 4E.

When 5E profits decline (and eventually, they will, as they do for every luxury product in a capitalist system) Mike Mearls will be asked to leave by Hasbro, some new guy is going to come in, and that new guy is going to start working on a new edition to boost profits. Or if D&D isn't profitable enough for Hasbro it'll be sold off. That doesn't make Mike a liar. It's just the way the business works.

I don't disagree with this, Person_Man, I just tend to think it's a big mistake for WotC to treat "rules" as a "luxury product," or to treat the ruleset like a car that needs to be updated every few cycles. Instead, if you can invest in a single ruleset, you can begin to focus on D&D-the-brand instead of D&D-the-game, and the brand is a far more valuable thing to Hasbro than the game.

But, like I said, we'll see what's gonna happen. I'll cheerfully admit that I'm not 100% confident in my expectations, and have a feeling you're more right than wrong.

Kurald Galain
2015-01-06, 02:59 PM
So just to provide a little context, when 3E first came out Ryan Dancey and WotC made the exact same claims about it being an "evergreen" edition.

Paizo thinks it still is, though :smalltongue:

Dalebert
2015-01-06, 03:36 PM
Even before first grade if you ask a child if they would prefer 3 cookies or 1 cookie, just showing them each pile in a hand and allowing the child to select the hand, you can be fairly confident that they will select the greater number of cookies.

Thank you! You just gave me an idea for an analogy.

Let's say a kid has a chance to get a cookie of his choice every day at school--butter or chocolate chip. Some kids have saved up their cookies or traded baseball cards for cookies and other stuff, so one kid has 4 chocolate chip cookies and 1 butter cookie. What criteria should the kid use to decide? Which choice is mathematically better just in terms of resources/investment/whatever?

Picking the butter cookie would represent a 100% increase in his butter cookies while picking a chocolate chip cookie would represent only a 25% increase in his chocolate chip cookies. This statement is undeniable mathematical TRUTH! It's also fairly mathematically irrelevant. It's just a game of semantics. It's a linear increase of one cookie. All that matters is whether the kid prefers one more chocolate chip or one more butter at this moment.

Of course, maybe he wants to have roughly the same number of each so he can alternate and they'll be even. In that specific case, picking which gives the higher percent increase actually becomes relevant. But either way, it's a linear increase of one cookie no matter how many cookies he already has. If the kid were somehow offered more cookies based on how many he already has, that would also make it matter, but that would not be a linear increase; it would be proportional and no longer analogous to what we're talking about.

Just trying to point out that people are not disagreeing with what you think they are. They're accepting your premises but denying their relevance to the decision.

Fenix_of_Doom
2015-01-06, 03:40 PM
However, it is slightly more difficult to understand which gives you the most improvement in expected damage: +1 to hit or a feat such as Improved Critical. To understand which is better you must first be capable of understanding how to calculate the proportional increase each option gives you.

And that works great for DPS (or DPR?), but I'm curious what you would make of the following case:
You are playing 3.5, and as you level up you get to invest a single skill point; you have to choose between two relevant skills:
Skill 1: you've got an ability bonus of +4 in this skill and you've already invested 6 skill ranks.
Skill 2: you've got an ability bonus of +2 in this skill but you've only invested 1 skill rank.

Assume DCs average 15, with an approximately Gaussian spread starting at 10 and ending at 20.


Into which skill should I put my only skill point?
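(If it helps, here is a rough Python sketch of how I'd frame that comparison. The triangular weighting over DCs 10-20 is just my stand-in for "approximately Gaussian", and the modifiers are the +10 and +3 totals from above.)

# Rough sketch: expected success chance before/after one more rank,
# against an assumed bell-shaped spread of DCs from 10 to 20.
dcs = range(10, 21)
weights = [6 - abs(dc - 15) for dc in dcs]   # crude triangular weighting, peaks at DC 15
total_weight = sum(weights)

def success_chance(modifier, dc):
    # Chance that d20 + modifier >= dc (no auto-pass/fail on 3.5 skill checks).
    return sum(1 for roll in range(1, 21) if roll + modifier >= dc) / 20

def expected_success(modifier):
    return sum(w * success_chance(modifier, dc) for dc, w in zip(dcs, weights)) / total_weight

for name, mod in [("Skill 1 (+4 ability, 6 ranks)", 10),
                  ("Skill 2 (+2 ability, 1 rank)", 3)]:
    gain = expected_success(mod + 1) - expected_success(mod)
    print(f"{name}: expected gain from the extra rank = {gain:+.3f}")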

Person_Man
2015-01-06, 07:41 PM
I don't disagree with this, Person_Man, I just tend to think it's a big mistake for WotC to treat "rules" as a "luxury product," or to treat the ruleset like a car that needs to be updated every few cycles. Instead, if you can invest in a single ruleset, you can begin to focus on D&D-the-brand instead of D&D-the-game, and the brand is a far more valuable thing to Hasbro than the game.

But, like I said, we'll see what's gonna happen. I'll cheerfully admit that I'm not 100% confident in my expectations, and have a feeling you're more right than wrong.

I agree with you entirely. If I was marketing director of WotC, I would have one iconic character for each class and a multi-verse campaign setting for them to journey through. Those characters would have books, a cartoon, video games, a comic book, toys, personal social media accounts, adventure modules, etc. The real money is in having a successful D&D brand. I would give the core rulebooks away for free digitally just to expand the brand.

Of course, I'd probably have a hard time convincing my supervisor at Hasbro to forgo $10-20 million dollars worth of book sales for the potential of much more in related brand sales.

archaeo
2015-01-06, 08:14 PM
I agree with you entirely. If I was marketing director of WotC, I would have one iconic character for each class and a multi-verse campaign setting for them to journey through. Those characters would have books, a cartoon, video games, a comic book, toys, personal social media accounts, adventure modules, etc. The real money is in having a successful D&D brand. I would give the core rulebooks away for free digitally just to expand the brand.

Of course, I'd probably have a hard time convincing my supervisor at Hasbro to forgo $10-20 million dollars worth of book sales for the potential of much more in related brand sales.

This doesn't seem like a terrible idea, but then you're using the game to "establish" the brand, insofar as you need the game's iconic characters and campaign setting to be successful via the game before it can be successful in a wider market.

I think Hasbro's licensing department understands that brands can easily be totally disassociated from their original products. Hell, they probably made good money on Battleship, after all. And just last week, I saw a Dungeons & Dragons-branded slot machine.

In any case, we're way, way off topic, and I'd hate to crowd out this, uh, interesting conversation about the merits of various mathematical approaches to optimization.

MeeposFire
2015-01-06, 08:18 PM
Context is important, such as the context of providing an example which is actually relevant to the discussion.

I've already explained how this example is not a relevant one. Most people understand that +3 is better than +1, after all. That is covered in first grade math. Even before first grade if you ask a child if they would prefer 3 cookies or 1 cookie, just showing them each pile in a hand and allowing the child to select the hand, you can be fairly confident that they will select the greater number of cookies.

However, it is slightly more difficult to understand which gives you the most improvement in expected damage: +1 to hit or a feat such as Improved Critical. To understand which is better you must first be capable of understanding how to calculate the proportional increase each option gives you.

And in fact I can, and so far you have given nothing that makes me think the ASI is worth more in general. Right now you have not shown that the ASI is the big money rather than the penny. All you keep doing is citing the proportional increase as if that means anything on its own. It does not. You need to show that the benefit is actually worth something. A 70% increase of essentially nothing would still be essentially nothing.

You need to show that the value of a +1 mod to a given stat is worth a lot on its own. Once you establish that it has value, then we can determine what the proportional increase actually means. Using money again (because most people can understand that easily), let us say a feat is worth +$3, and that if you chose to get an ASI it would give a proportional increase to your ability score value of 50% (a large value, I think). Now let us pretend that your ability score bonus right now is worth $4. In this case, after increasing the value by the amount given, you are now worth $6 (a 50% increase on 4 is a total of 6), which is less than the value gained by taking the feat, $7 (4 base plus the value of the feat). Now if the ability score is rated at $6, then the math changes, as a 50% increase from taking the ASI now gives you a value of $9.

You need to establish that the ASI has a value big enough that the proportional value increase is actually worth more than the value increase from the feat.

GoodbyeSoberDay
2015-01-06, 08:45 PM
However, it is slightly more difficult to understand which gives you the most improvement in expected damage: +1 to hit or a feat such as Improved Critical. To understand which is better you must first be capable of understanding how to calculate the proportional increase each option gives you.

Actually, you just have to calculate the expected damage with +1 to hit and compare it to the expected damage with improved critical. You can then calculate the proportional damage increase if you want; the result of which one is better won't change, since the base damage is the same.

To take the example further, I assume we're talking about the Champion Fighter's crit feature, and bounded accuracy holds. With a base chance to hit c, average weapon die damage d, and bonus damage b, we can calculate the values:

Base damage = c(d+b) + 0.05d
"+1 to hit" damage = (c+0.05)(d+b) + 0.05d = Base damage + 0.05(d+b)
"Imp. Crit." damage = c(d+b) + 0.1d = Base damage + 0.05d

And as we can see, without any proportional anything, a +1 to hit provides greater expected damage than the Champion Fighter's improved critical feature.

To put it another way, the fact that a +1 to hit and damage represents somewhere around a 20% increase in damage (depending on the base values) makes it look bigger, but it doesn't mean it's actually bigger. Polearm Master can represent over a 100% increase in damage. The key is that the base damage is the same in either case, so if you're comparing DPR proportionally rather than absolutely the differences will merely increase proportionally.
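(Here is the same comparison as a quick Python sketch, for anyone who wants to plug in their own numbers; the c, d, b values at the bottom are placeholders rather than any particular build.)

# Expected damage per attack, following the formulas above.
# c = base chance to hit, d = average weapon die damage, b = flat bonus damage.
def base_damage(c, d, b):
    return c * (d + b) + 0.05 * d        # normal hits plus the extra die on a 5% crit

def plus_one_to_hit(c, d, b):
    return base_damage(c + 0.05, d, b)   # = base + 0.05 * (d + b)

def improved_critical(c, d, b):
    return c * (d + b) + 0.10 * d        # = base + 0.05 * d (crits on 19-20)

c, d, b = 0.60, 6.5, 4                   # placeholder values, not a specific build
base = base_damage(c, d, b)
for label, value in [("+1 to hit", plus_one_to_hit(c, d, b)),
                     ("Improved Critical", improved_critical(c, d, b))]:
    print(f"{label}: {value:.2f} expected damage "
          f"({(value - base) / base:+.1%} over the base {base:.2f})")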