If you work in any form of engineering then this is probably a familiar idea. I just want to talk about how valuable I find it to be when it comes to RPG design. I've always really liked that the standard in RPGs is to have new "editions", rather than straight-up sequels. And because it is, to greatly generalize, a fairly scrappy and accessible hobby, we get to do lots of communal collaboration. We build on each other's work. We actively encourage the theft of good ideas (within the bounds of intellectual property rights). Most RPGs list their "Rule 0" as being something along the lines of "the GM can and should ignore or change any part of the game they want to if they judge it best for their group." It's like you have a game designer at every table.
The problem is that a lot of folks are pretty amateur as game designers. The single biggest failing, I think, comes from this very gap: not enough would-be designers are engaging with iterative design.
You look at what's come before and you use it as a basis for what you'll create anew. You examine the previous version to understand its design, paying attention to the context which created it and asking yourself whether or not those same factors remain relevant. And at the very least, the common corollary to that Rule 0 is this: "a good GM will first make an effort to understand the original rule's purpose before deciding to change it." All-too-often ignored wisdom.
I especially find this to be common in two cases: 1) people complaining about design they don't understand, and 2) people making poorly thought-out houserules. Let's talk about some examples.
Attacks and Hitting Stuff
This is something that you can do in every edition of D&D but it pretty much always changes whenever they make a new one. It's funny because you can look at the method of attacking in every new edition and nearly always think, "wow, this is clearly an improvement. In fact, how didn't they come up with this before? How on Earth wasn't this how we were always doing it?" That's certainly what I thought when 5E came out and I compared it to 3E. And that same kind of dismissal is how most people still talk about THAC0, the system 3E replaced and one of the enduring jokes of D&D history.
But they all make sense when you look at them in the context of their time. They were all the product of somebody looking at the previous iteration, and either thinking about how it could be improved or not thinking about it. Let's go through them one-by-one (a good deal of this section will be taken from this legendary Reddit post by u/gradenko_2000).
In OD&D and other early editions, your basic attack roll relied on an "attack matrix" that each class had. From the post:
In the beginning, Gary Gygax played wargames. In wargames, you would have something like an Attack value and a Defense value. You would also have a table on the game's rulebook: If attacker's attack value is x, and the defender's defense value is y, you roll a die and cross-reference the result against the chart (attack values on the x-axis, defense values on the y-axis) to see if you scored a hit.
Specifically, he played naval wargames. The term Armor Class refers to ships: how thick, and how well-covered the ship was in armor plates. An AC of 1 was very good: it meant first-class armor. AC 2 meant second-class, and so on, such that a higher numerical value for AC meant that the protection granted by the armor was worse, and so it was easier to score a damaging hit against the ship.
When Gygax and his contemporaries were finally writing/designing D&D, they carried over this habit:
So if you were a level 2 fighter, and you were attacking a target with AC 9, then you cross-reference the table and see that the intersection of "level 2" and "AC 9" is "10." Meaning that a level 2 Fighter needs to roll a 10 or better on their d20 to hit a target with AC 9. Continuing from the post:
The expectation was that you'd write down the number that you needed to roll to hit various targets of different ACs, like so:
There wasn't even math involved - you'd roll your die, compare it to the AC of your target (either you ask the DM or they declare it beforehand) and you'd know right then and there if you scored a hit or not. If you had an attack bonus from STR and/or from a +1 weapon, you'd either factor it in to the list of numbers you wrote, or you added it in your head after rolling the d20 (okay, a little math was involved)
The thing is though, this system works well when you're playing with warships: the attack value of the USS Iowa isn't ever going to change, and neither is the AC of the Bismarck, but in D&D, if your target number keeps shifting because you gained then lost Bless, or you're attacking with a bow instead of a sword, or you're using a sword that you're specialized in versus a polearm that you're not, then using a chart or a list of target numbers can become confusing or tedious.
So you can see how the original attack system was, well, weird and unintuitive... but you understand why they used it when you find out its origins. It was carried over from a previous iteration (that is to say, another game entirely) and it made sense for that version of things (since, as the poster says, the attack value of naval warships would be static).
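If it helps to see the "no math at the table, just a lookup" idea concretely, here's a minimal sketch of that kind of chart as a table you simply index into. The level 2 Fighter's numbers against AC 9 and AC 0 are the ones from the example above; the rest of the row is a linear fill purely for illustration, since the real matrices had their own quirks.

```python
# A toy attack matrix: cross-reference the attacker's row against the target's AC.
# The AC 9 and AC 0 entries match the level 2 Fighter example above; the rest of
# the row is filled in linearly just for illustration.
ATTACK_MATRIX = {
    "level 2 fighter": {ac: 19 - ac for ac in range(9, -1, -1)},  # AC 9 needs 10 ... AC 0 needs 19
}

def roll_needed(attacker_row: str, target_ac: int) -> int:
    """Look up the d20 result needed to hit. No arithmetic, just the chart."""
    return ATTACK_MATRIX[attacker_row][target_ac]

print(roll_needed("level 2 fighter", 9))  # 10
print(roll_needed("level 2 fighter", 0))  # 19
```

The list of target numbers you'd write on your character sheet is just one of these rows, precomputed.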
This is an example of poor iterative design: previous elements being carried forward when they shouldn't be. It's an example of a designer failing to ask whether or not the mechanic they are using is actually right for their purpose or if it's just something they're used to. So that brings us to THAC0, which was the new system used in AD&D 2nd Edition.
So the story goes that there were Computer Science students who played a lot of D&D in the 80s, and they came up with an idea: if they could make a formula to capture the progression of the table, then they wouldn't need a chart, and any adjustments due to STR or whatever would just be +1s and -1s to the formula.
That's where THAC0 comes from. It means To Hit AC 0. Let's go back to the chart I posted above:
- X-Axis: a level 2 Fighter
- Y-Axis: a target with AC 0
- The intersection is 19, so a level 2 Fighter needs to roll a 19 or better to hit a target with AC 0
The way the formula works is: THAC0 - target's AC = roll needed to hit
So let's try that with the first example: A level 2 Fighter has a THAC0 of 19, and they're trying to hit a target with AC 9
- THAC0 - target's AC = roll needed to hit
- 19 - 9 = roll needed to hit
- 10 = roll needed to hit
- a level 2 Fighter needs to roll a 10 or better to hit a target with AC 9
And it matches. So instead of a big chart that covers 20 levels and 20 AC values, for every class, you just have something that looks like this:
And instead of 5, 10, or 20 lines on your character sheet about what you need to roll to hit a target, you just need one: your current THAC0, or, as the AD&D 2e PHB recommended, one THAC0 number for every weapon combination.
As you gain levels, your THAC0 becomes lower, making same-AC enemies easier to hit, because the required roll is lower.
As your enemies' AC decreases, they become harder to hit, because the subtrahend in the THAC0 formula is smaller, meaning the final result is higher, meaning the required roll is higher.
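In code, the whole system really is that one line of arithmetic. A minimal sketch, using the level 2 Fighter's THAC0 of 19 from the example above:

```python
def roll_needed(thac0: int, target_ac: int) -> int:
    """THAC0 minus the target's AC is the d20 result you need to hit."""
    return thac0 - target_ac

# The worked example: a level 2 Fighter (THAC0 19) against AC 9.
print(roll_needed(19, 9))   # 10
# Better (lower) AC means a higher required roll...
print(roll_needed(19, 0))   # 19
# ...and negative AC flips the subtraction into addition, which comes up again below.
print(roll_needed(19, -2))  # 21
```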
This is why THAC0 deserves a lot of credit. I mean, for the love of God, how did it really take so long for somebody to look at the attack matrices and ask, "why don't we just write this out as a function?" It's literally 9th grade math. If all the matrix results adhere to a formula anyway, why not just use the formula? And look at that 2nd Edition THAC0 table above. See how much info they squeezed into such a small space? Previously you'd need a whole matrix for every separate class in the game like the very first table I embedded.
But the day came for 3rd Edition to be created and they needed to look at the state of things and... by golly, you just have to admit that THAC0 still isn't perfect.
And it was still a clumsy system: an attack bonus from STR or from a +1 weapon would reduce your THAC0, and if you were attacking a monster with negative AC, then, in line with basic algebra, [THAC0 - (- AC)] would turn into [THAC0 + AC], and since it was a subtraction operation, the order of the numbers always mattered.
So yeah, that's where they came up with the idea of the d20 system. Out of all the many, many forms of task resolution in AD&D, the one they thought to use as their jumping-off point was the attack mechanic, which they streamlined into "roll 1d20 + modifier to beat AC." And then they realized that they could use this as a universal mechanic for nearly everything in the game, further reinforcing it as an intuitive system. You roll 1 die for everything, you always want to roll high, and you always want higher bonuses. You want your own character to have a high AC so that enemies have to roll really high to hit you. It's just plain simple. Up and up and up.
You know what I really love about d20 mechanics? The probability is so easy to do in your head. Every point on the die represents 5%. So if you get a +3 bonus, that's a +15% chance to succeed. Adding and dropping +2s and -1s and +4s and whatever all over the place is a breeze!
...Until things got out of control. The dreaded "modifier dogpiling" set in. It bogged down even simple checks. Sure, it's just simple arithmetic, but it's still a pain in the ass. Even worse, the "DC treadmill" became a thing.
So the core modifiers to your basic rolls will increase as you level up, representing your gain in power. Your ability modifiers increase, your BAB goes up, you keep getting more skill points, you get magic weapons with bonuses, etc. The problem is that these add up fast. An unmodified d20 roll has a 50% chance of resulting in 11 or higher. Therefore, if you ask a party of NPC commoners to make a DC 11 check, you should expect 50% to succeed and 50% to fail. But even a level 1 PC has, like, a +5 on some of their stuff, so if you want them to have a 50/50 chance at something you need to set the DC to 16. And by the time they hit level 5, this shit has already become +10 or even as high as +15 on some stuff, if they're min-maxing. You have to start setting DCs at 20+ shockingly early if you want any chance of failure at actions. And I really mean any chance. It's simple mathematics. If you have a +15 modifier to a d20 roll then the minimum number you can possibly roll is 16, so any task that's DC 16 or below should be an automatic success. For you to retain that same range of 1-20 results, you're talking about... well, a possibility range of 16-35. And yeah, by the time you're level 10, you are regularly seeing DCs in the 40s.
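To put rough numbers on that treadmill, here's a quick sketch. The bonuses are illustrative assumptions echoing the paragraph above, not figures from any actual character sheet, and it uses the 3E convention that meeting the DC counts as a success.

```python
# For a given total modifier:
#   - results span (1 + mod) through (20 + mod)
#   - any DC up to (1 + mod) is an automatic success
#   - a 50/50 chance means needing an 11+ on the raw die, i.e. DC = 11 + mod
def result_range(mod: int) -> tuple[int, int]:
    return 1 + mod, 20 + mod

def dc_for_coin_flip(mod: int) -> int:
    return 11 + mod

# Illustrative bonuses only (hypothetical characters).
for label, mod in [("commoner", 0), ("level 1 specialist", 5), ("level 5 min-maxer", 15)]:
    low, high = result_range(mod)
    print(f"{label}: results {low}-{high}, auto-success up to DC {low}, 50/50 at DC {dc_for_coin_flip(mod)}")
```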
This is fucking broken. Everything scales like crazy and it creates problems. The DM has trouble setting fair DCs for stuff. Sure, it's true that tasks should get easier as players level up, but the rate at which their bonuses increase means that tasks will be literally impossible to fail by, like, level 7... unless the DM starts correspondingly increasing the DCs to keep up the pace.
And then you run into weird narrative issues, like having to explain and justify those increasing DCs in-universe. Sure, the monsters you fight are getting stronger and stronger. But why is the DC to pick this lock at 10th level around 40 whereas picking a lock at 1st level was a 10? Well, maybe it's some kind of master-craft dwarven mithril arcane super-lock with defensive runes, I guess. Are you ready to keep adding more difficulty justifications like this every single level?
Alright, so why is the DC to negotiate with a regular human guard around 40 when it was only 10 at first level? You wanna tell me that, by sheer coincidence, all of the guards that the players are talking to at this level are world-class debaters with several levels of bard each? Fuck off.
Plus, the disparity between an expert and an amateur gets ridiculous as well. Don't get me wrong, a rogue should always be the best lockpicker in the party, and a bard should be the best talker. But dear lord it shouldn't be mathematically impossible for the other PCs to succeed at those tasks if they try their hand at it. If you haven't been investing skill points in certain things by the time you are level 5, then you will, without exaggeration, literally never be able to do them again.
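Here's that expert-versus-amateur gap in rough numbers. I'm assuming 3E-style skill math (max ranks equal to character level + 3 on a class skill) plus a healthy ability modifier; the exact figures are hypothetical, just there to show the shape of the problem.

```python
# Chance of hitting a DC on d20 + modifier, where meeting the DC succeeds
# (and skill checks get no automatic success on a natural 20).
def success_chance(modifier: int, dc: int) -> float:
    return sum(1 for roll in range(1, 21) if roll + modifier >= dc) / 20

level = 10
invested = (level + 3) + 5   # max ranks on a class skill plus an assumed +5 ability modifier
untrained = 0                # never spent a point on it

dc = 11 + invested           # a DC tuned so the specialist succeeds about half the time
print(success_chance(invested, dc))   # 0.5
print(success_chance(untrained, dc))  # 0.0 -- impossible even on a natural 20
```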
So then 4th Edition happened.
And then 5th Edition happened and they came up with this brilliant thing called "bounded accuracy." Don't get me wrong, there are certain things that definitely got left behind which some folks miss. But on the whole, I cannot imagine preferring the old way as a DM.
So they scrapped the simple arithmetic modifiers almost across the board. Very rarely is there anything that gives you a +2 this or a -3 that anymore. Instead, you can either get advantage or disadvantage, meaning that you roll 2d20 and you take either the better or the worse result. Here's some math if you'd like. The point is that, with advantage, you can increase the average outcome without actually shifting the range of possible results up past 20. The full range of possible rolls is still "bound" within the 1-20 range of the die; advantage just skews results closer to the top, and disadvantage skews them toward the bottom. For simplicity's sake, the game recommends that you treat advantage or disadvantage as functionally being "about plus or minus 5 to the roll," but obviously that doesn't really capture the true effect it has on the probability.
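If you want to see that true effect in numbers, here's a quick sketch that just enumerates d20 outcomes rather than simulating anything:

```python
from fractions import Fraction

# Chance that a single d20 meets or beats a target number.
def straight(target: int) -> Fraction:
    return Fraction(max(0, min(20, 21 - target)), 20)

# Advantage fails only if both dice miss; disadvantage succeeds only if both hit.
def advantage(target: int) -> Fraction:
    return 1 - (1 - straight(target)) ** 2

def disadvantage(target: int) -> Fraction:
    return straight(target) ** 2

for f in (disadvantage, straight, advantage):
    print(f.__name__, float(f(15)))  # 0.09, 0.3, 0.51
```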
And the effects they do have are pretty huge. From that post: "There’s less than a 9% chance of rolling 15 or higher with disadvantage, whereas there’s a 30% chance normally and a 51% chance with advantage." The rules state that you cannot stack multiple advantages or disadvantages, but honestly you don't really need to. Once you've secured advantage then you know you've got it pretty good. In 3E there were a lot of instances where a PC would spend forever fishing for every single bonus they could find that might apply to the situation.

So in the original Castlevania your basic attack is with a whip, and it sucks. It sucks on purpose. It takes several frames to wind up and it has a pretty short range and it can only hit directly in front of you. So then there are a bunch of special side weapons that you can acquire each level, and they each use up ammo. They all have a different range of where they hit, and are generally more flexible and useful than the whip. So, much of the skill in the game is optimizing the situations where you use your limited supply of good weapons, so you don't have to rely on your shitty whip.
What You Should Take Away From This
Honestly, nearly everything I've ever done with RPG design is some kind of modification. I don't make original things; I'm not good enough to do that. Advanced Darkness is a houserule for 5E light rules, and closely related are my 5E fumble rules. Brave is just a hack of Ben Milton's Knave, and various things within it are just iterations on other people's designs. The Brave Enchiridion book that adds a class system to the game has, as its central standout feature, a "deed-based advancement system" rather than XP. But that was just based on a combination of Jeff Rients's "Carousing-for-XP" houserule and the Druid and Monk advancement rules from OD&D, which you can see me slowly figuring out in this ancient post. I mean, for fuck's sake, look at how much research I did just to make a procedure for my urban gameplay.
I believe that iterative design is a good place to start with game design because it's almost always easier to play something and think of ways it could be improved than it is to come up with something brand new. It does require that you consult previous related work before trying to make something from nothing. Too many people try to reinvent the wheel, when they should be looking at other people's wheels for comparison. Especially if they discover that somebody out there has already invented a fucking hovercraft and even discussing wheels any further would be absurd.
But of course, you also shouldn't build on previous work too much, lest you carry forward certain elements that have no reason to be there. You have to think critically about the purpose of every element in the game and why it's there. I think that all the best updates D&D makes with each edition are the small things they realize they don't need to carry forward. I'm kinda praying that 6E does away with saving throws and the ability score/modifier distinction. But that's just me.
-Dwiz
* [EDIT] It was recently brought to my attention by Ava of Errant fame that Jeff originally got this from Dragon #10, in an article entitled Orgies, Inc. I've been using that table for years and had no idea. I don't think it detracts from my point, but I probably won't be citing Jeff's blog post again in the future.
Well said.
Great post. There are so many gem quotes in here I want to pull out.
You and I both have whistled under our breath and thought "why weren't we doing this the entire time?" when encountering a good new piece of design from a new game edition.
You nailed the DC treadmill of 3E and articulated the benefits of 5E. (You notably skipped over some of the "whys" and "wherefores" of 4E -- elements of iterative design that absolutely solve the problems games like Pathfinder are MIRED in -- but I will forgive you for that).
"Likewise, I've many times seen people make entirely new sub-systems and procedures for games like 5E that aren't really cohesive with the rest of the game's design." I quite agree. There is one tonal thing I want to push back on. I see this sometimes in RPG design forums so I'm a little sensitive to it, but I sometimes see people attacking GM's attempts to hack a game. "You need to understand the game before you start hacking it!" they protest.
Maybe.
But hacking a game and having a few bad sessions is a *great* game design experience. Hacking is how you understand how games work. Knock out the support structures and watch the ceiling come down on your head. Hack things until they fall apart in your hands.
You know what? You're damn right.
Lord knows I've made a lot of really dumb changes and mistakes along the way so yeah, you're right. No advice I give trumps the ultimate game design principle: play more and experiment more and just keep paying attention.
I agree that you can start hacking right away, but people often complain that a mechanic "is bad" in their process of explaining a hack. There are a lot of misinformed explanations that come out of this issue.
Hack anything and everything! But it's tough to critique the original without trying it.