It’s a staple of the fantasy genre: faced with an obstacle the barbarian can’t punch his way through, the wizard flips through his spellbook until he finds the perfect spell. He reaches into his component pouch, withdrawing—somehow—exactly what he needs, then casts a powerful spell, surprising the heroes and allowing them to continue on.
Wizards don’t get to do that in 5e. They prepare a limited number of spells per day from their spellbook, and unless the other spells in it are tagged as rituals, those don’t get to see use until after the next long rest.
I want to house-rule that a wizard can cast unprepared spells from their book in the absence of exigent conditions. If I have the time and space to crack open my spellbook, being disallowed from mage armor, disguise self, or jump without [8 minus sleep] hours of study feels arbitrary. What changes about the game, especially balance-wise, if wizards are allowed to cast unprepared spells from their spellbooks?
Note: this would be different from [ritual] spell casting from the spell book. In this proposed scheme, casting an unprepared spell would still require slots.
I’m the DM of an OOTA campaign where the party is considering eating Stool, the myconid from the prison. Considering the whole "don’t eat mushrooms you find in the woods" thing we’ve all been told, I felt it would be a good idea to give this some kind of consequence. Maybe he’s poisonous, or hallucinogenic like LSD. Could I get some recommendations for what kind of effects they could get from eating Stool? I know this seems like a VERY open, opinion-based question, but I’m looking for something with some research behind it. Maybe there’s a book or resource on myconid biology? I need something researched and accurate to what would actually happen. Additionally, if the effects would change if Stool were cooked instead of eaten raw, please include that too.
My current table has a house rule that spell attacks cannot deal extra damage through critical hits. They also rule that you can critically succeed or fail on a natural 20 or 1 on saving throws.
I brought up my worry about how this would affect the balance of the game, fairly certain it would impair characters who are primarily casters in the long run. (Most of the PCs are half-casters or not casters at all, save for mine and a couple of others.)
I was told they are going with this ruling because it is more balanced, but I’m not convinced. Could someone with a more thorough understanding of the rules elaborate on how this could unbalance the game? Or am I just worried over nothing?
This is a pay-to-play table that I’ve already paid a subscription for, so I’d rather not be told to just walk away as an answer. The DM has told me that if there really is a balancing issue, he’d see to fixing it.
This is a follow-up to this question (its second half).
So, as stated in that question, I find it quite weird that a 3rd-level Paladin + 2nd-level Ranger is not equivalent to a 5th-level half-caster (such as a 5th-level Paladin), but weaker (equivalent to a 4th-level Paladin).
With that in mind, I intend to use the following multiclassing house-rule for determining the spell slots:
- Sum the levels of the half-casters first. So, in the example, 3 + 2 = 5.
- Divide by two. (Divide by three for the Arcane Fighter/Rogue – again after summing their levels together.)
- Round to the nearest integer, rounding .5 up.
Obviously, this only applies to classes that actually have the Spellcasting feature, i.e., the Paladin and Ranger should be at least 2nd level, and the Fighter or Rogue at least 3rd level.
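For concreteness, here is a minimal Python sketch of the proposed calculation. The function name and rounding helper are my own; I am assuming the half-caster and third-caster groups are each rounded separately before being added together:

```python
from math import floor

def combined_caster_level(half_caster_levels, third_caster_levels=()):
    """Spellcaster level under the proposed house rule.

    half_caster_levels: levels in half-caster classes (Paladin, Ranger),
    each assumed to be at least 2 (so the class has Spellcasting).
    third_caster_levels: levels in Eldritch Knight / Arcane Trickster,
    each assumed to be at least 3.
    """
    def round_half_up(x):
        # Round to the nearest integer, with .5 rounding up.
        return floor(x + 0.5)

    return (round_half_up(sum(half_caster_levels) / 2)
            + round_half_up(sum(third_caster_levels) / 3))

# The example from the question: Paladin 3 / Ranger 2.
# RAW: floor(3/2) + floor(2/2) = 1 + 1 = 2, the same as a single-class
# 4th-level Paladin (floor(4/2) = 2).
# House rule: round((3 + 2) / 2) = round(2.5) = 3, matching a 5th-level Paladin.
```

Note that a 4-level Fighter dip contributes round(4/3) = 1 under this rule, not the 2 that rounding up would give, which is exactly the ASI concern discussed below.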
Such an idea is not novel and already appears in the Artificer, which is explicitly described as having its half-caster levels rounded up.
From my understanding, this house-rule will mirror the behavior of single class spellcasting of half-casters and third-casters more closely (not entirely – rounding up would mirror it perfectly). Is there any weird edge case that I am missing that would make this house-rule imbalanced in any way?
The only reason I round to the nearest integer rather than always rounding up is that, otherwise, a 4th-level Arcane Fighter would contribute as much to spellcasting as a 4th-level half-caster. Although this is what happens in a single class, my gut feeling was that it would make dipping 4 levels into Fighter, for example, considerably stronger than before, since 4th level specifically also includes an ASI.
I am running a homebrew tier 3 adventure in which the players will eventually encounter vampires. I’d like to introduce an item that will be particularly effective against vampires and vampire spawn, without it being obvious that this is the case. In addition, I’m dissatisfied with the efficiency of poisons. Basic poison, in particular, seems very weak for its cost. More expensive poisons don’t seem to scale well either.
I drafted a magic weapon that I believe handles both these problems. The intent is that this weapon appears to be a way to make certain consumables more effective, but in reality is intended to be of superior use against vampires in particular, due to its ability to deliver holy water.
Here is what I drafted:
Infused Yew Dagger
Weapon (dagger), Uncommon
This wooden blade is porous and stained with multiple faded colors. You can apply to the blade any fluid that can be spread on a weapon (such as poison) or flung (such as oil), infusing the blade with that substance.
For 1 hour after application, the blade gains the following property:
The first time you hit a creature with the blade each turn, the effect of the most recently infused substance is applied to the creature in addition to the normal piercing damage (that creature can make any relevant saving throws allowed by the infused substance). If the substance would normally lose its potency after its effect is applied, it does not.
Whenever the blade is unprotected and takes any amount of acid or fire damage, it is rendered useless.
The specific question that I want to be answered is: Does this weapon do anything overly powerful beyond what I intend it to?
For the purpose of this question, we can ignore whether the rarity is correct, as well as whether or not poisons are actually too weak.
My specific intent with this weapon is the following:
- Since it is a dagger, it will not be the usual weapon of choice outside of its niche.
- Since it is wooden, it can be used as a stake for vampire hunting.
- It is a particularly effective way of delivering holy water.
- It does not work more than once per turn, and should therefore not be too abusive given the typically short length of 5e’s combat.
- It allows poisons to be used for longer than their restrictive one minute/one use clauses, while still allowing a save for the more powerful poisons.
- The last clause does not support the less expensive acid option, which I believe would be too strong for the cost.
- The last clause does not support alchemist’s fire, which I feel would be strange for a wooden weapon.
Does this weapon have other severe balance implications, or other abusable options that I have not considered?
Watching a video stream of a game, one of the players, the Bard, asked if he could inspire himself with his Bardic Inspiration feature. The GM objected, saying that the feature specifically requires you to target a creature other than yourself with your Bardic Inspiration. I thought this was strange, but when I looked it up, he was right. In the end, the GM let the Bard inspire himself, and he used the Inspiration die to succeed on an ability check.
I find it weird that Bards can’t inspire themselves and I can’t find a mechanical reason why they should not be able to; they are still expending a resource, anyway. Thankfully this hasn’t come up in our games as the Bard is perfectly happy inspiring others while playing a folk metal tune on his phone. But if it does come up, I’m inclined to say why not?
Is there a mechanical consequence I’m potentially missing if I allow a Bard to Inspire himself?
Would that break the class?
If I create a scroll of the Wish spell and another person uses it to do anything that would trigger the penalty (anything that is not replicating a spell of 8th level or lower causes necrotic damage later and carries a 33% chance of never being able to cast the spell again), who suffers the penalty? Wouldn’t the scroll user technically not be the one casting the spell, given that it was stored in a scroll?
I know it’s a silly scenario, but I would like to know whether it’s relevant, whether there is some sort of theory or study on it, or whether it’s merely a nonsensical situation.
Imagine a language very close to a Turing machine that is compiled to a higher-level language (let’s say C). This language can accept an integer as input, but through its syntax I don’t have any access to its internal representation of that integer; that is, the integer would be located in one single cell, and the only operations I can do on a cell are:
+1, -1, copy/paste from a register, set to 0, check if it’s non-zero.
Now, if I want to output the sign of the integer, I would create a copy of it, so that in one cell the value gets incremented and in the other it gets decremented, until one of them reaches 0; then I would stop and could correctly tell the sign.
The interesting part is that, under the hood, this program would surely use the information of the sign of the integer to execute +1 and -1 correctly…
So I’m using operations that rely on information x in order to obtain information x… Normally I would have access to the representation of the integers, and checking the sign would consist of looking at one particular bit. But in this case some information is hidden from me.
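The procedure described above can be sketched as a small simulation (a Python sketch; ordinary ints stand in for cells, and the function name is my own):

```python
def sign_via_counter(n):
    """Determine the sign of n using only the cell operations listed above:
    +1, -1, copy, set to 0, and a non-zero test.

    One copy is incremented and the other decremented in lockstep; whichever
    copy reaches 0 first reveals the sign. Note that Python's += 1 and -= 1
    already "know" the sign of n under the hood, which is exactly the point
    of the question.
    """
    if n == 0:          # non-zero test on the input cell
        return 0
    a = n               # copy to be incremented
    b = n               # copy to be decremented
    while True:
        a += 1
        if a == 0:      # the incremented copy hit 0 first: n was negative
            return -1
        b -= 1
        if b == 0:      # the decremented copy hit 0 first: n was positive
            return 1
```

The loop terminates after |n| steps, using only the allowed primitives, yet every +1 and -1 it performs depends internally on the very sign bit the program is trying to discover.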
It reminds me of evaluating the limit of sin(x)/x as x goes to 0 by l’Hôpital’s rule… It’s a logical fallacy, because that limit is required in order to differentiate sin(x).
Could there be situations where this is not generated explicitly by us and is inevitable? Does it have more profound consequences?
Last night, I DMed a game with only two players, neither of whom had any prior experience with D&D. All in all, it went great, but towards the end they got into some serious trouble, and the druid seemed somewhat surprised by the fact that she had used all of her spell slots. I didn’t want to punish a beginner too hard, and the only alternatives appeared to be a near-inevitable TPK or some cheesy deus ex machina, so I told her she could try to cast Healing Word despite having used all of her magic power for the day. I let her make a CON save to decide how she could handle the enormous stress of stretching her abilities to such an extent. She rolled quite high, but not extraordinarily high, so I decided that she could indeed successfully cast the spell, but that it might backfire later in some way. I haven’t decided the specifics yet, and in order to keep it interesting but balanced, I am looking for something similar to this in an official source book.
I’m aware that I’m well into homebrew territory with that ruling, and that this is not the right site to ask for inspiration. This is why I am specifically asking the following:
Is there any class or racial feature or any item that allows a character with no remaining spell slots to cast a spell of level 1 or higher at the cost of some negative consequence (e.g. taking a level of exhaustion)?
I am not asking for general ways to simply cast spells without expending a spell slot. There has to be some immediate trade-off. Taking 18 wizard levels in order to gain access to Spell Mastery can of course be seen as quite a trade-off, but I hope it is obvious that this is not what I am looking for.
A discussion of how to handle mechanics for characters who quickly regenerate their wounds resulted in an idea: implement regeneration by changing the type and/or number of Consequences a regenerating character has. If such a character has multiple Mild Consequences and no Severe (or even Moderate) ones, the system naturally produces mechanical effects that closely match what happens in the fiction – the character genuinely gets wounded, but is at full or near-full health in a few scenes.
This seems like an elegant way of making the pacing and mechanics follow the fiction. However, it raises one big question: what is a fair exchange rate between Consequences of different severity levels? Or, alternatively, how much is a Consequence of a given severity worth in terms of the game’s other ‘currencies’?
Before you try to frame-challenge: the answer may be academic for a party consisting entirely of immortals/werewolves/T-1000s/trolls, but it is of interest for a more varied party where some PCs would have such traits and others wouldn’t. It’s certainly a concept that comes up from time to time, and one that I’ve seen discarded due to a lack of confidence about the mechanical utility of such benefits.
Prior research and factors to consider:
It seems that the relative utility of a Mild Consequence hovers around a level of a bit less than 1 Refresh (or 1 Stunt), maybe half of one:
- In general, a Mild Consequence is about as good as one Invocation for preventing a Take-Out, but it comes with one free Invocation for an enemy. It is unambiguously worse than Armour:2 (which is often priced at a Stunt), since it’s only used once per Conflict.
- In a campaign with about 5+ scenes and a couple or more appropriate Conflicts per Minor Milestone, it’s plausible to fill and recover a couple of Mild Consequences per Minor Milestone, but that requires lucky timing. (In my experience that has never happened even once so far. On the contrary, Conflicts seem rare in the games I’ve been in, even action-oriented ones.)
- A Consequence can be of use as a Success at a Cost and some other similar applications, but those seem rare.
However, even with that in mind, I find it harder to compare the value of Consequences of other severities:
- they act to absorb more Shifts…
- …while still only providing 1 Free Invocation to the enemy;
- but they are significantly slower to recover from, meaning you can only use them rather rarely.
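One crude way to compare severities numerically is shifts absorbed per unit of recovery time. Here is a minimal Python sketch, assuming Fate Core's standard absorption values (Mild 2, Moderate 4, Severe 6 shifts); the recovery-time figures in scenes are my own rough guesses, not book values, so adjust them for your table's pacing:

```python
# Shift absorption per severity is from Fate Core; the recovery-time
# figures (in scenes) are rough assumptions, not book values.
consequences = {
    "Mild":     {"shifts": 2, "recovery_scenes": 1},   # next scene or so
    "Moderate": {"shifts": 4, "recovery_scenes": 3},   # ~one full session
    "Severe":   {"shifts": 6, "recovery_scenes": 9},   # ~one scenario
}

rates = {name: c["shifts"] / c["recovery_scenes"]
         for name, c in consequences.items()}
# Mild comes out far ahead per scene of downtime (2.0 vs ~1.33 vs ~0.67),
# but a Severe Consequence absorbs three times as many shifts in a single
# emergency - which is the trade-off described in the list above.
```

Under these assumptions, Mild Consequences dominate in sustained attrition, while the higher severities are better thought of as burst capacity that you can rarely afford to spend.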
I’ve seen opinions going in both directions about whether Moderate and Severe Consequences are more or less powerful than Mild ones (I used to hold the latter opinion, but am no longer confident in it after the main drawback was pointed out to me).
Could anyone please help me evaluate their relative values, whether backed up by mathematically well-founded theorycraft, or by sufficient actual-play experience and practical comparisons?