## What happens if you reroll and the new highest die is lower than the original?

I was reading the Cthulhu Dark rules again, and I noticed something that wasn’t specified.

The section on rerolling says, in part:

If you included your Insanity die in the roll and you’re not happy with the result, you may reroll (all the dice). If you didn’t include your Insanity die before, you may add it now and reroll.

Afterwards, look at the new result. As before, the highest die shows how well you do.

It does not say anything about what happens if the highest die after rerolling is lower than the highest die before the reroll. Do you still use the new roll in that case?

## Can a monster cast a higher level spell using a lower level spell slot? (Shadow Fey Enchantress)

Is there a (monster creation) rule that allows a higher-level spell to be cast using a lower-level slot? I ask because I came across the "Shadow Fey Enchantress" (Kobold Press "Tome of Beasts", pages 170-172), which states:

Spellcasting. The shadow fey is a 10th-level spellcaster. Her spellcasting ability is Charisma (save DC 15, +7 to hit with spell attacks). She knows the following bard spells.

Cantrips (at will): blade ward, friends, message, vicious mockery
1st level (4 slots): bane, charm person, faerie fire
2nd level (3 slots): enthrall, hold person
3rd level (3 slots): conjure fey, fear, hypnotic pattern
4th level (3 slots): confusion, greater invisibility, phantasmal killer
5th level (2 slots): animate objects, dominate person, hold monster

However, Conjure Fey is a level 6 spell: https://www.dndbeyond.com/spells/conjure-fey

I checked the errata, but it doesn’t mention this as an error. Perhaps it was overlooked, or perhaps there is a monster creation mechanic that I’m not aware of?

EDIT: Bonus points for a suitable replacement spell (if indeed an error)!

## Do indexer links need backlinks on a lower tier to get indexed?

Do indexer URLs need a tier of backlinks below them to get indexed?

## Background

I’m playing a mid-level artificer (artillerist): a disgruntled veteran with a missing limb who, disillusioned by his leaders’ willingness to send soldiers to their deaths, has retired from the army and opened a shop. An adventure hook has people steal his work-in-progress masterpiece, and now I need to find a fitting item he was trying to create.
Because of this background, the item he would be most interested in would be something that helps ordinary soldiers without magic powers survive the horrors of the battlefield. It might be something that protects a group of people from hostile spells or something that provides healing to them, similar to the artificer’s Protector cannon.

## Criteria

• I am trying to find an officially published item before resorting to homebrew (UA is probably fine, as is basic refluffing)
• The DM has ruled that the item should be below legendary rank, so very rare at most
• I probably won’t be held to strict prerequisites such as being able to cast every spell going into the items myself, but the item should still basically fit the artificer flavour
• The item should be usable by someone who cannot cast spells
• The item should be able to affect a group, not just the carrier
• The item should be defensive in nature

## My own research

I’ve gone through the "warding" and "healing" categories of magic items on D&D Beyond and found very little. There are almost no items that work on groups, and those that do tend to be musical instruments or magic staves that need the user to be a spellcaster.
In general it seems that antimagic items aren’t really a thing in 5e. An item that can cast Antimagic Field on the regular would probably be in the legendary category, and a Ring of Spell Storing would again require a (powerful) spellcaster to be useful.
An ideal solution would be something like a banner of protection or an Eldritch Cannon: Protector that doesn’t need an artificer to be present. I’ve also considered something like a Ring of Regeneration, but that’s again a one-person item.

## Are NP problems lower bounded by exponential order of growth?

My understanding of P vs. NP is quite limited. I understand that P refers to problems solvable by an algorithm with an upper bound (big O) of order of growth $$n^c$$ for some constant c and input size n. My question is: do NP problems have a hypothesized lower bound order of growth (big Omega) of $$c^n$$ for deterministic machines? I can’t find this stated anywhere, and I’m trying to understand whether this is something that is assumed or not.

Thanks.

## Why the decision tree method for a lower bound on finding a minimum doesn’t work

(Motivated by this question. Also I suspect that my question is a bit too broad)

We know $$\Omega(n \log n)$$ lower bound for sorting: we can build a decision tree where each inner node is a comparison and each leaf is a permutation. Since there are $$n!$$ leaves, the minimum tree height is $$\Omega(\log (n!)) = \Omega (n \log n)$$.

However, it doesn’t work for the following problem: find the minimum in an array. For this problem, the results (the leaves) are just indices of the minimum element. There are $$n$$ of them, and therefore the reasoning above gives an $$\Omega(\log n)$$ lower bound, which is obviously too weak: finding the minimum actually requires $$n - 1$$ comparisons.

My question: why does this method work for sorting but not for the minimum? Is there some deeper intuition, or does "it just happen" that we were "lucky" that sorting has so many possible answers?

I guess the lower bound from the decision tree makes sense in itself: we indeed can ask yes/no questions such that $$O(\log n)$$ answers suffice; namely, we can binary-search for the desired index. But my question still remains.
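The gap between the two leaf counts can be seen numerically. Here is a minimal sketch (plain Python, just evaluating the bounds discussed above) showing that the information-theoretic bound matches the truth for sorting but undershoots badly for finding the minimum:

```python
import math

# Decision-tree leaf counting: sorting has n! possible outcomes, so any
# comparison tree needs height >= ceil(log2(n!)) ~ n log n.
# "Find the minimum" has only n outcomes, so the same argument gives
# only ceil(log2(n)), far below the true cost of n - 1 comparisons.
for n in [8, 64, 1024]:
    sorting_bound = math.ceil(math.log2(math.factorial(n)))
    minimum_bound = math.ceil(math.log2(n))
    true_minimum_cost = n - 1
    print(n, sorting_bound, minimum_bound, true_minimum_cost)
```

For n = 1024 the leaf-counting bound for the minimum is 10, while the real cost is 1023, which is exactly the mismatch the question is about.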

## Scheduling jobs online on 3 identical machines – a lower bound of 5/3

Consider the Online Scheduling Problem with $$3$$ identical machines. Jobs, with arbitrary size, arrive online one after another and need to be scheduled on one of the $$3$$ machines without ever moving them again.

How can I show that there can’t be any deterministic online algorithm which achieves a competitive ratio of $$c<\frac{5}{3}$$?

This should be solvable by giving some instance $$\sigma$$ and arguing that no deterministic algorithm can do better. The same can easily be done for $$2$$ machines and $$c<\frac{3}{2}$$. Sadly, I can’t find any solution to this (more or less) textbook question.
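For reference, the $$2$$-machine adversary mentioned above can be traced in code. This sketch only plays the instance against a greedy (least-loaded) strategy; the actual proof has to case-split over every deterministic algorithm, but the adversarial instance is the same:

```python
# Adversary for 2 machines: send two unit jobs. If the algorithm stacks
# them on one machine, stop (makespan 2 vs OPT 1, ratio 2). If it splits
# them, send one job of size 2: makespan >= 3 while OPT = 2, ratio 3/2.
def greedy_makespan(jobs, machines=2):
    loads = [0.0] * machines
    for job in jobs:
        loads[loads.index(min(loads))] += job  # place on least-loaded machine
    return max(loads)

makespan = greedy_makespan([1, 1, 2])  # greedy splits the two unit jobs
opt = 2                                # OPT: {1, 1} on one machine, {2} on the other
print(makespan / opt)                  # 1.5
```

The 3-machine bound of $$\frac{5}{3}$$ follows the same pattern with a longer job sequence and more case analysis.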

## I have a scenario that will likely never happen, but I am curious about how it would work.

First, the preliminary, Combining Magical Effects:

The effects of different spells add together while the durations of those spells overlap. The effects of the same spell cast multiple times don’t combine, however. Instead, the most potent effect — such as the highest bonus — from those castings applies while their durations overlap, or the most recent effect applies if the castings are equally potent and their durations overlap.

Now the setup:

Bob the 3rd-level Generic Wizard and Doug the 3rd-level Earth Wizard face off. Bob knows Doug’s favorite tactic and readies a spell. Doug casts Earth Tremor under Bob. The spell says, "You cause a tremor in the ground within range. Each creature other than you in that area must make a Dexterity saving throw."

But Bob was ready and, as his reaction, casts Earth Tremor as a 2nd-level spell, making it more potent (more damage) in the same area.

## So what happens?

Doug’s casting means Doug is not targeted by the tremor. But as a reaction Bob casts a more powerful version where Bob is not targeted by the spell. So does that overpower Doug’s spell? And if so, does that mean Bob no longer has to make a Dexterity save and Doug does?

There may be other spells that do this but Earth Tremor was the first I found with wording stating that regardless of the target point, the caster is not affected.

## Lower bound on comparison-based sorting

I have a question from one of the exercises in CLRS.

Show that there is no comparison sort whose running time is linear for at least half of the $$n!$$ inputs of length $$n$$. What about a fraction of $$1/n$$ of the inputs of length $$n$$? What about a fraction $$1/2^n$$?

I have arrived at the step where, for a linear-time sorter making at most $$cn$$ comparisons, the decision tree has at most $$2^{cn}$$ leaves, which is smaller than the $$n!$$ leaves required, so this is a contradiction. But I am unsure how to formally write out the proof and extend it to the other fractions. The question also says "for at least half of the $$n!$$ inputs of length $$n$$", and I do not quite understand how that affects the number of leaves in the decision tree, as any input of length $$n$$ still has $$n!$$ possible permutations.
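One way to sanity-check the counting step is to compare the leaves available to a subtree of height $$cn$$ against the $$n!/2$$ leaves needed to handle half of all inputs. A small numeric sketch, with an assumed constant $$c = 2$$ (any fixed $$c$$ behaves the same for large enough $$n$$):

```python
import math

# A comparison sort that is linear (<= c*n comparisons) on some set of
# inputs handles those inputs inside a subtree of height <= c*n, which
# has at most 2**(c*n) leaves. Sorting half of all inputs needs n!/2
# distinct leaves, and n!/2 eventually outgrows 2**(c*n).
c = 2  # assumed constant factor; any fixed c fails once n log n dominates c*n
for n in [8, 16, 32, 64]:
    available = 2 ** (c * n)
    needed = math.factorial(n) // 2
    print(n, available < needed)  # True once 2**(c*n) can no longer cover n!/2
```

The same comparison with $$n!/n$$ or $$n!/2^n$$ in place of $$n!/2$$ shows why the other fractions barely help: dividing $$n!$$ by $$n$$ or $$2^n$$ only subtracts a lower-order term from $$\log(n!)$$.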

## Can an algorithm’s complexity be lower than its tight lower bound / higher than its tight upper bound?

The worst-case time complexity of a given algorithm is $$\Theta(n^3 \log n)$$.

• Is it possible that the worst-case time complexity is $$\Omega(n^2)$$?
• Is it possible that the worst-case time complexity is $$O(n^4)$$?
• Is it possible that the average time complexity is $$O(n^4)$$?

IMO it is possible as long as you control the constant $$c$$, but then what’s the point of mentioning any bound other than the tight bounds?
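For what it’s worth, non-tight bounds are still valid bounds, just less informative. A quick numeric sketch of the relations in the question (using constant factors of 1, which happen to suffice for all $$n \ge 2$$):

```python
import math

# If the worst case is Theta(n^3 log n), it is also Omega(n^2) and
# O(n^4): for n >= 2 we have 1 <= log2(n) <= n, hence
# n^2 <= n^3 * log2(n) <= n^4, with no constant tuning needed.
def worst_case(n):
    return n**3 * math.log2(n)

for n in [10, 100, 1000]:
    print(n, n**2 <= worst_case(n) <= n**4)  # True for each n
```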