Are NP problems lower bounded by exponential order of growth?

My understanding of P vs. NP is quite limited. I understand that P refers to problems solvable by an algorithm whose running time has an upper bound (big O) of $$n^c$$ for some constant $$c$$ and input size $$n$$. My question is: do NP problems have a hypothesized lower bound (big Omega) of $$c^n$$ on deterministic machines? I can’t find this stated anywhere, and I’m trying to understand whether this is something that is assumed or not.

Thanks.

Longest Path Problem on Graphs of Bounded Clique-Width

The Wikipedia article on the longest path problem mentions:

For graphs of bounded clique-width, the longest path can also be solved by a polynomial time dynamic programming algorithm.

However, there are no references, and I wasn’t able to find anything relevant in the literature. Nor was I able to come up with a naive algorithm that would achieve this.

I would appreciate some pointers. If it’s important for the algorithm: I do have constructions of the graphs in terms of the node/union/join/rename operations.
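Since the question mentions having such constructions, here is a minimal, hypothetical Python sketch (the names `create`, `union`, `join`, and `relabel` are my own choices, not from any reference) that evaluates a clique-width expression into an explicit labelled vertex set and edge set. A dynamic-programming algorithm would typically recurse over this expression tree rather than over the evaluated graph:

```python
# A graph is represented as (labels, edges):
#   labels: dict mapping each vertex to its current label
#   edges:  set of unordered vertex pairs, stored as sorted tuples

def create(v, label):
    """i(v): a single vertex v carrying the given label."""
    return ({v: label}, set())

def union(g1, g2):
    """Disjoint union; assumes the two vertex-name sets are disjoint."""
    return ({**g1[0], **g2[0]}, g1[1] | g2[1])

def join(g, i, j):
    """eta_{i,j}: add every edge between a label-i and a label-j vertex."""
    labels, edges = g
    new = {tuple(sorted((u, v)))
           for u in labels for v in labels
           if u != v and labels[u] == i and labels[v] == j}
    return (labels, edges | new)

def relabel(g, i, j):
    """rho_{i->j}: rename every label-i vertex to label j."""
    labels, edges = g
    return ({v: (j if l == i else l) for v, l in labels.items()}, edges)
```

For example, the path a–b–c can be built with three labels: create a and b, join them, retire a’s label, then attach c.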

Finding Smallest Frontier for Graphs of bounded “width”

Let $$G$$ be a graph and $$X=x_1,x_2,\dots,x_n$$ be a permutation/ordering of the vertex set of $$G$$. We then let $$S_i = \{x_j : j\le i\}$$, and let $$F_i$$ be the number of vertices $$v\in S_i$$ that are adjacent to some vertex $$u(v) \not\in S_i$$. Finally, we define $$F$$ to be the list of values $$F_i$$ sorted from largest to smallest; e.g. if $$F_1=2, F_2=1, F_3=6, F_4=2$$ we’d have $$F = 6,2,2,1$$. (We caution that in reality $$F_{i+1}-F_i\le 1$$, so the sequence featured in the example could not actually occur.)
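For concreteness, here is a short Python sketch (the naming is my own) that computes $$F$$ for a given ordering, directly from the definition above:

```python
def frontier_profile(adj, order):
    """Given adjacency sets `adj` (vertex -> set of neighbours) and an
    ordering `order` of the vertices, return the list F of frontier
    sizes F_i sorted from largest to smallest."""
    placed = set()   # S_i, grown one vertex at a time
    profile = []
    for x in order:
        placed.add(x)
        # F_i: placed vertices that still have a neighbour outside S_i.
        f_i = sum(1 for v in placed if adj[v] - placed)
        profile.append(f_i)
    return sorted(profile, reverse=True)
```

On the path 1–2–3 with the natural ordering this gives $$F = 1,1,0$$: after placing 1 and then 2, exactly one placed vertex has a neighbour outside, and after placing all three vertices the frontier is empty.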

In general, finding $$X$$ such that $$F$$ is lexicographically minimal is a task which I’d assume is NP-hard.

However, let $$\mathcal{G}_{k,t}$$ denote the family of graphs $$G$$ whose vertex set is partitioned into $$t$$ parts $$V_1,\dots,V_t$$ such that $$|V_i| \le k$$ for all $$i$$, and such that $$|a-b|\ge 2$$ implies there is no edge $$(u,v)$$ in $$G$$ with $$u\in V_a$$ and $$v\in V_b$$.

For fixed $$k$$, and given $$G\in \mathcal{G}_{k,t}$$, is there an algorithm that finds $$X$$ such that $$F$$ is lexicographically minimal, whose worst-case running time is polynomial in $$t$$?

P=NP when the number of inputs that give 1 is bounded by a polynomial

Suppose there exists some NP-complete problem for which the number of inputs that give 1 as output is bounded by a polynomial; that is, if the problem is $$f : \{0, 1 \}^* \to \{0, 1\}$$, then there is a polynomial $$p$$ such that, for every $$n$$, $$|\{ x \in \{0, 1 \}^n : f(x) = 1\}| \leq p(n)$$. I want to show that the existence of such a problem implies $$P = NP$$. Is it somehow related to LEXSAT?

Finding the count of 0’s bounded by all 1’s

Given an N × M 2-D matrix, find all the 0’s that are completely bounded by 1’s. (This is not a question from an online platform.)

Example:

Input : Row : 3 , Col : 3

1 1 1

1 0 1

1 1 1

Output : 1

Input : Row: 2 Col: 2

1 1

1 1

Output : 0

Input : Row: 4 Col: 4

1 0 1 0

1 1 1 0

1 0 1 1

1 1 1 1

Output : 1

Input : Row : 4 Col : 5

1 1 1 0 1

1 0 1 1 1

1 1 1 0 1

0 0 1 1 1

Output : 2

I have tried to solve this question but am not able to come up with an optimized solution.

My algorithm (brute force):

1. Generate all sub-matrices of the given matrix.
2. Check whether each sub-matrix satisfies the given condition.
3. Increment the counter and repeat steps 1 to 3.
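For what it’s worth, the usual way to avoid enumerating sub-matrices is a flood fill from the border: any 0 that can reach the edge of the matrix through other 0’s is not bounded, and every remaining 0 is. A sketch in Python, assuming 4-directional adjacency and that we count the enclosed 0-cells (both assumptions are mine, but they are consistent with all four examples above):

```python
from collections import deque

def count_bounded_zeros(grid):
    """Count the 0-cells that cannot reach the matrix border through
    other 0-cells (4-directional moves), i.e. the 0s completely
    surrounded by 1s."""
    rows, cols = len(grid), len(grid[0])
    escaped = [[False] * cols for _ in range(rows)]
    queue = deque()

    # Seed with every 0 on the border: those are not bounded by 1s.
    for r in range(rows):
        for c in range(cols):
            on_border = r in (0, rows - 1) or c in (0, cols - 1)
            if grid[r][c] == 0 and on_border:
                escaped[r][c] = True
                queue.append((r, c))

    # Flood fill: any 0 reachable from a border 0 also escapes.
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and not escaped[nr][nc]):
                escaped[nr][nc] = True
                queue.append((nr, nc))

    # Every 0 that never escaped is enclosed by 1s.
    return sum(grid[r][c] == 0 and not escaped[r][c]
               for r in range(rows) for c in range(cols))
```

On the four examples above this returns 1, 0, 1 and 2, and it runs in O(N·M) time and space.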

Would this hit/miss houserule be balanced with bounded accuracy?

Would this houserule be balanced with the bounded accuracy system?

If an attack roll “misses” by one, the defender takes a glancing blow. This glancing blow does not count as a successful hit. It deals half the attack’s damage, to a maximum of 3. This damage cannot be augmented in any way (e.g. Sneak Attack, Divine Smite, or GWM). It does not set off a spell’s damage (e.g. Wrathful Smite), does not count for reaction damage (e.g. Hellish Rebuke), and does not count as a successful hit for things such as maintaining a barbarian’s rage.

This houserule applies to players, enemies, and NPCs alike. What’s good for the players is good for the DM, and vice versa.

The goal of this is to make battles feel more realistic. In D&D 5e you either “hit” or “miss”, but in reality you could hit the person and do very minimal damage as the blow glances off armor.

Why is the Black-White Bakery Algorithm considered bounded?

In his papers on the bakery algorithm, Lamport states that the ticket numbers are unbounded; specifically:

The range of values of number is unbounded.

and

Fortunately, practical considerations will place an upper bound on the value of number[i] in any real application. For example, if processors enter the doorway at the rate of at most one per msec, then after a year of operation we will have number[i] < $$2^{35}$$—assuming that a read of number[i] can never obtain a value larger than one which has been written there.
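The unbounded growth Lamport describes is already visible in the ticket-assignment step alone. A toy, purely sequential Python illustration (not the full algorithm, and the naming is mine):

```python
# Toy illustration of the bakery "doorway": each arriving process takes
# a ticket one larger than the current maximum. As long as some process
# still holds a nonzero ticket, the values grow without bound.
def take_ticket(number, i):
    number[i] = 1 + max(number)

number = [0, 0, 0]       # one slot per process
take_ticket(number, 0)   # process 0 takes a ticket
take_ticket(number, 1)   # process 1 arrives while 0 is still inside
take_ticket(number, 2)   # process 2 arrives while 0 and 1 are inside
```

After this sequence the tickets are 1, 2, 3; with a steady stream of overlapping arrivals the maximum keeps climbing, which is exactly why the plain bakery algorithm needs unbounded registers.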

What I don’t understand is why Taubenfeld’s Black-White Bakery Algorithm solves this problem and is thus considered bounded.

I see no reason why the use of the extra color register would solve this problem.

Thanks

Bounded accuracy increases randomness?

I have been reading D&D 5e for a forthcoming campaign, and I have a small concern with bounded accuracy: the proficiency bonus grows at a slow rate (a level 20 character has only a +6 proficiency bonus, whereas in previous D&D editions this would be a higher bonus in almost any case). Magic weapons were also nerfed (a Holy Avenger is a +3 sword, where in previous editions it was a +5 sword).

All in all, I think the intended design outcome is that a high-level character has a lower bonus to their rolls. I have read about the benefits of this, and while I agree on some points, my concern is that since the bonuses have shrunk, the d20 carries a bigger weight in the outcome of an action.

Is this a real problem, or am I overthinking it? Has it been addressed by the game designers? How do you overcome this in your campaigns?

Dominating set in bounded degeneracy and bounded degree graphs

I believe Minimum Dominating Set (MDS) is NP-hard for graphs of bounded degeneracy, and for their subset, bounded-degree graphs, but a paper appears to suggest tractability: Enumeration of Minimal Dominating Sets and Variants, p. 5.

Proposition 1. Dom admits a polynomial delay algorithm when restricted to

1. Strongly chordal graphs.

2. Graph classes of bounded degeneracy.

The delay of the algorithm is the time between enumerating two MDSs, after polynomial preprocessing (unless I have misunderstood the definitions on p. 3). According to graphclasses.org, MDS is linear for strongly chordal graphs.

Does the paper imply $$\exp(o(n))$$ complexity of MDS for graphs of bounded degeneracy (and bounded degree)?
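For reference, the optimisation problem being asked about is distinct from the enumeration-with-polynomial-delay result the paper proves. A brute-force baseline for the optimisation version (exponential in the number of vertices, for illustration and small experiments only):

```python
from itertools import combinations

def minimum_dominating_set(adj):
    """Brute-force minimum dominating set.
    `adj` maps each vertex to the set of its neighbours.
    Tries candidate sets in increasing size, so the first
    dominating set found is a minimum one."""
    vertices = list(adj)
    for size in range(len(vertices) + 1):
        for cand in combinations(vertices, size):
            dominated = set(cand)
            for v in cand:
                dominated |= adj[v]   # cand dominates itself and neighbours
            if dominated == set(vertices):
                return set(cand)
```

For the path 1–2–3 this returns {2}; the point of the question is whether bounded degeneracy allows anything substantially faster than this kind of exhaustive search.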

Why are armor bonuses considered to break Bounded Accuracy?

It seems to be a common opinion that enchanted armor and shields break the concept of Bounded Accuracy and eventually lead to imbalanced fights. I’ve seen people on RPG.SE saying this, and the popular Sane Magical Prices list claims that armor bonuses

allow the user to boost their AC out of the range of bounded accuracy.

What do people mean by this? That armor is overpowered? Aren’t all monsters still able to hit even high-AC PCs, with at least a 5% chance? And even if armor scales a lot, many monsters have large to-hit bonuses, or save-based attacks.