Minimum spanning tree with small set of possible edge weights

Given an undirected graph that has only two distinct edge weights $x$ and $y$, is it possible to devise an algorithm faster than Prim's algorithm or Kruskal's algorithm?

I have seen that Kruskal's algorithm already assumes the edges can be sorted in linear time with counting sort, so can we gain any further benefit by knowing the weights of all of the edges ahead of time?

I don’t think there is any benefit, but I am not sure…
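For what it's worth, here is a minimal sketch (in Python, with all names my own) of the one simplification two weights buy you: the sorting phase of Kruskal's algorithm disappears entirely, because you can process all the $x$-weight edges and then all the $y$-weight edges, assuming $x < y$:

```python
class DSU:
    """Disjoint-set union with path halving and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, a):
        while self.parent[a] != a:
            self.parent[a] = self.parent[self.parent[a]]
            a = self.parent[a]
        return a

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return True


def two_weight_mst(n, edges, x, y):
    """Kruskal's algorithm specialized to two edge weights x < y:
    instead of sorting, add all x-edges first, then all y-edges.
    edges is a list of (u, v, w) tuples; returns the total MST weight.
    Runs in O(E * alpha(V)) time, with no sorting pass at all."""
    assert x < y
    dsu = DSU(n)
    total = 0
    cheap = [(u, v) for u, v, w in edges if w == x]
    costly = [(u, v) for u, v, w in edges if w == y]
    for u, v in cheap:
        if dsu.union(u, v):
            total += x
    for u, v in costly:
        if dsu.union(u, v):
            total += y
    return total
```

Since counting sort is already linear, the saving here is only the sort pass itself; the union-find phase dominates either way, which is consistent with the feeling that there is little asymptotic benefit to gain.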

How do you determine whether the constraint length k is a small value or a large value?

For small values of k, this is done with a widely used algorithm developed by Viterbi (Forney, 1973).

My question is: how do they determine whether a value of k is considered small or large? What is the threshold value for k? For example, the constraint length of this code is 7 and they consider it a small value. What about 10, or 20? Would those be considered small or large? I'm curious about the threshold value of k.

This is an excerpt from Computer Networks by Andrew S. Tanenbaum:

The Data Link Layer (Chapter 3), page 208, fifth edition.
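One piece of context on why the cutoff is fuzzy: under the usual textbook accounting (which I am assuming here), a standard Viterbi decoder keeps one trellis state per possible content of the encoder's k-1 bits of memory, so the work per decoded bit is proportional to 2^(k-1). A tiny sketch (the function name is my own):

```python
def viterbi_states(k):
    """Number of trellis states in a standard Viterbi decoder for a
    convolutional code with constraint length k: the decoder tracks
    the k-1 bits of encoder memory, giving 2**(k-1) states, and the
    work per decoded bit is proportional to this count."""
    return 2 ** (k - 1)


# k = 7 gives 64 states; k = 10 gives 512; k = 20 gives 524288.
# So "small" is not a fixed threshold: it is whatever state count
# your hardware or CPU budget can afford per decoded bit.
print(viterbi_states(7), viterbi_states(10), viterbi_states(20))
```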

Does a small race do extra damage when enlarged by the Enlarge/Reduce spell?

A friend and I were discussing gnome barbarians, and the fact that Small creatures have disadvantage on attack rolls when using a weapon with the heavy property. I brought up that the "Enlarge/Reduce" spell would negate this disadvantage by temporarily making the gnome's size category Medium.

However, the "Enlarge/Reduce" spell also causes creatures to deal an extra 1d4 damage with the weapons they use. If a gnome/kobold/halfling/goblin were to wield a greatsword and have a wizard enlarge them, it appears that not only would the disadvantage imposed by being a Small race be removed, but they would also deal 1d4 more damage than any normally Medium character using the weapon. This makes no sense to me, though; am I reading everything correctly?

Problems for which a small change in the statement causes a big change in time complexity

I know that there are several problems for which a small change in the problem statement would result in a big change in its (time) complexity, or even in its computability.

An example: the Hamiltonian path problem, defined as

Given a graph, determine whether a path that visits each vertex exactly once exists or not.

is NP-complete, while the Eulerian path problem, defined as

Given a graph, determine whether a trail that visits every edge exactly once exists or not.

is solvable in linear time with respect to the number of edges and nodes of the graph.
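To make the linear-time claim concrete, here is a minimal sketch (in Python; the function name is my own) of the classic Eulerian path test: an undirected graph has an Eulerian path iff all of its edges lie in one connected component and exactly 0 or 2 vertices have odd degree.

```python
from collections import defaultdict


def has_eulerian_path(edges):
    """Check whether the undirected graph given as a list of (u, v)
    edges has an Eulerian path: all edges must lie in one connected
    component and exactly 0 or 2 vertices may have odd degree.
    Runs in O(V + E) time."""
    if not edges:
        return True
    degree = defaultdict(int)
    adj = defaultdict(list)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
        adj[u].append(v)
        adj[v].append(u)
    # Connectivity check over the vertices that touch an edge
    # (iterative DFS to avoid recursion limits).
    start = edges[0][0]
    seen = {start}
    stack = [start]
    while stack:
        node = stack.pop()
        for nxt in adj[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    if len(seen) != len(degree):
        return False
    odd = sum(1 for d in degree.values() if d % 2 == 1)
    return odd in (0, 2)
```

Contrast this with the Hamiltonian version, where no such local degree condition is known and the problem is NP-complete.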

Another example is 2-SAT (solvable in polynomial time) vs. k-SAT (NP-complete for k ≥ 3), although one could argue that 2-SAT is just a special case of k-SAT.

What do you call this kind of problem, if it even has a name? Can someone provide a list of other examples or some references?

Example of a *small* non-monotone circuit such that any equivalent monotone circuit has greater size?

A "general" Boolean (combinatorial) circuit is a labeled (with the labels: AND, OR, NOT, IN, OUT), directed, acyclic graph that satisfies:

  1. fan-in=2 for the AND and OR nodes
  2. fan-in=1 for the NOT nodes
  3. fan-in=0 for the IN nodes
  4. fan-out=0 for exactly one node (the OUT node)
  5. Unbounded fan-out for all other nodes (every node but the OUT node)

A monotone circuit is a Boolean circuit with 0 vertices labeled as “NOT”.

The size of a circuit is the number of “gates” (vertices with labels “AND”, “OR” or “NOT”) it contains.

In Yuval's answer here I learned of two examples (the Tardos function and bipartite perfect matching) where it has been proven that monotone circuits require greater size than general Boolean circuits, but I cannot build intuition for this, as I don't have any concrete small-size example in hand.

Hence, my question is: could you please supply me with an example of a small (say, up to 10-20 gates) non-monotone circuit such that any equivalent monotone circuit has greater size?

Can small characters really carry that much?

Let's take a gnome as an example. Here is what the PHB (p. 37) says about its size:

Size. Gnomes are between 3 and 4 feet tall and average about 40 pounds. Your size is small.

The PHB (p. 176) also says the following about carrying capacity:

Carrying Capacity. Your carrying capacity is your Strength score multiplied by 15. This is the weight (in pounds) that you can carry, which is high enough that most characters don’t usually have to worry about it.

[…]

Size and Strength. Larger creatures can bear more weight, whereas Tiny creatures can carry less. For each size category above Medium, double the creature’s carrying capacity and the amount it can push, drag or lift. For a Tiny creature, halve these weights.

If this gnome has a Strength of 10, it can carry 10 × 15 = 150 pounds: more than triple its own weight!
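The quoted rules reduce to simple arithmetic; as a sanity check, here is a sketch (the function and size list are my own, based only on the text quoted above):

```python
def carrying_capacity(strength, size):
    """Carrying capacity per the quoted PHB rules: Strength score
    times 15 pounds, halved for Tiny creatures and doubled for each
    size category above Medium. Small and Medium use the base value."""
    sizes = ["Tiny", "Small", "Medium", "Large", "Huge", "Gargantuan"]
    base = strength * 15
    if size == "Tiny":
        return base // 2
    steps_above_medium = max(0, sizes.index(size) - sizes.index("Medium"))
    return base * (2 ** steps_above_medium)


# A Strength-10 gnome (Small) carries 150 lb, same as a Medium human.
print(carrying_capacity(10, "Small"))
```

Note that by these rules Small and Medium creatures are treated identically; only Tiny and the larger-than-Medium categories change the multiplier.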

Am I missing something, or can Small characters really carry that much?

PS: I know D&D isn’t meant to be a realistic simulation, but still.

Selecting vertices in a graph in an order to keep the cut as small as possible

I am given an undirected graph. Initially all vertices are white. I need to color them black in an order such that the maximum number of vertices lying on the border between the black and white regions is minimized. Is there an algorithm to find an optimal order?

More formally: we are given an undirected, connected graph $G = (V,E)$ with $n$ vertices. I am looking for a sequence of increasing subsets of vertices $V_0, V_1, \dots, V_n$, where $V_0 = \emptyset$, $V_{i+1} = V_i \cup \{v\}$ for some $v \in V$, and $V_n = V$. For any subset $W \subseteq V$ we define a cost function $c(W)$ as the number of "border" vertices, i.e., the size of $\{w \in W : \exists m \in V \setminus W : (w,m) \in E\}$. I am looking for a sequence for which $\max_i c(V_i)$ is minimal.

I feel my problem is somehow related to maximum-flow algorithms, in the same way that the minimum-cut problem is. I think there must be a name for this (or a similar) problem. However, trying "minimal cut" combined with various other terms in search engines, I was unable to find it.

A bit of context: I have a series of tasks to perform (edges), each of which loads two files from disk (vertices). In order to speed up the process, I don't want to reload each pair of files every time. Instead, I want to keep some files in memory so that I can reuse them when another task uses the same file. But I cannot keep all the files there because of memory constraints. The above sequence would help me select an optimal processing order, keeping the number of active files to a minimum.
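In case it helps to experiment, here is a minimal sketch (in Python, with names of my own choosing) that evaluates the objective $\max_i c(V_i)$ for a given ordering, plus a brute-force search over orderings that is only feasible for very small graphs:

```python
from itertools import permutations


def max_border_cost(order, adj):
    """Given an ordering (a sequence of vertices) and an adjacency
    dict, return max_i c(V_i), where c(W) counts the vertices in W
    that have at least one neighbour outside W."""
    colored = set()
    worst = 0
    for v in order:
        colored.add(v)
        border = sum(1 for w in colored
                     if any(m not in colored for m in adj[w]))
        worst = max(worst, border)
    return worst


def best_order(adj):
    """Brute force over all n! orderings: exponential, so this is
    only a sanity check for tiny graphs, not a real algorithm."""
    verts = list(adj)
    return min(permutations(verts),
               key=lambda p: max_border_cost(p, adj))
```

For a path graph 0-1-2-3, sweeping along the path achieves cost 1, while jumping around (e.g. coloring 0, then 2) pushes the border up to 2; the brute force recovers the sweep-like optimum.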

How small can I make a character, mostly permanently, at 18th level?

Druids can wild shape once and become diminutive for (functionally) the whole day.

Other than that, how small can a character get, either permanently or by spending a few rounds once per day, on a regular basis?

Assume that the character in question has the ability to cast both divine and arcane spells of up to 9th level.

Costly material components should be avoided, but expensive magic items are a-okay; this character ideally wants to be Fine 24/7.

For the purposes of this question, assume that Wish/Miracle won't work (i.e., the GM has ruled that such a request is highly likely to trigger the "literal but undesirable fulfillment" clause of Wish, and that the divine power behind Miracle will simply say "no").

Google says "text too small to read" when all text on the page is 14 point

I'm having no end of problems trying to get my site listed on Google. It was sitting right at the top of the rankings. Then, after I added a couple of extra sponsor logos, the mobile crawler decided the home page has "text too small to read." Every bit of text on all 63 pages of the website is 14 points or more. Google has flagged 5 pages of the site; I've checked each one and there's nothing under 14 point. In addition, it's saying 7 pages have clickable elements too close together and that 5 have content wider than the screen. I've checked each page rigorously and re-submitted with no luck; the same 5 pages keep getting singled out. The site is for a festival that takes place in 3 weeks, so solving this problem is urgent.