Algorithm for cutting rods with minimum waste

Given a set of required cuts and their lengths, we need to find the minimum number of rods (all of the same fixed length) and the assignment of cuts to rods that leads to minimum wastage.

Here we bundle rods together and cut the whole bundle at once, so a bundle can contain any number of rods.

For example:

Input data – Consider a rod of length 120 inches

(Quantity of cuts required, Length in inches) = (5, 16"), (5, 30"), (24, 36"), (4, 18"), (4, 28"), (6, 20")

So we require cuts such that we get 5 pieces of 16 inches, 5 pieces of 30 inches, and so on.


Imagine each row (in the image) is a rod of 120 inches and each table is a bundle, with the rows being the rods in that bundle. So the first table is a bundle of 5 rods with cuts [16, 30, 36, 36], the second table is a bundle of 4 rods with cuts [18, 28, 36, 36], and so on. We can see that we have satisfied the input data: we get (5, 16"), i.e. five pieces of sixteen inches, and so on.

[Image: tables showing the bundles of rods and their cuts]

Given input like the above (the number of cuts required and their lengths), how do we find the bundles of rods and their cuts with the minimum amount of wastage?
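This is an instance of the cutting-stock / bin-packing problem, and finding the exact minimum is NP-hard, so exact solvers typically use integer programming or column generation. As a quick baseline, here is a sketch of the classic first-fit-decreasing heuristic applied to the instance above; it does not guarantee minimum waste.

```python
# First-fit-decreasing heuristic for the cutting-stock instance above.
# Not guaranteed optimal -- just a simple, fast baseline.

ROD_LENGTH = 120
demand = [(5, 16), (5, 30), (24, 36), (4, 18), (4, 28), (6, 20)]  # (quantity, length)

def first_fit_decreasing(rod_length, demand):
    # Expand (quantity, length) pairs into a flat list of cuts, longest first.
    cuts = sorted((l for q, l in demand for _ in range(q)), reverse=True)
    rods = []  # each rod is a list of the cuts placed on it
    for cut in cuts:
        for rod in rods:
            if sum(rod) + cut <= rod_length:
                rod.append(cut)  # place the cut on the first rod it fits
                break
        else:
            rods.append([cut])   # no rod had room: open a new rod
    return rods

rods = first_fit_decreasing(ROD_LENGTH, demand)
waste = len(rods) * ROD_LENGTH - sum(sum(r) for r in rods)
print(len(rods), "rods, total waste:", waste)
```

Rods that received the same multiset of cuts can then be grouped into one "bundle" and cut together, as in the example.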

Minimum pumping length of (01)*

Michael Sipser offers the definition:

The pumping lemma says that every regular language has a pumping length p, such that every string in the language can be pumped if it has length p or more. If p is a pumping length for language A, so is any length p′ ≥ p. The minimum pumping length for A is the smallest p that is a pumping length for A.

Now, (01)* in set notation is {ε, 01, 0101, 010101, …}. Taking the minimum pumping length to be 1, according to the definition we have the statement: if a string in the language has length 1 or more, it can be pumped.

This statement seems true for all elements of the above set, so can the minimum pumping length be 1?

P.S. The minimum pumping length for (01)* has been asked here before, but the answer doesn't resolve my doubt: since the condition appears to hold for pumping length 1, why is 1 not the answer?
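One way to probe this is to brute-force the full pumping condition, including the often-forgotten requirement $|xy| \leq p$. The sketch below can only refute a candidate $p$ (by exhibiting an unpumpable string); when it finds no counterexample up to a bound, that is merely consistent with $p$ being a pumping length, not a proof.

```python
# Brute-force check of the pumping condition for (01)*, including |xy| <= p.

def in_lang(s):
    """Membership test for (01)*."""
    return len(s) % 2 == 0 and s == "01" * (len(s) // 2)

def pumpable(s, p, max_i=4):
    """Is there a split s = xyz with |xy| <= p, |y| >= 1 keeping x y^i z in the language?"""
    for xy_len in range(1, min(p, len(s)) + 1):
        for y_len in range(1, xy_len + 1):
            x = s[:xy_len - y_len]
            y = s[xy_len - y_len:xy_len]
            z = s[xy_len:]
            if all(in_lang(x + y * i + z) for i in range(max_i + 1)):
                return True
    return False

def counterexample(p, max_len=10):
    """First string in (01)* of length >= p (up to max_len) that cannot be pumped."""
    for k in range(max_len // 2 + 1):
        s = "01" * k
        if len(s) >= p and not pumpable(s, p):
            return s
    return None

print(counterexample(1))  # '01': with |xy| <= 1 the only choice is y = "0", and "1" is not in the language
print(counterexample(2))  # None: consistent with p = 2 being a pumping length
```

The string "01" is the crux: condition $|xy| \leq p$ rules out pumping $y = 01$ when $p = 1$.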

Why minimum vertex cover problem is in NP

I am referring to the definition of the minimum vertex cover problem from the book Approximation Algorithms by Vijay V. Vazirani (page 23):

Is the size of the minimum vertex cover in $G$ at most $k$?

and right after this definition, the author states that this problem is in NP.

My question: What would be a yes certificate?

Indeed, our non-deterministic algorithm could guess a subset of vertices, denoted by $V'$, and we can verify in polynomial time that $V'$ is a vertex cover of cardinality at most $k$, but how could we possibly show that $V'$ is minimum in polynomial time?
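The key observation is that the decision question only asks whether the minimum is at most $k$, so a "yes" certificate is just any cover of size at most $k$; minimality itself never needs to be certified. A minimal sketch of the polynomial-time verifier (the graph and vertex names are hypothetical):

```python
# Verifier for the decision problem "does G have a vertex cover of size <= k?"
# The certificate is any vertex set V'; we accept iff |V'| <= k and V' covers
# every edge. Both checks are polynomial in the input size.

def verify_vc_certificate(edges, cover, k):
    """Accept iff cover has size <= k and touches every edge."""
    return len(cover) <= k and all(u in cover or v in cover for u, v in edges)

# Hypothetical 4-cycle a-b-c-d-a: {a, c} certifies "minimum vertex cover <= 2".
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]
print(verify_vc_certificate(edges, {"a", "c"}, 2))  # True
print(verify_vc_certificate(edges, {"a"}, 2))       # False: edge (b, c) uncovered
```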

Intuition for the lower bound of $n-1$ comparisons for finding the minimum of $n$ (distinct) elements, as dealt with in CLRS

I was going through the text Introduction to Algorithms by Cormen et al., where there is a discussion of the fact that finding the minimum of a set of $n$ (distinct) elements with $n-1$ comparisons is optimal: we cannot do better, which means we need to show that any algorithm that finds the minimum of a set of $n$ elements must take $\Omega(n)$ time (in fact, must make $\Omega(n)$ comparisons).

This is what the text says to justify the lower bound.

We can obtain a lower bound of $n-1$ comparisons for the problem of determining the minimum. Think of any algorithm that determines the minimum as a tournament among the elements. Each comparison is a match in the tournament in which the smaller of the two elements wins. Observing that every element except the winner must lose at least one match, we conclude that $n-1$ comparisons are necessary to determine the minimum.

Now I could work this out in my own way as:


What I have done is a top-down comparison, but the authors, with their words "Observing that every element except the winner must lose at least one match, we conclude that $n-1$ comparisons are necessary to determine the minimum", seem to be pointing to some bottom-up counting argument which unfortunately I cannot make out.


"That every element except the winner must lose at least one match" $ \implies$ "$ n-1$ comparisons are necessary to determine the minimum".

Minimum cost of “signal” cover in a tree with DP

I’m given a (not necessarily binary) tree. Every node can carry a signal with range $i$, reaching all nodes at most $i$ edges away. The cost of a signal is determined by a function $f(n, i)$, with $n$ being a node and $i$ being the signal strength. The cost may vary from node to node; the only assumption one can make is that $f(n, i) \geq f(n, j)$ for $i > j$.

I need to find the minimum cost to cover the whole tree.



For $f(n, i) = (i + 1)^2$, the minimum cost would be 7:

Setting a signal with strength 0 on every node covers the whole tree for a cost of 7. Setting a signal with strength 1 on the nodes $b$ and $c$ covers the tree for a cost of 8, and setting a signal with strength 2 on node $a$ results in a cost of 9.
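A quick arithmetic check of the three covers just described, for $f(n, i) = (i + 1)^2$. The tree itself is not shown; a total cost of 7 with strength-0 signals implies the example tree has 7 nodes.

```python
# Costs of the three covers described above, with f(n, i) = (i + 1)^2
# (here the cost depends only on the strength i, not on the node).

f = lambda i: (i + 1) ** 2

print(7 * f(0))  # strength 0 on all 7 nodes -> 7
print(2 * f(1))  # strength 1 on b and c     -> 8
print(1 * f(2))  # strength 2 on a           -> 9
```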

Using dynamic programming, this task should be achievable in $O(n^2)$. This is an assignment, so I’d be grateful for tips.

Data structure implementation of MST (Minimum spanning tree) through Fibonacci heaps

How can a Fibonacci heap store the information needed by the algorithm? In order to achieve good efficiency, when would you run the Consolidate routine?

Algorithm :

    MST(G)
        T ← {}                       // set that will store the edges of the MST
        for i ← 1..n
            Vi ← {i}
            Ei ← {(i, j) : j is a vertex and (i, j) is an edge of G}   // all edges incident with vertex i
        end for
        while there is more than one set Vi
            choose any Vi
            extract minimum weight edge (u, v) from Ei
            // one of the endpoints u of this edge is in Vi; let Vj be the set that contains the other endpoint v
            if i ≠ j then
                T ← T ∪ {(u, v)}
                combine Vi and Vj into Vi (destroying Vj)
                combine Ei and Ej into Ei (destroying Ej)
            end if
        end while
        return T
    end MST

Would you have to add any additional fields to nodes in the heaps or use any additional data structures?
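For orientation, here is a runnable stand-in for the algorithm above. Fibonacci heaps are attractive here precisely because they meld in $O(1)$; Python's standard library has no Fibonacci heap, so this sketch melds binary heaps (`heapq`) by pushing the smaller heap into the larger, tracks the sets $V_i$ with union-find, and discards stale edges (both endpoints already in the same component) lazily on extraction.

```python
# Mergeable-heap MST sketch. One heap per component holds its incident edges
# (the E_i sets); union-find represents the V_i sets. With a real Fibonacci
# heap, the "meld" step below would be O(1) instead of a loop of pushes.

import heapq

def mst(n, edges):
    """edges: list of (weight, u, v) with vertices 0..n-1; returns MST edges (u, v, w)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    heaps = [[] for _ in range(n)]         # E_i: one heap per component
    for w, u, v in edges:
        heapq.heappush(heaps[u], (w, u, v))
        heapq.heappush(heaps[v], (w, u, v))

    T, components, i = [], n, 0
    while components > 1:
        i = find(i)
        w, u, v = heapq.heappop(heaps[i])  # extract-min from E_i
        ru, rv = find(u), find(v)
        if ru == rv:
            continue                       # stale edge: skip (lazy deletion)
        T.append((u, v, w))
        if len(heaps[ru]) < len(heaps[rv]):
            ru, rv = rv, ru
        for e in heaps[rv]:                # meld smaller heap into larger
            heapq.heappush(heaps[ru], e)
        heaps[rv] = []
        parent[rv] = ru
        components -= 1
        i = ru
    return T

edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 0, 2)]  # (weight, u, v)
T = mst(4, edges)
print(T)
```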

Minimum pumping length of concatenation of two languages

There’s this small part of my homework that I just can’t figure out.

Let us denote by $p(L)$ the minimum pumping length of a language $L$. I’m supposed to find two regular languages $A, B$ such that

$$p(AB) = p(A) + p(B)$$

But whatever I try, I can only find languages so that

$$p(AB) < p(A) + p(B)$$

I’ve been sitting at this the whole day and I’m just going in circles. Can someone please give me a hint?

Will using Convergent Future give you a critical success if the minimum number you need to hit is 20?

Convergent Future (EGtW p. 185) states:

When you or a creature you can see within 60 feet of you makes an attack roll, an ability check, or a saving throw, you can use your reaction to ignore the die roll and decide whether the number rolled is the minimum needed to succeed…

If that number is a “20”, does it meet the requirements of a critical success? (PHB p. 194)

Minimum spanning tree with small set of possible edge weights

Given an undirected graph which has only two distinct edge weights $x$ and $y$, is it possible to make a faster algorithm than Prim’s algorithm or Kruskal’s algorithm?

I saw a version of Kruskal’s algorithm that already assumes the edges can be sorted in linear time with counting sort, so can we gain any further benefit by knowing the weights of all of the edges ahead of time?

I don’t think there is any benefit, but I am not sure…
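For what it's worth, with only two distinct weights the "sort" degenerates to a two-bucket partition, so Kruskal's algorithm runs in $O(E\,\alpha(V))$ overall; the sorting step is no longer the bottleneck, though the union-find phase is unchanged. A sketch:

```python
# Kruskal's algorithm specialized to (at most) two distinct edge weights:
# partition edges into a light bucket and a heavy bucket in O(E), then run
# the usual union-find pass, light edges first.

def mst_two_weights(n, edges):
    """edges: list of (u, v, w) with w taking at most two values; returns MST weight."""
    weights = sorted({w for _, _, w in edges})   # at most two values: O(1) "sort"
    buckets = {w: [] for w in weights}
    for e in edges:
        buckets[e[2]].append(e)

    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]        # path halving
            a = parent[a]
        return a

    total = 0
    for w in weights:                            # light edges first, then heavy
        for u, v, wt in buckets[w]:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                total += wt
    return total

edges = [(0, 1, 1), (1, 2, 2), (2, 3, 1), (0, 3, 2), (0, 2, 2)]
print(mst_two_weights(4, edges))  # 4
```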

Minimum Weighted Vertex Cover, Dynamic Programming


So here is the solution in my mind right now:

Let $OPT(n, b)$ be the weight of the minimum vertex cover of the subtree rooted at vertex $n$, where the cover includes $n$ if $b$ is true and excludes it otherwise.

So my idea is:

If a vertex is not included, then for every edge leading away from it the vertex at the other end must be part of the vertex cover: $OPT(n, F) = OPT(n.left, T) + OPT(n.right, T)$

If a vertex is included, the vertex at the other end of each edge may or may not be included: $OPT(n, T) = \min\{OPT(n.left, T) + OPT(n.right, T),\; OPT(n.left, T) + OPT(n.right, F),\; OPT(n.left, F) + OPT(n.right, T),\; OPT(n.left, F) + OPT(n.right, F)\} + n.weight$ (equivalently, take the cheaper of the two options independently for each child).

Then the answer to the problem is $\min\{OPT(root, T), OPT(root, F)\}$.

Is this the right idea? Is there anything I might be missing? If so, what would be the correct approach?
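The recurrence above can be sanity-checked on a small binary tree with memoized recursion. The node layout and weights below are hypothetical; $OPT(n, F)$ forces both children in, and $OPT(n, T)$ lets each child independently be in or out.

```python
# Minimum weighted vertex cover on a binary tree, following the recurrence
# OPT(n, F) = OPT(left, T) + OPT(right, T)
# OPT(n, T) = n.weight + min over in/out for each child independently.

class Node:
    def __init__(self, weight, left=None, right=None):
        self.weight, self.left, self.right = weight, left, right

def min_weight_vc(root):
    memo = {}
    def opt(n, included):
        if n is None:
            return 0
        key = (id(n), included)
        if key not in memo:
            if included:
                memo[key] = n.weight \
                    + min(opt(n.left, True), opt(n.left, False)) \
                    + min(opt(n.right, True), opt(n.right, False))
            else:  # n excluded: both children must be in the cover
                memo[key] = opt(n.left, True) + opt(n.right, True)
        return memo[key]
    return min(opt(root, True), opt(root, False))

# Hypothetical tree (weights):      1
#                                  / \
#                                 4   3
#                                / \
#                               2   5
tree = Node(1, Node(4, Node(2), Node(5)), Node(3))
print(min_weight_vc(tree))  # 5: cover {1, 4} touches every edge
```

Note the memoization: without it, `opt(child, True)` and `opt(child, False)` are each recomputed at every level and the recursion becomes exponential.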