## Recurrence relation for the number of “references” to two mutually recursive functions

I was going through the Dynamic Programming chapter of Introduction to Algorithms (2nd edition) by Cormen et al., where I came across the following recurrence relations in the assembly-line scheduling section.

The three relations $$(1)$$, $$(2)$$, $$(3)$$ are as shown below.

$$f_{1}[j] = \begin{cases} e_1+a_{1,1} &\quad\text{if } j=1 \\ \min(f_1[j-1]+a_{1,j},\,f_2[j-1]+t_{2,j-1}+a_{1,j})&\quad\text{if } j\geq2 \end{cases}\tag 1$$

Symmetrically,

$$f_{2}[j] = \begin{cases} e_2+a_{2,1} &\quad\text{if } j=1 \\ \min(f_2[j-1]+a_{2,j},\,f_1[j-1]+t_{1,j-1}+a_{2,j})&\quad\text{if } j\geq2 \end{cases}\tag 2$$

(where $$e_i$$, $$a_{i,j}$$, and $$t_{i,j-1}$$ are constants for $$i=1,2$$ and $$j=1,2,3,\dots,n$$)

$$f^\star=\min(f_1[n]+x_1,f_2[n]+x_2)\tag 3$$

The text then derives a recurrence for the number of times $$f_i[j]$$ ($$i=1,2$$ and $$j=1,2,3,\dots,n$$) is referenced if we write mutually recursive code for $$f_1[j]$$ and $$f_2[j]$$. Let $$r_i(j)$$ denote the number of times $$f_i[j]$$ is referenced.

They say that,

From $$(3)$$,

$$r_1(n)=r_2(n)=1.\tag4$$

From $$(1)$$ and $$(2)$$,

$$r_1(j)=r_2(j)=r_1(j+1)+r_2(j+1)\tag 5$$

I could not quite understand how relations $$(4)$$ and $$(5)$$ are obtained from the three corresponding relations.

Though I could make out intuitively that there is only one place where $$f_1[n]$$ and $$f_2[n]$$ are referenced, namely in $$f^\star$$, which probably gives the relation in $$(4)$$.

But as I have not encountered such a concept before, I do not quite know how to proceed. I would be grateful if someone could guide me through the mathematical proof of the derivation as well as the intuition. I would prefer an alternative to mathematical induction, as it is a mechanical, cookbook method that gives little insight into the problem (but if there is no other way, I shall appreciate mathematical induction as well, provided the intuition is explained properly).
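To see concretely what is being counted, here is a quick instrumented version of the naive mutual recursion (the constants `e`, `a`, `t`, `x` are small sample values and a 0-indexed `j` is used; only the call counts matter):

```python
# Sample instance with n = 5 stations; the numeric values are arbitrary.
n = 5
e = [2, 4]
a = [[7, 9, 3, 4, 8], [8, 5, 6, 4, 5]]
t = [[2, 3, 1, 3], [2, 1, 2, 2]]
x = [3, 2]

refs = {}  # refs[(i, j)] = number of times f_i[j] is referenced

def f(i, j):
    refs[(i, j)] = refs.get((i, j), 0) + 1
    if j == 0:  # 0-indexed version of the j = 1 base case
        return e[i] + a[i][0]
    other = 1 - i
    return min(f(i, j - 1) + a[i][j],
               f(other, j - 1) + t[other][j - 1] + a[i][j])

# Equation (3): the only references to f_1[n] and f_2[n].
f_star = min(f(0, n - 1) + x[0], f(1, n - 1) + x[1])

# Equations (4)-(5): r_1(j) = r_2(j) = 2^(n-j), doubling as j decreases.
for j in range(n):
    print(j + 1, refs[(0, j)], refs[(1, j)])
```

Each `f(i, j)` with `j < n-1` is referenced once by `f(0, j+1)` and once by `f(1, j+1)`, which is exactly relation $$(5)$$, giving $$r_i(j) = 2^{n-j}$$ references in total.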

## trouble solving the recurrence 4T(n/2) + n

I am having trouble figuring out how to solve this recurrence problem…

$$\begin{aligned} T(n) &= 4T(n/2) + n \\ &= 4(4T(n/4) + n/2) + n = 16T(n/4) + 3n \\ &= 4^kT(n/2^k) + (2^k - 1)n \end{aligned}$$

I lose the trail here; I cannot figure out how to finish the expansion and actually find the complexity. Can anyone help? How can this be done?
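For what it's worth, a quick numeric check (not a proof) of where the expansion should land: once $$k = \log_2 n$$ the term $$4^kT(1)$$ becomes $$n^2$$, suggesting $$T(n) = \Theta(n^2)$$. Taking $$T(1) = 1$$:

```python
# Compute T(n) = 4*T(n//2) + n exactly for powers of two and compare to n^2.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n <= 1 else 4 * T(n // 2) + n

for k in range(1, 8):
    n = 2 ** k
    print(n, T(n), T(n) / n ** 2)  # ratio approaches 2
```

For powers of two the exact closed form is $$T(n) = 2n^2 - n$$, which matches the ratio the loop prints.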

## Recurrence Relations for Perfect Quad Trees (same as binary trees but with 4 children instead of 2)

I have to write and solve a recurrence relation for n(d), showing how I arrive at the formula and at its solution, and then prove my answer correct using induction. A perfect quad tree is like a perfect binary tree but with 4 children at each node rather than 2, and the leaf nodes in the deepest layer have no children. The number of nodes at precisely depth d is designated n(d). For example, the root node has depth d = 0 and is the only node at that depth, so n(0) = 1.

Does this mean it would be T(n) = 4T(n/4) + d? And then prove that?

I’m really confused and would appreciate any help or resources.
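A minimal sketch of the definition as stated (each node at depth d − 1 contributes 4 children at depth d, so n(d) = 4 · n(d − 1) with n(0) = 1, giving n(d) = 4^d):

```python
# n(d): nodes at exactly depth d of a perfect quad tree.
def n_at_depth(d):
    return 1 if d == 0 else 4 * n_at_depth(d - 1)

# Total nodes up to depth d is the geometric sum 1 + 4 + ... + 4^d.
def total_nodes(d):
    return (4 ** (d + 1) - 1) // 3

for d in range(5):
    print(d, n_at_depth(d))  # 1, 4, 16, 64, 256
```

Note this is a recurrence in the depth d, not a divide-and-conquer recurrence in a problem size n, which is why T(n) = 4T(n/4) + d doesn't quite match the question.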

## How to solve recurrence $T(n) = 5T(\frac{n}{2}) + n^2\lg^2 n$

I have tried solving it using substitution. Apparently the unrolled expression is exact for some $$n$$ (say, powers of $$2$$), and the order of the general solution can be found from this exact solution.

By substitution I got the following (not sure if it is correct):

$$T(n) = 5^kT\!\left(\frac{n}{2^k}\right) + \sum_{i = 0}^{k-1}{5^{i}\left(\frac{n}{2^{i}}\right)^{2}\lg^{2}\left(\frac{n}{2^{i}}\right)}$$

I am not sure how to proceed from this. I don’t even know if this approach is correct so far. How do I solve this recurrence?
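A numeric sanity check (not a proof): since $$n^2\lg^2 n = O(n^{\log_2 5 - \epsilon})$$ for small $$\epsilon > 0$$, the "leaves dominate" case of the Master theorem suggests $$T(n) = \Theta(n^{\log_2 5})$$. Taking $$T(1) = 1$$, the ratio below grows toward a constant, because the sum in the unrolled expression converges relative to $$5^k$$:

```python
# Compare T(n) against n^(log2 5) (~ n^2.32) for powers of two.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return 5 * T(n // 2) + n ** 2 * math.log2(n) ** 2

p = math.log2(5)
for k in (5, 10, 15, 20):
    n = 1 << k
    print(n, T(n) / n ** p)
```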

## closed form for the following recurrence relation

$$t(k,m,n) = \frac{m}{k}t(k-1,m-1,n-1)+ \frac{k-m}{k} t(k-2,m-1,n-1) + \frac{k-m}{k},$$ where $$k \geq m \geq n$$.

## Asymptotic of divide-and-conquer type recurrence with non-constant weight repartition between subproblems and lower order fudge terms

While trying to analyse the runtime of an algorithm, I arrive at a recurrence of the following type:

$$\begin{cases} T(n) = \Theta(1), & \text{for small enough } n; \\ T(n) \leq T(a_n n + h(n)) + T((1-a_n)n+h(n)) + f(n), & \text{for larger } n. \end{cases}$$ Where:

• $$a_n$$ is unknown and can depend on $$n$$, but is bounded by two constants: $$0 < c_1 \leq a_n \leq c_2 < 1$$.
• $$h(n)$$ is some “fudge term” which is negligible compared to $$n$$ (say $$O(n^\epsilon)$$ for $$0\leq \epsilon < 1$$).

If $$a_n$$ were a constant, then I could use the Akra-Bazzi method to get a result. On the other hand, if the fudge term didn't exist, then some type of recursion-tree analysis would be straightforward.

To make things a bit more concrete, here is the recurrence I want to get the asymptotic growth of:

$$\begin{cases} T(n) = 1, & \text{for } n = 1; \\ T(n) \leq T(a_n n + \sqrt n) + T((1-a_n)n+\sqrt n) + n, & \text{for } n \geq 2 \end{cases}$$ Where $$\frac{1}{4} \leq a_n \leq \frac{3}{4}$$ for all $$n\geq 1$$.

I tried various guesses on upper bounds and various transformations. Everything tells me this should work out to $$O(n\log n)$$, and I can argue informally that it does (although I might be wrong). But I can only prove $$O(n^{1+\epsilon})$$ (for any $$\epsilon>0$$), and this feels like something that some generalization of the Master theorem à la Akra-Bazzi should be able to take care of.

Any suggestions on how to tackle this type of recurrence?
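As an exploratory sketch (not a proof), one can evaluate the concrete recurrence with the split fixed adversarially at $$a_n = 1/4$$ and watch $$T(n)/(n\log_2 n)$$. One wrinkle: with $$a = 1/4$$ the map $$n \mapsto \tfrac34 n + \sqrt n$$ has a fixed point at $$n = 16$$, so the base case must sit above it for the recursion to terminate.

```python
# Evaluate the real-valued recurrence with a constant split a = 1/4.
import math

def T(n, a=0.25):
    if n <= 17:  # base case above the fixed point n = 16 of (3/4)n + sqrt(n)
        return 1.0
    return T(a * n + math.sqrt(n), a) + T((1 - a) * n + math.sqrt(n), a) + n

for k in range(8, 21, 4):
    n = 2.0 ** k
    print(int(n), T(n) / (n * math.log2(n)))  # ratio stays bounded
```

The printed ratios staying bounded is consistent with the conjectured $$O(n\log n)$$, though of course a worst-case $$a_n$$ could vary with $$n$$ rather than stay constant.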

## Show that a recurrence is $O(\phi^{\log n})$

I have a function whose time complexity is given by the following recurrence:

$$T(n) = \begin{cases} \mathcal{O}(1) & \text{for } n=0 \\ T(k)+T(k-1)+\mathcal{O}(1) & \text{for } n=2k \\ T(k)+\mathcal{O}(1) & \text{for } n=2k+1 \end{cases}$$

and I have to prove that $$T(n)\in \mathcal{O}(\phi^{\log_2 n})$$,

where $$\phi$$ is the golden ratio, $$\phi = \frac{1 + \sqrt5}{2}$$.

I think I could prove it by induction, but how would I go about it if I didn't know that $$T(n)\in \mathcal{O}(\phi^{\log_2 n})$$ in the first place?

Thanks!
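A numeric sketch (not a proof), taking every $$\mathcal{O}(1)$$ term to be exactly 1: compute $$T(n)$$ for all $$n$$ up to a limit and check that the worst-case ratio $$T(n)/\phi^{\log_2 n}$$ stays bounded by a constant. (Sampling only powers of two would be misleading here, since $$T(2^k)$$ itself grows only polylogarithmically; the worst case occurs at other $$n$$.)

```python
# Check the worst-case ratio T(n) / phi^(log2 n) over all n < 2^16.
import math
from functools import lru_cache

PHI = (1 + math.sqrt(5)) / 2

@lru_cache(maxsize=None)
def T(n):
    if n == 0:
        return 1
    if n % 2 == 0:
        k = n // 2
        return T(k) + T(k - 1) + 1
    return T(n // 2) + 1

worst = max(T(n) / PHI ** math.log2(n) for n in range(1, 1 << 16))
print(worst)  # a small constant, consistent with T(n) = O(phi^(log2 n))
```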

## Prove by induction that the recurrence form of bubble sort is $\Omega(n^2)$

The recurrence form of bubble sort is $$T(n)=T(n-1)+ n- 1$$

How can I prove by induction that this is $$\Omega(n^2)$$?

I’m stuck with $$T(n+1) \geq cn^2 + n = n(cn+1)$$
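For reference, here is one way the induction step can close (a sketch, choosing the constant $$c = \tfrac14$$ and base case $$n = 2$$; other small constants work too):

```latex
% Claim: T(n) \ge c n^2 for all n \ge 2, with c = 1/4.
\begin{align*}
T(n) &= T(n-1) + n - 1 \\
     &\ge c(n-1)^2 + n - 1 && \text{(induction hypothesis)} \\
     &= cn^2 + (1-2c)n + (c-1) \\
     &\ge cn^2 && \text{whenever } (1-2c)n \ge 1 - c,
\end{align*}
% which for c = 1/4 holds for all n \ge 2.  The base case
% T(2) = T(1) + 1 \ge 1 = c \cdot 2^2 holds provided T(1) \ge 0.
```

The key is keeping $$c$$ strictly below $$\tfrac12$$ so the linear term $$(1-2c)n$$ can absorb the constant $$c-1$$.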

## Proving the recurrence for the “number of ways to make change” problem

I’m studying the famous making change problem. (Post 1 and Post2).

Suppose you were given three coin denominations (1, 2, and 5) and a bill of amount 10, and you want to compute the number of ways you could exchange the bill into coins. In this specific example, there are 10 ways.

The solution involves ordering the coins from the largest to the smallest (5, 2, 1). We can then divide the solution into three subproblems:

(1) the number of ways to make change for amount (10 − 1) using only coin (1). Call it $$F_1(10-1)$$.

(2) the number of ways to make change for amount (10 − 2) using coins (1, 2). Call it $$F_2(10-2)$$.

(3) the number of ways to make change for amount (10 − 5) using coins (1, 2, 5). Call it $$F_3(10-5)$$.

The final recurrence would be $$F(10) = F_1(10-1) + F_2(10-2) + F_3(10-5)$$.

I understand the intuition, and dynamic programming problems usually have very straightforward proofs (cut-and-paste arguments). But with this problem, I have no idea where to start. The proof is usually by contradiction, arguing that it is not possible for $$F(n)$$ to differ from the optimal value. But I'm not sure that applies here. Any hints on how to even start this proof? Basically, why is the above recurrence correct?
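A small sketch of the identity being asked about: if $$F_i(x)$$ counts the ways to make amount $$x$$ using only the $$i$$ smallest coins (ignoring order), then splitting the solutions by the largest coin they contain gives exactly $$F(10) = F_1(10-1) + F_2(10-2) + F_3(10-5)$$, since removing one copy of the largest coin is a bijection onto the corresponding subproblem.

```python
coins = [1, 2, 5]  # sorted ascending

def ways(i, x):
    """Number of ways to make amount x from coins[:i], ignoring order."""
    if x == 0:
        return 1
    if x < 0 or i == 0:
        return 0
    # Either coin i-1 is unused, or it is used at least once.
    return ways(i - 1, x) + ways(i, x - coins[i - 1])

# Each term ways(i, 10 - c_i) counts the solutions whose largest coin is c_i.
total = sum(ways(i, 10 - coins[i - 1]) for i in range(1, len(coins) + 1))
print(ways(len(coins), 10), total)  # both print 10
```

That bijection ("every solution has a unique largest coin, and removing one copy of it leaves a solution of the subproblem") is the correctness argument; no contradiction or exchange argument is needed because this is a counting problem, not an optimization one.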

## How to solve this recurrence relation using substitution method

Can anyone explain to me how to demonstrate that,

$$T(n, d) \leq T(n-1, d) + O(d) + \frac{d}{n}\bigl(O(dn) + T(n-1, d-1)\bigr)$$

is solved by

$$T(n, d) \leq bnd!$$ (where $$b$$ is a constant)

using the substitution method?

I have done this so far, but I don't know if it is correct:

• $$T(n-1, d) \leq b(n-1)d!$$
• $$O(d) \leq bd$$
• $$\frac{d}{n}\left(O(dn) + T(n-1, d-1)\right) \leq \frac{d}{n}\left(bdn + b(n-1)(d-1)!\right)$$
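One way to finish (a sketch, writing both hidden $$O(\cdot)$$ constants as a single $$c$$ and guessing the slightly more flexible form $$T(n,d) \leq g(d)\,n$$):

```latex
% Substitute the guess T(n,d) \le g(d) n into the recurrence:
\begin{align*}
T(n,d) &\le g(d)(n-1) + cd + \frac{d}{n}\bigl(cdn + g(d-1)(n-1)\bigr) \\
       &\le g(d)(n-1) + cd + cd^2 + d\,g(d-1).
\end{align*}
% This is at most g(d) n provided g(d) \ge d\,g(d-1) + cd + cd^2.
% Taking g(d) = d\,g(d-1) + c(d + d^2) and unrolling gives
% g(d) = d!\Bigl(g(0) + c\sum_{i=1}^{d}\tfrac{i + i^2}{i!}\Bigr) \le b\,d!
% for a suitable constant b, since the sum converges; hence T(n,d) \le b\,n\,d!.
```

The point is that absorbing the additive $$cd + cd^2$$ per step into the factorial recursion for $$g(d)$$ is what makes the induction close; substituting $$bnd!$$ directly leaves those terms unaccounted for.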