## Doubts about some inequalities used in asymptotic analysis

I was going through this recent paper and had a couple of doubts about the final complexity analysis of the scheme (you don’t need to go through the whole paper). I have included three questions here, as they seem to be simple questions.

1. On page 17 (last four lines, just after equation 7), the following inequality is used (here, $$k(n) = \sqrt{\frac{n\delta(n)}{\log n}}$$ and $$\delta(n) = o(n / \log n)$$):

$$\frac{\binom{n}{a}}{\binom{n/k}{a/k}^k} \leq \left(\frac{n}{k}\right)^k$$

Could someone show me a proof of it? (A tentative sketch of what I think the argument might be is given after this list.)

2. Similarly, at the beginning of page 18, how does the following hold? (The above inequality is used to get here, via the $$(n / k)^{k / 2}$$ factor; don’t worry about the meaning of $$\mathtt{ss}$$.)

$$\mathtt{ss} \leq \sqrt{\binom{n}{\frac{n}{2}-\delta(n)}}\cdot (n / k)^{k / 2} \cdot \binom{k}{2\delta(n)} \cdot 2^{(2\delta(n)+1)n / k} \leq 2^{n / 2 + o(n)}$$

3. Also, this one might be a bit trivial: on page 17, in the inequality just above equation 7, an $$O(n)$$ term is dropped. Isn’t that relevant?
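
To make questions 1 and 2 more concrete, here is a tentative sketch of how I currently read the argument (my own guess, not necessarily the paper’s reasoning). For question 1, assuming for simplicity that $$k \mid n$$ and $$k \mid a$$, Vandermonde’s identity over $$k$$ blocks of size $$n/k$$ gives

$$\binom{n}{a} = \sum_{\substack{a_1 + \cdots + a_k = a \\ 0 \le a_i \le n/k}} \; \prod_{i=1}^{k} \binom{n/k}{a_i},$$

and since $$j \mapsto \binom{n/k}{j}$$ is log-concave, each product is at most $$\binom{n/k}{a/k}^k$$. The ratio is therefore at most the number of terms, which is at most $$(n/k+1)^{k-1}$$; this is at most $$(n/k)^k$$ whenever $$k - 1 \le (n/k)\ln(n/k)$$, but I am not sure this covers the whole range of $$\delta(n)$$, which is partly why I am asking. For question 2, the last step at least seems to follow by bounding each factor separately: $$\sqrt{\binom{n}{n/2-\delta(n)}} \le 2^{n/2}$$, $$(n/k)^{k/2} = 2^{(k/2)\log(n/k)} = 2^{o(n)}$$ since $$k \log n = o(n)$$, $$\binom{k}{2\delta(n)} \le 2^{k} = 2^{o(n)}$$, and (assuming $$\delta(n) \ge 1$$) $$(2\delta(n)+1)\,\frac{n}{k} \le 3\,\delta(n)\sqrt{\frac{n \log n}{\delta(n)}} = 3\sqrt{n\,\delta(n)\log n} = o(n)$$ because $$\delta(n)\log n = o(n)$$; multiplying the four factors gives $$2^{n/2 + o(n)}$$.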

## Proof involving asymptotic complexity

The question in “Proof of big-O propositions” asked to prove:

$$O(f(n))=O(g(n))\iff\Omega(f(n))=\Omega(g(n))\iff\Theta(f(n))=\Theta(g(n))$$

The accepted answer starts the proof with:

Suppose that $$O(f(n))=O(g(n))$$. It is easy to check that $$g(n)=O(g(n))$$ … and so $$O(f(n))=O(g(n))$$ implies that $$g(n)=O(f(n))$$.

I believe that there is a mistake in the quoted part of the answer above.

It claims that $$O(f(n))=O(g(n))\tag{1}$$

and $$g(n)=O(g(n))\tag{2}$$

implies

$$g(n)=O(f(n))\tag{3}$$

Using the interpretation given in “Introduction to Algorithms” (CLRS), 3rd edition, page 50, (1) is interpreted as: $$\forall\Phi(n)\in O(f(n)),\ \exists\Psi(n)\in O(g(n))$$ such that $$\Phi(n)=\Psi(n)$$.

Additionally, the definition of $$O$$-notation given in “Introduction to Algorithms” (CLRS), 3rd edition, page 47, states:

$$O(g(n))=\{f(n):$$ there exist positive constants $$c$$ and $$n_0$$ such that $$0 \leq f(n)\leq c\cdot g(n)$$ for all $$n\geq n_0\}$$
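
For concreteness, the step “$$g(n)=O(g(n))$$” quoted above can be checked directly from this definition (at least under the implicit assumption that $$g(n) \ge 0$$ for all sufficiently large $$n$$): taking $$c = 1$$ and such an $$n_0$$ gives $$0 \leq g(n) \leq 1 \cdot g(n)$$ for all $$n \geq n_0$$, so $$g(n) \in O(g(n))$$.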

Consider the counterexample where $$f(n)=n$$ and $$g(n)=n^2$$. Then $$O(n)=O(n^2)$$ and $$n^2=O(n^2)$$ hold, but $$n^2\neq O(n)$$, thereby contradicting the assertion given in the accepted answer. (Note: “$$\neq$$” in this case means “$$\not \in$$”.)

Therefore, I would like to ask whether the answer provided is wrong, or whether my interpretation and/or reasoning is wrong.


## Asymptotic notation and random variables

I have two random variables $$X$$ and $$Y$$ and I want to bound the value of one in terms of the other (for now, I don’t care about the actual distribution of their values).

Suppose that the two variables can have different distributions, with values chosen from $$[1, n]$$, but $$X$$ is always upper bounded by $$Y \cdot \log{n}$$. Can I write this as $$X = O(Y\log{n})$$ (if I care about the behavior for large $$n$$)? I’m not sure what the convention is regarding random variables and asymptotic notation.
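
To make my assumption explicit, the bound I have in mind is pointwise (it holds for every outcome, not merely in expectation or with high probability): for every outcome $$\omega$$ and every $$n \ge 2$$, $$X(\omega) \le Y(\omega) \cdot \log n$$, so the usual definition of $$O(\cdot)$$ is satisfied outcome-by-outcome with the constant $$c = 1$$. My question is whether $$X = O(Y \log n)$$ is the accepted way to write this down.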


## The role of asymptotic notation in $$e^x = 1 + x + \Theta(x^2)$$?

I’m reading CLRS and there is the following:

When $$x \to 0$$, the approximation of $$e^x$$ by $$1+x$$ is quite good: $$e^x = 1 + x + \Theta(x^2)$$

I suppose I understand what this equation means from a math perspective, and there is also an answer in another CS question. But I don’t understand some things, so I have a few questions.

1. Why do they use $$\Theta$$ here, and why do they use the $$=$$ sign?
2. Is it possible to explain how the notation here is related to the author’s conclusion that $$e^x$$ is very close to $$1 + x$$ when $$x \rightarrow 0$$?
3. Also, why is it important here that $$x$$ tends to $$0$$ rather than to $$\infty$$, as is usual when we use asymptotic notation?
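
For reference, here is the Taylor-series reading that I believe underlies the statement (my own unpacking, not a quote from CLRS):

$$e^x = \sum_{i=0}^{\infty}\frac{x^i}{i!} = 1 + x + \underbrace{\frac{x^2}{2} + \frac{x^3}{6} + \cdots}_{\text{error of the approximation } 1+x},$$

and for $$0 < |x| \le 1/2$$ the error term lies between $$x^2/4$$ and $$x^2$$, which (as I understand it) is exactly the kind of two-sided bound that $$\Theta(x^2)$$ asserts as $$x \to 0$$.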

I’m sorry if there are a lot of questions or if they are stupid; I’m just trying to master this topic.


## Asymptotic formula for the number of path-connected graphs

It can be shown that the set $$G_N$$ of graphs on $$N$$ labeled vertices has cardinality:

$$\begin{equation} \lvert G_N \rvert = 2^{N \choose 2} \tag{1} \end{equation}$$

Recently, I wondered how much bigger $$\lvert G_N \rvert$$ is than the number of path-connected graphs with $$N$$ vertices, $$\lvert PC_N \rvert$$.

If we denote the set of Hamiltonian paths on $$N$$ vertices by $$H_N$$, we can easily show that:

$$\begin{equation} H_N \subset PC_N \tag{2} \end{equation}$$

and by Stirling’s approximation:

$$\begin{equation} \lvert H_N \rvert = N! \sim \sqrt{2\pi N} \big(\frac{N}{e}\big)^N \tag{3} \end{equation}$$

and therefore $$\lvert PC_N \rvert$$ must grow at least exponentially fast.

Now, I’m curious about asymptotic formulas for $$\lvert PC_N \rvert$$ and I’m fairly confident that:

$$\begin{equation} \frac{\lvert PC_N \rvert}{\lvert G_N \rvert} \leq e^{-N} \tag{4} \end{equation}$$

but I suspect that a proof for this statement would be fairly subtle.
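
For what it’s worth, for small $$N$$ the ratio can be computed exactly by brute force over all $$2^{N \choose 2}$$ edge subsets. Here is a quick Python sketch I put together for that (only feasible up to roughly $$N = 6$$, and of course it says nothing about the asymptotics):

```python
from itertools import combinations

def count_connected(n):
    """Count labeled simple graphs on n vertices that are connected,
    by brute force over all 2^C(n,2) subsets of possible edges."""
    all_edges = list(combinations(range(n), 2))
    connected = 0
    for mask in range(1 << len(all_edges)):
        # Build adjacency lists for this edge subset.
        adj = {v: [] for v in range(n)}
        for i, (u, v) in enumerate(all_edges):
            if (mask >> i) & 1:
                adj[u].append(v)
                adj[v].append(u)
        # Depth-first search from vertex 0 to test connectivity.
        seen, stack = {0}, [0]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        if len(seen) == n:
            connected += 1
    return connected

if __name__ == "__main__":
    for n in range(1, 7):
        total = 2 ** (n * (n - 1) // 2)
        pc = count_connected(n)
        print(f"N = {n}: |PC_N| = {pc}, |G_N| = {total}, ratio = {pc / total:.4f}")
```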

## Find an Asymptotic Upper Bound using a Recursion Tree

The problem is this: use the recursion-tree method to give a good asymptotic upper bound on $$T(n) = 9\,T(n^{1/3}) + \Theta(1)$$. I am able to get the tree started and find a pattern in the sub-problems, but I am having difficulty finding the total cost of the running times throughout the tree. I cannot figure out the number of sub-problems at the depth $$i$$ at which the sub-problem size reaches $$n = 1$$. I have a feeling the answer is $$O(\log_3 n)$$, but I cannot verify that at the moment. Any help would be appreciated.

$$T(n) = 9T(n^{1/3}) + \Theta(1)$$ can be written as $$T(n) = 9T(n^{1/3}) + C$$, where $$C$$ is some constant, since a $$\Theta(1)$$ term is asymptotically just a constant. My recursion tree is explained level by level below.

Level 0: this is the constant $$C$$.

Level 1: $$T(n^{1/3})$$ appears 9 times, representing the sub-problems of the root. These add up to $$9cn^{1/3}$$.

Level 2: each of the 9 sub-problems from level 1 is divided into 9 more sub-problems, each written as $$T(n^{1/9})$$. All of these add up to $$81cn^{1/9}$$.

Sub-problem sizes and nodes: the number of nodes at depth $$i$$ is $$9^i$$, and the sub-problem size for a node at depth $$i$$ is $$n^{1/3^i}$$. The recursion bottoms out when this size equals $$1$$. Solving for $$i$$ yields:

$$\left(n^{1/3^i}\right)^{3^i} = 1^{3^i} \quad\Longrightarrow\quad n = 1^{3^i} = 1.$$

This results in $$n$$ being $$1$$, which doesn’t give a logarithmic form!
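
In case it helps, here is a small Python sketch I used to get a feel for the growth. It is my own concrete model of the recurrence, with the $$\Theta(1)$$ term replaced by $$1$$ and a base case at $$n \le 2$$; the comparison column is just a guess on my part:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Concrete model of T(n) = 9*T(n^(1/3)) + 1, with T(n) = 1 once n <= 2."""
    if n <= 2:
        return 1
    return 9 * T(round(n ** (1 / 3))) + 1

if __name__ == "__main__":
    # Use n = 2^e so that the sub-problem sizes stay (close to) powers of two.
    for e in (3, 9, 27, 81):
        n = 2 ** e
        lg = math.log2(n)
        print(f"n = 2^{e}: T(n) = {T(n)}, log2(n) = {lg:.0f}, log2(n)^2 = {lg * lg:.0f}")
```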


## The optimal asymptotic behavior of the coefficient in the Hardy-Littlewood maximal inequality

It is well known that for $$f \in L^1(\mathbb{R}^n)$$, $$\mu\big(\{x \in \mathbb{R}^n : Mf(x) > \lambda\}\big) \le \frac{C_n}{\lambda} \int_{\mathbb{R}^n} |f| \,\mathrm{d}\mu,$$ where $$C_n$$ is a constant that depends only on $$n$$.

It is easy to see that $$C_n \le 2^n$$, but how can one determine its optimal asymptotic behavior? For example, is $$C_n$$ bounded in $$n$$? Is $$C_n$$ bounded by a polynomial in $$n$$?

## How can one find the asymptotic form of a power series?

The power series is $$R_n(x)=\sum_{p=2}^{n+1}\frac{(p-1)\,(2n-p)!}{n!\,(n+1-p)!}\,x^p$$

The text says the following:

For large $$n$$, the asymptotic form of $$R_n(1/\beta)$$ can be obtained by looking at the values of $$p$$ which dominate the sum

• $$p$$ of order $$1$$ for $$\beta > 1/2$$,
• $$p$$ of order $$\sqrt n$$ for $$\beta = 1/2$$, and
• $$p \propto (1 - 2\beta)/(1-\beta)$$ for $$\beta < 1/2$$

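Since I could not see how to extract these regimes analytically, here is a small Python sketch (exact rational arithmetic) that simply locates the index $$p$$ of the largest term for moderate $$n$$; this is just my own way of probing which $$p$$ “dominate the sum”, not a derivation:

```python
from fractions import Fraction
from math import factorial, sqrt

def dominant_p(n, beta):
    """Return the index p (2 <= p <= n+1) of the largest term of
    R_n(1/beta) = sum_p (p-1) * (2n-p)! / (n! * (n+1-p)!) * (1/beta)^p,
    computed with exact rational arithmetic."""
    x = 1 / Fraction(beta)
    best_p, best_term = 2, None
    for p in range(2, n + 2):
        term = (p - 1) * Fraction(factorial(2 * n - p), factorial(n) * factorial(n + 1 - p)) * x ** p
        if best_term is None or term > best_term:
            best_p, best_term = p, term
    return best_p

if __name__ == "__main__":
    for n in (50, 200, 800):
        for beta in (Fraction(3, 4), Fraction(1, 2), Fraction(1, 4)):
            print(f"n = {n}, beta = {beta}: dominant p = {dominant_p(n, beta)}"
                  f"  (for reference, sqrt(n) = {sqrt(n):.1f})")
```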

## Asymptotic Bound on Minimum Epsilon Cover of Arbitrary Manifolds

Let $$M \subset \mathbb{R}^d$$ be a compact smooth $$k$$-dimensional manifold embedded in $$\mathbb{R}^d$$. Let $$\mathcal{N}(\epsilon)$$ denote the size of a minimum $$\epsilon$$-cover $$P$$ of $$M$$; that is, for every point $$x \in M$$ there exists a $$p \in P$$ such that $$\| x - p\|_{2} \le \epsilon$$.

Is it the case that $$\mathcal{N}(\epsilon) \in \Theta\left(\frac{1}{\epsilon^k}\right)$$? If so, is there a reference?
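
My intuition comes from the usual volume-comparison heuristic, which I sketch here under the assumption (which I believe holds for a compact smooth embedded $$M$$ once $$\epsilon$$ is small enough) that there are constants $$c_1, c_2 > 0$$ with $$\operatorname{vol}_k\big(M \cap B(y,\epsilon)\big) \le c_2\,\epsilon^k$$ for every center $$y \in \mathbb{R}^d$$ and $$\operatorname{vol}_k\big(M \cap B(x,\epsilon/2)\big) \ge c_1\,\epsilon^k$$ for every $$x \in M$$. Since the balls of any $$\epsilon$$-cover $$P$$ cover $$M$$,

$$\mathcal{N}(\epsilon)\, c_2\,\epsilon^{k} \ge \operatorname{vol}_k(M) \quad\Longrightarrow\quad \mathcal{N}(\epsilon) \ge \frac{\operatorname{vol}_k(M)}{c_2\,\epsilon^{k}},$$

while a maximal $$\epsilon$$-separated subset $$S \subset M$$ is automatically an $$\epsilon$$-cover and its balls $$B(x,\epsilon/2)$$, $$x \in S$$, are pairwise disjoint, so

$$|S|\, c_1\,\epsilon^{k} \le \operatorname{vol}_k(M) \quad\Longrightarrow\quad \mathcal{N}(\epsilon) \le |S| \le \frac{\operatorname{vol}_k(M)}{c_1\,\epsilon^{k}}.$$

If this is right, then $$\mathcal{N}(\epsilon) = \Theta(\epsilon^{-k})$$ with constants depending on $$M$$, but I would still like a citable reference (or a correction if the assumption fails).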

## Does there always exist an asymptotic component with large size in a minimal subshift?

Let $$(X, T)$$ be a minimal subshift, i.e. $$X$$ is a closed $$T$$-invariant subset of $$A^\mathbb{Z}$$, where $$T$$ is the shift. A pair $$x,y\in X$$ is asymptotic if $$d(T^nx, T^ny)$$ goes to zero as $$n\to\infty$$. Such a pair always exists when $$X$$ is infinite: for every $$n\geq1$$ there exist $$x^{(n)}, y^{(n)} \in X$$ such that $$x^{(n)}_0\not= y^{(n)}_0$$ and $$x^{(n)}_{[1,n]}= y^{(n)}_{[1,n]}$$ (if not, then for some $$n$$, $$x_{[1,n]} = y_{[1,n]}$$ implies $$x_0=y_0$$, i.e., $$x_{[1,n]}$$ determines $$x_0$$, and this forces $$X$$ to be periodic), and any pair of limits of convergent subsequences of $$x^{(n)}$$ and $$y^{(n)}$$ will do the trick.

My question is: do there exist $$k$$-tuples of asymptotic points for every $$k\geq1$$? More precisely, is it true that for every $$k\geq1$$ there exist pairwise distinct $$x_1,\dots,x_k\in X$$ such that $$\lim_{n\to\infty}d(T^nx_i, T^nx_j) = 0\ \forall i,j$$?