## In the lambda calculus with products and sums, is $f : [n] \to [n]$ $\beta\eta$-equivalent to $f^{n!+1}$?

$$\eta$$-reduction is often described as arising from the desire for functions that are point-wise equal to be syntactically equal. In a simply typed calculus with products this is sufficient, but when sums are involved I fail to see how to reduce point-wise equal functions to a common term.

For example, it is easy to verify that any function $$f: (1+1) \to (1+1)$$ is point-wise equal to $$\lambda x.f(f(f\,x))$$, or more generally that $$f$$ is point-wise equal to $$f^{n!+1}$$ when $$f: A \to A$$ and $$A$$ has exactly $$n$$ inhabitants. Is it possible to reduce $$f^{n!+1}$$ to $$f$$? If not, is there an extension of the simply typed calculus which allows this reduction?

## What is $f(n)$ in $NTIME(n)\subseteq DTIME(f(n))$ if $CIRCUITSAT$ is in $P$?

If $$CIRCUITSAT$$ in $$n$$ variables and $$m$$ gates has an $$O((nm)^c)$$ algorithm for a fixed $$c>0$$ then $$NTIME(n)\subseteq DTIME(O(f(n)))$$ for large enough $$f(n)$$. What is the smallest $$f(n)$$ in $$NTIME(n)\subseteq DTIME(O(f(n)))$$?

## Solving a recurrence relation where $f(n)$ has some constant factor $k$ with $0 < k < 1$

I am trying to see if a recurrence relation where $$f(n)$$ has some constant factor $$k$$, e.g. $$f(n)=kn$$ where $$0 < k < 1$$, is $$O(n)$$. I am reaching a different result depending on which route I take. Given the following recurrence relation:

$$T(n)=2T\left(\frac{n}{2}\right)+f(n)$$

$$T(n)=2T\left(\frac{n}{2}\right)+kn$$

Since $$0 < k < 1$$, we can represent $$kn=n^c$$, where $$0 < c < 1$$.

This falls under case 1 of the Master Theorem, because $$a=2, b=2$$, and therefore $$\log_b a = \log_2 2 = 1 > c$$.

It’s $$O(n)$$.

But if I try to unfold the recurrence: $$\begin{split}T(n) & = 2T\left(\frac{n}{2}\right)+kn \\ & = 4T\left(\frac{n}{4}\right)+2kn \\ & = 8T\left(\frac{n}{8}\right)+3kn \\ & = \cdots \\ & = 2^cT\left(\frac{n}{2^c}\right)+ckn \end{split}$$

When $$\frac{n}{2^c}=1$$, i.e. $$n=2^c$$, we have $$\log_2 n = c$$.

So now it’s $$T(n) = nT(1) + kn\log_2 n$$, which is $$O(n\log_2 n)$$. Now I am confused.
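To see which route gives the right growth, the recurrence can be evaluated exactly (a quick numeric sketch of my own, assuming $$n$$ is a power of two and $$T(1)=1$$): if $$T(n)$$ were $$O(n)$$, the ratio $$T(n)/n$$ would stay bounded, whereas $$T(n)/(n\log_2 n)$$ would go to zero.

```python
import math

def T(n, k=0.5):
    """Exact recurrence T(n) = 2*T(n/2) + k*n with T(1) = 1, n a power of 2."""
    if n == 1:
        return 1.0
    return 2 * T(n // 2, k) + k * n

# T(n)/n keeps growing, while T(n)/(n*log2 n) levels off near k:
for p in (4, 8, 12, 16):
    n = 2 ** p
    print(n, T(n) / n, T(n) / (n * math.log2(n)))
```

The numbers side with the unfolding: $$T(n)/n = 1 + k\log_2 n$$ is unbounded, so the recurrence is $$\Theta(n\log n)$$. The slip is in the substitution $$kn = n^c$$: for a fixed $$k$$, the exponent $$c$$ would have to depend on $$n$$, while the Master Theorem needs a fixed $$c$$; since $$kn = \Theta(n) = \Theta(n^{\log_b a})$$, it is case 2, not case 1, that applies.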

## Why is the run time of an $f(n)$ space decider bounded by $2^{O(f(n))}$?

In the proof of Savitch’s theorem from the 3rd edition of Sipser’s Intro to Theory of Computation, Sipser claims that the maximum time that an $$f(n)$$ space nondeterministic Turing machine that halts on all inputs may use on any branch of its computation is $$2^{O(f(n))}$$. However, I don’t see why such a machine couldn’t run for an arbitrary (but finite) number of steps on one of its branches. For instance, consider the following linear space machine for deciding SAT: on input $$\phi$$, rewrite the contents of the first tape cell $$2^{2^{n}}$$ times, then evaluate $$\phi$$ on every possible truth assignment. This machine runs in linear space (since it doesn’t need to visit anything beyond the second tape cell for the first part of its execution), but its run time exceeds $$2^{O(n)}$$.

Despite the similar titles, my question is not a duplicate of this one. The confusion in the linked question is about the constants that result from using an arbitrary alphabet. The author admits that they understand the $$2^{O(f(n))}$$ time bound for machines that use a binary alphabet (which is precisely what I don’t get), and therefore none of the answers address my question.
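For concreteness, the counting argument behind the claimed bound can be sketched numerically (my own illustration, not from the book): a machine confined to $$f(n)$$ tape cells has only finitely many configurations, and a halting branch can never repeat one.

```python
def max_configs(states, alphabet, space):
    """Pigeonhole bound on distinct configurations of a machine restricted to
    `space` tape cells: (state) x (head position) x (tape contents)."""
    return states * space * alphabet ** space

# A halting branch never repeats a configuration (repeating one would allow an
# infinite loop), so its length is at most this bound, which is 2^(O(space)).
# E.g. a 10-state machine over a binary alphabet in linear space:
for n in (4, 8, 16):
    print(n, max_configs(10, 2, n))

# The proposed SAT machine would have to pass through 2^(2^n) distinct
# configurations to "count" its rewrites, but linear space supplies only
# 2^(O(n)) of them -- so no such halting machine exists.
```

In other words, a counter that reaches $$2^{2^n}$$ needs roughly $$2^n$$ bits of state somewhere (tape, head position, or finite control), which a linear space machine does not have.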

## Master theorem: when $f(n)$ is smaller or larger than $n^{\log_b a}$ by less than a polynomial factor

I was revising the master theorem using https://brilliant.org/wiki/master-theorem/ and trying to solve a question.

Which of the following grows asymptotically fastest?

(a) $$T(n) = 4T(n/2) + 10n$$

(b) $$T(n) = 8T(n/3) + 24n^2$$

(c) $$T(n) = 16T(n/4) + 10n^2$$

(d) $$T(n) = 25T(n/5) + 20(n\log n)^{1.99}$$

(e) They are all asymptotically the same

My calculation says (a) is $$\Theta(n^2)$$, (b) is $$\Theta(n^2)$$, and (c) is $$\Theta(n^2\log n)$$. Now how can I evaluate (d)?

If $$f(n)$$ is smaller or larger than $$n^{\log_b a}$$ by less than a polynomial factor, how can I solve $$T(n)$$?
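To get a feel for the growth rates, the recurrences can be evaluated directly (a rough numeric sketch of my own, not a proof; `solve` is a hypothetical helper): a ratio $$T(n)/n^2$$ that stays bounded suggests $$\Theta(n^2)$$, while a growing ratio suggests strictly faster growth.

```python
import math

def solve(a, b, f, n):
    """Numerically evaluate T(n) = a*T(n/b) + f(n), with T(n) = 1 for n <= 1."""
    if n <= 1:
        return 1.0
    return a * solve(a, b, f, n / b) + f(n)

recurrences = {
    "(a)": (4, 2, lambda n: 10 * n),
    "(b)": (8, 3, lambda n: 24 * n ** 2),
    "(c)": (16, 4, lambda n: 10 * n ** 2),
    "(d)": (25, 5, lambda n: 20 * (n * math.log(n)) ** 1.99),
}

# Compare T(n) / n^2 at increasing n for each recurrence.
for name, (a, b, f) in recurrences.items():
    print(name, [solve(a, b, f, 10 ** p) / 10 ** (2 * p) for p in (3, 5, 7)])
```

For (d), note that $$(\log n)^{1.99}$$ is subpolynomial, so $$(n\log n)^{1.99} = O(n^{2-\varepsilon})$$ for any $$\varepsilon < 0.01$$, which is why case 1 can still be considered there.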

## Is $f(N)$ countable?

Consider a countable set $$N$$ which the function $$f$$ maps to a set $$M$$. I want to prove that the set $$M$$ is also countable if $$N$$ is. The thing I am unsure about is whether $$f(N)=M$$.

Here is my proof, though:

Proof. Let $$N \subset X$$ be a countable set and $$f:X \rightarrow Y$$.

Therefore, $$N\cup f(N)=\left \{ x_1, f(x_1), x_2, f(x_2),…,x_n,f(x_n) \right \}$$ which can be expressed as

$$N\cup f(N)=\left \{x_1,f(x_1) \right \}\cup\left \{x_2,f(x_2) \right \}\cdots \cup\left \{x_n,f(x_n) \right \}$$.

Theorem (*), taken from Set Theory and Metric Spaces by I. Kaplansky: A countable union of countable sets is countable.

Since we have a union of $$n$$ sets, each with two elements, $$N\cup f(N)$$ is therefore countable according to (*).

Since $$N$$ is defined as countable and $$N\cup f(N)$$ is countable, $$f(N)$$ is also countable.

The question, though, is whether $$f(N)$$ refers to "the same thing" as $$M$$. If the purpose is proving that an arbitrary set $$M$$ is countable, I am totally lost.
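The standard argument is that an enumeration of $$N$$ transports to an enumeration of the image $$f(N)$$, which can be sketched as a generator (my own illustration; whether $$f(N)=M$$ depends on whether $$f$$ is assumed surjective onto $$M$$):

```python
from itertools import count, islice

def image_enumeration(enumerate_N, f):
    """Given an enumeration x_1, x_2, ... of a countable set N, yield an
    enumeration of the image f(N) without repeats -- the usual proof that
    the image of a countable set under any function is countable."""
    seen = set()
    for x in enumerate_N:
        y = f(x)
        if y not in seen:
            seen.add(y)
            yield y

# Example: N = natural numbers, f(x) = x % 3, so f(N) = {0, 1, 2}.
print(list(islice(image_enumeration(count(), lambda x: x % 3), 3)))  # [0, 1, 2]
```

This enumerates $$f(N)$$ directly, without needing the union $$N\cup f(N)$$ at all.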

Does anyone here have any suggestions?