Improving time complexity from O(log n / loglog n) to O(log((n loglog n)/log n) / loglog((n loglog n)/log n))


Suppose I have an algorithm whose running time is $O(f(n))$, where $f(n) = O\left(\frac{\log n}{\log\log n}\right)$.

And suppose I can, with $O(1)$ extra steps, change this running time into $O\left(f\left(\frac{n}{f(n)}\right)\right)$, i.e. I can get an algorithm whose running time is $O(g(n)) = O\left(\frac{\log\frac{n}{\frac{\log n}{\log\log n}}} {\log\log\frac{n}{\frac{\log n}{\log\log n}}}\right) = O\left(\frac{\log\frac{n\log\log n}{\log n}} {\log\log\frac{n\log\log n}{\log n}}\right)$.
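For reference, the only algebra in the last step is $\frac{n}{\log n/\log\log n} = \frac{n\log\log n}{\log n}$; expanding its logarithm term by term (using nothing beyond the definitions above) gives

$$\log\frac{n\log\log n}{\log n} = \log n - \log\log n + \log\log\log n,$$

which is the numerator of $g(n)$, and the denominator of $g(n)$ is the logarithm of this same quantity.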

I’m pretty sure that $g(n) < f(n)$ for big enough $n$ (based on checking values in Wolfram Alpha), but I wasn’t able to prove it.
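A quick numerical check of the same comparison (a minimal sketch in Python rather than Wolfram Alpha; `f` and `g` below are just the definitions above, with natural logarithms, which only affects constant factors):

```python
import math

def f(n):
    # f(n) = log n / log log n  (natural logs; the base only changes constants)
    return math.log(n) / math.log(math.log(n))

def g(n):
    # g(n) = f(n / f(n)), i.e. the running time after one improvement step
    return f(n / f(n))

# Compare f and g at a few large inputs n = 2^k
for k in [10, 20, 40, 80, 160, 320]:
    n = 2.0 ** k
    print(f"n = 2^{k:<3}  f(n) = {f(n):8.3f}  g(n) = {g(n):8.3f}")
```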

My questions are:

  1. Is $g(n) < f(n)$ in fact true (for all sufficiently large $n$)?

  2. Is $g(n)$ asymptotically better than $f(n)$, i.e. is $g(n) = o(f(n))$?

  3. Assuming $g$ is asymptotically better, I can apply this step again and further improve the running time of the algorithm. That is, with one more step I can make my algorithm run in time $O\left(f\left(\frac{n}{f\left(\frac{n}{f(n)}\right)}\right)\right)$, and I can repeat this process as many times as I want (the repetition is written out as a recurrence below). How many times should the process be repeated to get the best asymptotic running time, and what will that running time be? Obviously, repeating it $O(f(n))$ times already takes $O(f(n))$ time for the repetitions alone, so that will not improve the overall complexity of the algorithm.
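To make question 3 precise, the repetition can be written as a recurrence (where $f_k$ is just shorthand, not used above, for the running time after $k$ improvement steps):

$$f_0(n) = f(n), \qquad f_{k+1}(n) = f\!\left(\frac{n}{f_k(n)}\right),$$

so $f_1 = g$, each step costs an extra $O(1)$, and the question is how small $f_k(n)$ can get before the $O(k)$ overhead of performing $k$ steps dominates.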