## Can an algorithm with $\Theta(n^2)$ run time be faster than an algorithm with $\Theta(n\log n)$ run time?

This is a question posted for extra practice (i.e., not for credit):

> Can an algorithm with $$\Theta(n^2)$$ run time be faster than an algorithm with $$\Theta(n\log n)$$ run time? Explain.

I’m not sure how to approach it. I see lots of near-answers online that use $$O(\cdot)$$, but $$\Theta(\cdot)$$ is a little different: it means the bound is tight, i.e., the run time is bounded both above and below by constant multiples of $$n^2$$ and $$n \log n$$, respectively.

So, in the absolute best case, an $$n^2$$ algorithm may run in $$\frac{1}{a}n^2$$ time, which for large $$a$$ may be very fast compared to most quadratic algorithms. However, since $$a$$ is a constant, as $$n \to \infty$$ the running time of even a $$\frac{1}{a}n^2$$ algorithm will eventually exceed that of a $$bn\log n$$ algorithm, no matter how large $$b$$ is.
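To convince myself about the constants, I sketched a small experiment (the constants $$a$$ and $$b$$ and the cost functions here are made up purely for illustration): it finds the input size at which a quadratic algorithm with a favorable constant stops beating an $$n \log n$$ algorithm with an unfavorable one. Below that crossover the quadratic algorithm is faster in absolute terms, even though asymptotically it must lose.

```python
import math

def quad_time(n, a=100.0):
    # hypothetical cost of a quadratic algorithm with a small constant: (1/a) * n^2
    return (1.0 / a) * n * n

def nlogn_time(n, b=50.0):
    # hypothetical cost of an n log n algorithm with a large constant: b * n * log2(n)
    return b * n * math.log2(n)

def crossover(a=100.0, b=50.0, limit=10**7):
    # smallest n at which the quadratic algorithm becomes slower,
    # or None if no crossover is found below `limit`
    for n in range(2, limit):
        if quad_time(n, a) > nlogn_time(n, b):
            return n
    return None

# For these constants the quadratic algorithm wins until n is fairly large.
print(crossover())
```

The crossover always exists because $$n^2/a$$ grows strictly faster than $$bn\log n$$; the constants only control *where* it happens, not *whether* it happens.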

This leads me to believe the answer is no: for sufficiently large $$n$$, an algorithm that runs in $$\Theta(n^2)$$ cannot be faster than a $$\Theta(n \log n)$$ algorithm, at least when analyzed asymptotically as I have done.

That said, suppose the expensive step that makes the $$\Theta(n \log n)$$ algorithm $$\Theta(n \log n)$$ occurs very frequently, while the expensive step that makes the $$\Theta(n^2)$$ algorithm $$\Theta(n^2)$$ occurs rarely (either due to the nature of the algorithm or deliberate design). In that case, couldn't the amortized complexity of the $$\Theta(n^2)$$ algorithm be lower than that of the $$\Theta(n \log n)$$ algorithm, even though the asymptotic complexity says otherwise?
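To make concrete what I mean by "expensive work that happens rarely," here is the textbook dynamic-array example (my own illustration, not part of the problem I was given): a single append can cost $$O(n)$$ copies when the array resizes, but the total over $$n$$ appends stays $$O(n)$$, so the amortized cost per append is $$O(1)$$ even though the worst single operation is linear.

```python
def appends_with_doubling(n):
    """Count element copies performed by n appends to a doubling array.

    A single append can copy O(current size) elements when the array
    resizes, but the total over n appends stays below 2n, so the
    amortized cost per append is O(1).
    """
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            copies += size   # resize: copy every existing element over
            capacity *= 2
        size += 1
    return copies

# Total copies stay below 2 * n even though individual resizes are expensive.
print(appends_with_doubling(1000))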

If the two complexities given were $$O(\cdot)$$ rather than $$\Theta(\cdot)$$, then I might be inclined to side with the latter argument. The lower bound that $$\Theta(\cdot)$$ implies is giving me second thoughts, though. Can anyone point me in the right direction and clarify whether I am misunderstanding the use of $$\Theta(\cdot)$$ in asymptotic versus amortized analysis?

Edit: I don’t think this is a duplicate of Sorting functions by asymptotic growth. I am familiar with using limits, etc., to classify and order algorithms asymptotically, and speaking strictly in terms of asymptotics, I think the answer to my question is definitely no. As worded, though, the quoted question seems more general: it asks whether there is *any* way a $$\Theta(n^2)$$ algorithm can outperform a $$\Theta(n \log n)$$ algorithm. Is such an occurrence possible when considering amortized analysis rather than asymptotic analysis, with the algorithms structured as described above? Or is this irrelevant and the answer is still no?