Difficulty understanding the use of an arbitrary function for the worst-case running time of an algorithm


In CLRS, the authors write:

"Technically, it is an abuse to say that the running time of insertion sort is $ O(n^2)$ , since for a given $ n$ , the actual running time varies, depending on the particular input of size $ n$ . When we say “the running time is $ O(n^2)$ ,” we mean that there is a function $ f(n)$ that is $ O(n^2)$ such that for any value of $ n$ , no matter what particular input of size $ n$ is chosen, the running time on that input is bounded from above by the value $ f(n)$ . Equivalently, we mean that the worst-case running time is $ O(n^2)$ . "

What I have difficulty understanding is why the authors talk about an arbitrary function $f(n)$ instead of $n^2$ directly.
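
If I write out what I think the book is saying, using $T_I(n)$ for the running time on a particular input $I$ of size $n$ (my own notation, not the book's), it is:

$$\text{there exists } f(n) = O(n^2) \text{ such that for every } n \text{ and every input } I \text{ of size } n,\quad T_I(n) \le f(n).$$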

I mean, why didn't the authors write:

"When we say “the running time is $ O(n^2)$ ,” we mean that for any value of $ n$ , no matter what particular input of size $ n$ is chosen, the running time on that input is bounded from above by the value $ cn^2$ for some +ve $ c$ and sufficiently large n. Equivalently, we mean that the worst-case running time is $ O(n^2)$ ".

I have a very limited understanding of this subject, so please forgive me if my question is too basic.