Complexity of numerical differentiation for general nonlinear functions


In the classical optimization literature, numerical differentiation is often described as a computationally expensive step. For example, quasi-Newton methods are presented as a way to avoid computing first and/or second derivatives when these are "too expensive" to compute.

What are the state-of-the-art approaches to computing derivatives, and what is their time complexity? If this is heavily problem-dependent, I am particularly interested in first- and second-order derivatives for nonlinear least-squares problems, and especially in the first-order derivatives (the Jacobian of the residual vector).
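
To make the question concrete, here is a minimal sketch of what I mean by numerically computing the Jacobian, assuming a small dense problem and plain forward differences (the function names are illustrative, not from any particular library). Done this way, each Jacobian costs n + 1 evaluations of the residual vector, which is the kind of expense I am asking about:

```python
import numpy as np

def residuals(x):
    # Toy residual vector r(x) for an illustrative nonlinear least-squares
    # problem (m = 2 residuals, n = 2 parameters); any r: R^n -> R^m would do.
    return np.array([x[0]**2 + x[1] - 1.0,
                     np.sin(x[0]) - x[1]])

def forward_difference_jacobian(r, x, eps=1e-8):
    """Approximate the m-by-n Jacobian of r at x with forward differences."""
    r0 = r(x)                   # 1 residual evaluation
    m, n = r0.size, x.size
    J = np.empty((m, n))
    for j in range(n):          # n further evaluations, one per parameter
        x_step = x.copy()
        x_step[j] += eps
        J[:, j] = (r(x_step) - r0) / eps
    return J

x0 = np.array([0.5, 0.5])
print(forward_difference_jacobian(residuals, x0))
```

With this naive scheme the cost is n + 1 residual evaluations per Jacobian (2n for central differences), and I would like to understand how the current best approaches compare to that.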