Convergence rate of Newton's method

I'm currently studying Newton's method for finding the zeros of a function. My professor mentioned that the convergence of this algorithm can be more than quadratic, but I'm wondering when that happens, since the derivation he used for the quadratic case gives no hint about the "more than quadratic" behaviour.

Using Taylor's theorem with the Lagrange remainder, and using that $ f(\xi)=0$ at the root $ \xi$ :
$ f(\xi)=f(x_n)+f'(x_n)(\xi-x_n)+\frac{f''(z_n)}{2}(\xi-x_n)^2$

$ -\frac{f(x_n)}{f'(x_n)}=\xi-x_n+\frac{f''(z_n)}{2f'(x_n)}(\xi-x_n)^2$

$ x_{n+1}-x_n=\xi-x_n+\frac{f''(z_n)}{2f'(x_n)}(\xi-x_n)^2$

$ e_{n+1}=\left|x_{n+1}-\xi\right|=c_ne_n^2 \quad \text{with} \quad c_n=\frac{1}{2}\frac{\left|f''(z_n)\right|}{\left|f'(x_n)\right|}$

Can someone please tell me (with an example) when the order of convergence is cubic, and why?
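
(For reference, here is a quick numerical check of the error relation above; the function $ f(x)=x^3-2$ and the starting point $ x_0=1.5$ are arbitrary choices of mine, so treat this as a sketch. The printed ratio $ e_{n+1}/e_n^2$ should settle near $ \tfrac{1}{2}\left|f''(\xi)\right|/\left|f'(\xi)\right|$ , i.e. the constant $ c_n$ from the derivation.)

    # Numerical check of e_{n+1} ~ c_n * e_n^2 for Newton's method.
    # Test function f(x) = x^3 - 2 and start x0 = 1.5 are arbitrary choices.
    def f(x):
        return x**3 - 2.0

    def fp(x):
        return 3.0 * x**2

    root = 2.0 ** (1.0 / 3.0)
    x = 1.5
    for n in range(6):
        e_n = abs(x - root)
        x = x - f(x) / fp(x)              # Newton step
        e_next = abs(x - root)
        if e_n > 0.0 and e_next > 0.0:
            # ratio should approach |f''(root)| / (2 |f'(root)|)
            print(n, e_next, e_next / e_n**2)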

Is Newton’s algorithm really this much better than conjugate gradient descent?

I have a function I’m minimizing. I’m using conjugate gradient descent and the Newton algorithm.

I am finding that the Newton algorithm is absurdly faster. It finishes in 5-6 iterations, while conjugate gradient takes 2000 iterations (and regular gradient descent takes 5000 iterations).

I know what’s causing the problem too: the learning rate. In Newton’s method, a learning rate of $ \alpha = 1$ works. But I can’t use this in the gradient descent methods, where such a choice of $ \alpha$ diverges. In these methods, I am forced to use $ \alpha = 0.01$ .

But anyway, 5 iterations vs. thousands of iterations is still an absurd difference. Is Newton's method really this good?
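
(Not the actual objective here, but a minimal sketch on a badly conditioned quadratic of my own choosing shows the same gap: plain gradient descent is stuck with a step size dictated by the largest curvature, while the Newton step rescales by the inverse Hessian, so $ \alpha = 1$ is safe.)

    import numpy as np

    # Toy comparison: f(x) = 0.5 * x^T A x with curvatures 1 and 100.
    # This is a made-up example, not the objective from the question.
    A = np.diag([1.0, 100.0])

    def grad(x):
        return A @ x

    # Gradient descent: alpha must stay below ~2/100 here or it diverges.
    x = np.array([1.0, 1.0])
    alpha = 0.01
    k = 0
    while np.linalg.norm(grad(x)) > 1e-8 and k < 100000:
        x = x - alpha * grad(x)
        k += 1
    print("gradient descent iterations:", k)

    # Newton's method: the step is -H^{-1} grad, so alpha = 1 works.
    x = np.array([1.0, 1.0])
    k = 0
    while np.linalg.norm(grad(x)) > 1e-8 and k < 100:
        x = x - np.linalg.solve(A, grad(x))
        k += 1
    print("Newton iterations:", k)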

Polynomial – using Newton’s method, or not?

I have a problem that is expressed as a polynomial of degree 2 to n. To solve it, I can either use the general Newton's method for all degrees, or use it only for degree 5 and higher and use the algebraic formulas (quadratic, cubic and quartic) for degrees 2, 3 and 4.

Since the algebraic formulas need sqrt, acos and similar functions, does it make sense to use them at all, or is it better to use Newton's method for all degrees? I guess Newton's method will actually be faster?

UPDATE

This is my polynomial:

$$ \sum_{i=0}^n f^i \binom{\{b_1,\ldots,b_n\}}{i}(-k+n-i), \quad n \geq 2 $$

And I need to write code to calculate $ f$ for different values of $ n$ .
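
(I could not expand the sum above into explicit coefficients, so the following is only a generic sketch of the "Newton's method for all degrees" option, assuming the polynomial has already been written out as a plain coefficient list, highest degree first; the function names are my own.)

    # Generic Newton iteration for a polynomial given by its coefficients
    # (highest degree first). Assumes the polynomial from the update has been
    # expanded into plain coefficients, which the question does not spell out.
    def poly_and_deriv(coeffs, x):
        """Evaluate p(x) and p'(x) together with Horner's rule."""
        p, dp = 0.0, 0.0
        for c in coeffs:
            dp = dp * x + p
            p = p * x + c
        return p, dp

    def newton_root(coeffs, x0, tol=1e-12, max_iter=100):
        x = x0
        for _ in range(max_iter):
            p, dp = poly_and_deriv(coeffs, x)
            if dp == 0.0:
                break                      # flat spot: give up or perturb x
            step = p / dp
            x -= step
            if abs(step) < tol:
                break
        return x

    # Example: x^2 - 2 has a root at sqrt(2)
    print(newton_root([1.0, 0.0, -2.0], x0=1.0))

Horner's rule evaluates $ p$ and $ p'$ in a single pass, so each Newton step costs $ O(n)$ operations regardless of the degree.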

Proving correctness of Newton's method for finding the square root of a number

I’m trying to prove the correctness of this simple square root calculation algorithm using SPARK:

    Y := X / 2.0;
    while abs (X - Y ** 2) > Tol * X loop
       Y := 0.5 * (Y + X / Y);
    end loop;
    return Y;

The preconditions are that both X and Tol are greater than zero, and the postcondition is simply the negation of the while loop's condition.

Are there any invariants of the loop above that may help? Or would a different algorithm (e.g. bisection) be a better choice? So far I've tried showing that the value of the square root is always between Y and X / Y, but that didn't get me anywhere.
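
(One candidate invariant, not a proof and not necessarily in the form SPARK wants: by the AM-GM inequality, $ 0.5\,(Y + X/Y) \ge \sqrt{X}$ for any $ Y > 0$ , so $ Y \ge \sqrt{X}$ holds after every update, and from the second update on $ Y$ is non-increasing. A quick numeric probe of that claim, in Python rather than SPARK:)

    import math

    # Exploratory check (not a proof) of a candidate invariant for the loop above:
    # by AM-GM, 0.5 * (Y + X / Y) >= sqrt(X) for any Y > 0, so Y >= sqrt(X) holds
    # after every update, and from the second update on Y is non-increasing.
    def probe(X, Tol):
        Y = X / 2.0
        prev = None
        while abs(X - Y * Y) > Tol * X:
            Y = 0.5 * (Y + X / Y)
            assert Y >= math.sqrt(X) * (1.0 - 1e-12)      # candidate invariant
            if prev is not None:
                assert Y <= prev * (1.0 + 1e-12)          # monotone once above sqrt(X)
            prev = Y
        return Y

    for X in (0.5, 2.0, 10.0, 12345.678):
        print(X, probe(X, 1e-9))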

Newton's law of cooling applied to a spherical region

I’m having trouble solving the following problem:

Formulate a mathematical model for a stationary (steady) temperature distribution inside the spherical volume

$ R^2\leq x^2+y^2+z^2\leq (2R)^2$ ,

where $ R$ is a given constant. The region is homogeneous, and the boundary $ x^2+y^2+z^2=R^2$ is held at the constant temperature $ T=T_0$ . Newton's law of cooling describes the temperature at the other boundary $ x^2+y^2+z^2=(2R)^2$ (the normal component of the heat flux is proportional to the difference between the boundary temperature and the temperature $ T_1$ of the region outside).

I found this formula for the heat equation in a spherically symmetric region:

$ \frac{\partial T}{\partial t}=\alpha \frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2\frac{\partial T}{\partial r}\right), \quad 0<r<r_0$

But this is for a sphere with no inner boundary, so I am lost as to how to apply this formula to my case. The model itself seems right, though; if I understand correctly it comes from the Laplace equation written in spherical coordinates, but correct me if I am wrong. Or is it Newton's law in spherical coordinates?
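
(In case it is useful, my own reading is that "stationary" means $ \partial T/\partial t=0$ , so the radial heat equation above reduces to an ODE in $ r$ . This is only a sketch of that reduction, assuming constant material properties, with $ k$ and $ h$ as my own names for the conductivity and the heat-transfer coefficient, which the problem statement does not name.)

$$ \frac{1}{r^2}\frac{d}{dr}\!\left(r^2\frac{dT}{dr}\right)=0, \quad R<r<2R \quad\Longrightarrow\quad T(r)=A+\frac{B}{r}, $$

with the two constants $ A$ and $ B$ fixed by $ T(R)=T_0$ and the Newton-cooling condition $ -k\,T'(2R)=h\,\bigl(T(2R)-T_1\bigr)$ at the outer boundary.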

Best regards