High-Order Iterative Methods for Repeated Roots: A Constructive Recapitulation

This paper considers practical, high-order methods for the iterative location of the roots of nonlinear equations, one at a time. Special attention is paid to algorithms also applicable to multiple roots of initially known and unknown multiplicity. Efficient methods are presented in this note for the evaluation of the multiplicity index of the root being sought. Also reviewed here are super-linear and super-cubic methods that converge contrarily or alternatingly, enabling us not only to approach the root briskly and confidently, but also to actually bound and bracket it as we progress.


Introduction
This note is mainly concerned with the numerical approximation of function roots by various iterative methods [1]-[8]. A multiplicity greater than one of a function root may greatly impede the convergence of the method used to iteratively locate it. The multiplicity index m of the sought root is often unknown beforehand. It may well be a latent property of the root, not cursorily revealed, nor readily available. In this note, we examine and assess computational procedures to evaluate the multiplicity index m of the iteratively approached root. We further constructively review the iterative methods [9] [10] [11] for approaching a root of a known or unknown multiplicity. Then we suggest several other useful, alternatingly converging and contrarily converging iterative methods to bound the sought root.

The Taylor Representation of an Osculating Function
Approximation by polynomials is fundamental to numerical analysis. Taylor's theorem plays a crucial role in it.
The monomial power function f(x) = x^n (1) is ever smaller near x = 0 as n becomes ever bigger. It is actually such that f(0) = f'(0) = ... = f^(n-1)(0) = 0, f^(n)(0) ≠ 0. (2) Such a function is said to be osculating of degree n.
In its most concise form, Taylor's theorem generalizes this claim to any differentiable function. It asserts that if Equation (2) holds true for any other differentiable function f(x), then this function is expressible as f(x) = (1/n!) f^(n)(ξ) x^n, 0 < ξ < x, (3) for x to the right of zero. Equation (3) is said to be the Taylor form of osculating function f(x) of Equation (2), and power n is said to be the degree of osculation of f(x). We take in Equation (3) f(x) = Ax^n + Bx^(n+1), A ≠ 0, (4) and obtain from it the asymptotic form of the intermediate value ξ, namely ξ = x/(n + 1), nearly, for small x. See also [12] [13].

Repeated Taylor Expansion Examples
Example one. We start with the Taylor form e^x = 1 + x + (1/2)x^2 + (1/6)x^3 e^ξ, 0 < ξ < x. (12) In view of Equation (11), ξ = x/4, nearly, if x is small, (13) and hence e^ξ = 1 + (1/4)x + (1/32)x^2, nearly. (14) We assemble Equations (12), (13) and (14) to gain the better polynomial approximation, becoming, with ξ of Equation (15), e^x = 1 + x + (1/2)x^2 + (1/6)x^3 + (1/24)x^4 + (1/192)x^5 + (1/320)x^5 e^ξ, nearly, 0 < ξ < x. (16) Example two. We write a trial approximation with free parameters c_0, c_1, and fix these parameters so as to have an osculating remainder, or residual; the Taylor form of this osculating residual then follows as in Equation (3). Example three. We start likewise, and fix parameters c_0 and c_1 so as to have both osculation conditions satisfied.
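As a numerical check of Example one, the following sketch (illustrative Python, not part of the paper) compares the assembled polynomial against the plain degree-four Taylor polynomial of e^x at x = 0.1:

```python
import math

x = 0.1

# Plain Taylor polynomial of degree 4 for e^x
t4 = 1 + x + x**2/2 + x**3/6 + x**4/24

# Assembled polynomial of Example one: the degree-3 remainder (x^3/6)e^xi,
# with xi = x/4 nearly, contributes the extra terms x^4/24 + x^5/192
p5 = 1 + x + x**2/2 + x**3/6 + x**4/24 + x**5/192

err_t4 = abs(math.exp(x) - t4)
err_p5 = abs(math.exp(x) - p5)

print(err_t4, err_p5)  # the assembled polynomial is markedly closer
print(x**5/320)        # the residual is (1/320)x^5, nearly
```

The leftover error of the assembled polynomial is indeed close to the predicted (1/320)x^5.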

Estimation of the Root Multiplicity Index
Assuming that the power series expansion of function f(x), whose root a we are seeking to iteratively locate, is of the form f(x) = A(x - a)^m + B(x - a)^(m+1) + ..., A ≠ 0, we obtain from it the first-order estimate for the root multiplicity index m, m = f'^2/(f'^2 - f f''), nearly, for x near a. We have also that, whatever k, the multiplicity of root a of derivative f^(k) is m - k, so that analogous estimates may be formed from higher derivatives. By the Padé rational approximation we have the corresponding first-order estimate as well. We notice that, whatever m, root a of u = f/f' is isolated, namely always of multiplicity one.
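A minimal numerical sketch of the first-order estimate m ≈ f'^2/(f'^2 - f f'') (illustrative Python; the polynomial with a triple root is our own choice, not the paper's example):

```python
# f(x) = (x - 1)^3 (x + 2) = x^4 - x^3 - 3x^2 + 5x - 2: triple root at a = 1
f  = lambda x: x**4 - x**3 - 3*x**2 + 5*x - 2
f1 = lambda x: 4*x**3 - 3*x**2 - 6*x + 5    # f'
f2 = lambda x: 12*x**2 - 6*x - 6            # f''

x = 1.01  # a point near the triple root
m_est = f1(x)**2 / (f1(x)**2 - f(x)*f2(x))
print(m_est)  # close to the true multiplicity m = 3
```

The estimate is first order in the distance to the root: halving x - a roughly halves the error of m_est.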

Second Order Iterative Methods
Our prevailing task is, at first, to stepwise and steadily approach isolated root a. From the linearization f(x) = f(x_0) + (x - x_0)f'(x_0), nearly, we readily obtain the Newton method x_1 = x_0 - f(x_0)/f'(x_0), a quadratic iterative method to an isolated root.
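To see numerically why multiplicity matters, the following sketch (illustrative Python; the two trial functions are our own, not the paper's) runs Newton's method on a simple root and on a triple root; at the triple root the error is cut only by the factor 1 - 1/m = 2/3 per step:

```python
def newton(f, f1, x, steps):
    # Plain Newton iteration x <- x - f(x)/f'(x), a fixed number of steps
    for _ in range(steps):
        x = x - f(x) / f1(x)
    return x

# Simple root: f(x) = x^2 - 2, root sqrt(2); convergence is quadratic
x_simple = newton(lambda x: x*x - 2, lambda x: 2*x, 1.0, 6)

# Triple root: f(x) = x^3, root 0; each step only multiplies the error by 2/3
x_triple = newton(lambda x: x**3, lambda x: 3*x*x, 1.0, 6)

print(abs(x_simple - 2**0.5))  # at machine precision after 6 steps
print(x_triple)                # (2/3)^6 = 0.0878..., still far from the root
```

Six quadratic steps suffice at the simple root, while at the triple root the same six steps barely gain one decimal digit.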

Rectification for a Multiple Root by Undetermined Coefficients
We recast Newton's method in the open-ended form x_1 = x_0 - P f(x_0)/f'(x_0), with the undetermined coefficient P, and have by Equation (32) that, at a root of multiplicity m, quadratic convergence is restored to the method, as is well known, with P = m. Here A and B are the coefficients in the power series expansion of f(x) in Equation (32). But, for this, we need to know in advance the multiplicity index m of root a.
To have a quadratic method not needing the a-priori knowledge of the multiplicity index m of the searched root, we take the m approximation of Equation (33) and obtain the, quadratic for any m, method x_1 = x_0 - f(x_0)f'(x_0)/(f'^2(x_0) - f(x_0)f''(x_0)). (48) Method (48) is also obtained by applying Newton's method to u = f/f'. See Equation (42).
The advantage of method (48) is that it is quadratic for any multiplicity index m; its drawback is that it requires also the second derivative f''. For f(x) = Ax^m + Bx^(m+1) + ... we verify by power series expansion that method (48) converges quadratically to root a = 0 of any multiplicity index m.
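A sketch (illustrative Python; the trial function (e^x - 1)^3, with root a = 0 of multiplicity 3, is our own choice) comparing plain Newton with the modified method (48), x_1 = x_0 - f f'/(f'^2 - f f''):

```python
import math

f  = lambda x: (math.exp(x) - 1)**3                      # root 0, multiplicity 3
f1 = lambda x: 3*(math.exp(x) - 1)**2 * math.exp(x)      # f'
f2 = lambda x: (6*(math.exp(x) - 1)*math.exp(x)**2
                + 3*(math.exp(x) - 1)**2 * math.exp(x))  # f''

def count_steps(step, x, tol=1e-12, cap=200):
    # Iterate until |x - 0| <= tol, counting the steps taken
    n = 0
    while abs(x) > tol and n < cap:
        x = step(x)
        n += 1
    return n

newton_steps = count_steps(lambda x: x - f(x)/f1(x), 0.5)
m48_steps = count_steps(lambda x: x - f(x)*f1(x)/(f1(x)**2 - f(x)*f2(x)), 0.5)
print(newton_steps, m48_steps)  # method (48) needs far fewer steps
```

Newton creeps in linearly with ratio 2/3, while method (48) regains quadratic speed at the price of evaluating f''.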

Two-Step Method
To avoid the computation of f'' in the modified Newton method (48), we suggest a two-step method. To observe its working we apply the method to a trial function having a root of known multiplicity.
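The paper's two-step formula is garbled in this copy of the text; purely as an illustration of the idea of avoiding f'', here is a sketch of one possible two-step scheme (our own assumption, not necessarily the author's): take a Newton step, estimate m from the ratio of the two successive Newton corrections, d_1/d_0 ≈ 1 - 1/m, and finish with a rectified Newton step using the estimated m:

```python
import math

# Trial function (our choice): f(x) = (e^x - 1)^3, root a = 0 of multiplicity 3
f  = lambda x: (math.exp(x) - 1)**3
f1 = lambda x: 3*(math.exp(x) - 1)**2 * math.exp(x)

x, n = 0.5, 0
while abs(f(x)) > 1e-30 and n < 50:
    d0 = f(x) / f1(x)            # first Newton correction
    y = x - d0                   # plain Newton step
    d1 = f(y) / f1(y)            # second Newton correction
    m_est = 1.0 / (1.0 - d1/d0)  # since d1/d0 ~ 1 - 1/m near the root
    x = y - m_est * d1           # rectified step with the estimated m
    n += 1
print(x, m_est, n)               # x near 0, m_est near 3, in a handful of steps
```

Only f and f' are evaluated, yet the multiplicity estimate settles quickly and the iteration accelerates accordingly.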

Third Order Iterative Methods-and Higher
We propose to replace now ξ = x_0 in Newton's method by the better midpoint value ξ = x_0 - f(x_0)/(2f'(x_0)), to have the mid-point method x_1 = x_0 - f(x_0)/f'(x_0 - f(x_0)/(2f'(x_0))), which now rises in order to become cubic. Linearization of the inner f' reproduces out of Equation (56) the classical method of Halley, x_1 = x_0 - 2f f'/(2f'^2 - f f''), which is likewise cubic, and further variants which are still cubic provided that their coefficients satisfy the corresponding power series conditions. Moreover, we generalize the method in Equation (63) and recover a one-sided cubic convergence method to a root, even of multiplicity m.
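A sketch (illustrative Python; the trial function x^3 - 2 is our own) of the two cubic methods just named, the mid-point method x_1 = x_0 - f(x_0)/f'(x_0 - f(x_0)/(2f'(x_0))) and Halley's method x_1 = x_0 - 2ff'/(2f'^2 - ff''), on a simple root:

```python
# Simple root: f(x) = x^3 - 2, root 2^(1/3)
f  = lambda x: x**3 - 2
f1 = lambda x: 3*x**2
f2 = lambda x: 6*x

root = 2 ** (1/3)

def midpoint_step(x):
    # Newton with f' evaluated at the midpoint x - f/(2f')
    return x - f(x) / f1(x - f(x) / (2*f1(x)))

def halley_step(x):
    return x - 2*f(x)*f1(x) / (2*f1(x)**2 - f(x)*f2(x))

x = y = 1.5
for _ in range(4):
    x, y = midpoint_step(x), halley_step(y)
print(abs(x - root), abs(y - root))  # both essentially at machine precision
```

Both iterations triple the number of correct digits per step near the root, as befits cubic methods.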

Correction of Halley's Method for Multiple Roots
We rewrite Halley's method of Equation (59) with the open coefficients P and Q as x_1 = x_0 - f f'/(P f'^2 - Q f f''), and determine by power series expansion that cubic convergence to a root of multiplicity m is restored if P = (m + 1)/(2m) and Q = 1/2. (See also [11])
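Numerically, the rectified Halley iteration may be sketched as follows (illustrative Python; the values P = (m + 1)/(2m), Q = 1/2 are the classical multiple-root rectification of Halley's method, stated here on the assumption that they agree with the paper's determination, and the trial function is our own):

```python
import math

m = 3  # trial function (e^x - 1)^3 (our choice): root a = 0 of multiplicity 3
f  = lambda x: (math.exp(x) - 1)**3
f1 = lambda x: 3*(math.exp(x) - 1)**2 * math.exp(x)
f2 = lambda x: (6*(math.exp(x) - 1)*math.exp(x)**2
                + 3*(math.exp(x) - 1)**2 * math.exp(x))

P, Q = (m + 1) / (2*m), 0.5  # rectified coefficients for multiplicity m
x, errors = 0.5, []
for _ in range(4):
    if f(x) == 0.0:
        break  # already at the root to machine precision
    x = x - f(x)*f1(x) / (P*f1(x)**2 - Q*f(x)*f2(x))
    errors.append(abs(x))
print(errors)  # errors fall at a cubic rate toward the multiple root
```

With plain Halley (P = 1, Q = 1/2) the same triple root would be approached only linearly.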

A Third-Order, One-Sided, Chord Method
Having secured x_0 and x_1, we take the root of the chord interpolant, g(x) = 0, as the next x_2, and have three variants of the method, all cubic. See also Traub [1], page 180, Equations (8-55).
Convergence of this method is one-sided, namely, if x_0 - a > 0, then also x_2 - a > 0, so that the root is approached consistently from one side.

Quartic and Quintic Methods
Once x_2 is computed by Equation (73), just one additional computation of f(x) makes possible the pseudo-Newton method, which is quartic. From a parabola passing through the available data we obtain a still better method, which we verify to be quintic.

Fourth-Order, Parabolic Interpolant Method
We can do still better with a parabolic interpolation that includes the already evaluated f'_0. We pass parabola q(x) through the data points (x_0, f_0) and (x_1, f_1), matching also the slope f'_0, take the nearest root of q(x) = 0 as x_2, and have, by expansion in powers of r, the ultimate quartic method (84). See in particular Traub [1], page 184, Equations (8-78).

From a Cubic to a Quartic Method
We write the second order, n = 2, version of the Taylor-Lagrange formula and take from it an increment equation for the Newton correction. We approximate the solution of the increment equation as a combination Pf + Qf^2, and so on.
The methods are both cubic (the first is a Chebyshev method), provided that the coefficients P and Q are properly fixed by the power series conditions. Since here n = 2, we propose to try for ξ the intermediate value of Equation (11).
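The Chebyshev method named above, x_1 = x_0 - f/f' - f^2 f''/(2f'^3), sketched in Python on a simple root (an illustration of ours, not the paper's numerical example):

```python
# Simple root: f(x) = x^3 - 2, root 2^(1/3)
f  = lambda x: x**3 - 2
f1 = lambda x: 3*x**2
f2 = lambda x: 6*x

root = 2 ** (1/3)
x = 1.5
for _ in range(4):
    # Chebyshev step: Newton correction plus a second-order curvature term
    x = x - f(x)/f1(x) - f(x)**2 * f2(x) / (2*f1(x)**3)
print(abs(x - root))  # at machine precision after four steps
```

The added curvature term f^2 f''/(2f'^3) is what lifts the order from two to three.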

Oppositely Converging Super-Quadratic Methods
We start here with a modified iteration carrying a free parameter k, and we ask that the leading term of the error retain the sign of k. This super-quadratic method converges from above if k > 0, and from below if k < 0.

The interest in the method is that it ultimately converges oppositely to Newton's method, so that the two methods, run jointly, bracket the sought root.

Alternatingly Converging Super-Linear and Super-Cubic Methods
We start by modifying Newton's method into a one-parameter family of iterations, valid for any k, for which the power series expansion of the error indicates that, invariably, the method converges, at least asymptotically, alternatingly, and root a = 0 is bracketed by two successive iterates of magnitudes 2.6706 × 10^-5 and 2.1406 × 10^-4. For a higher-order method we start with the originally quartic method.

Higher-Order Methods, Multiple Root
We reconsider the typical third-order method in Equation (61), but with the arbitrary coefficients P and Q. To have a cubic method that does not require an a-priori knowledge of the root multiplicity, we replace f by u = f/f'. Still higher order methods are readily thus generated. See also [14] [15] [16] and [17].

Conclusions
We have demonstrated in this paper the usefulness of a number of practical, high-order methods for the iterative location of the roots of single-variable nonlinear equations. We have presented here algorithms to estimate the multiplicity of a considered root. Special attention was given here to algorithms thriftily applicable to multiple roots of initially known and unknown multiplicity.
We have further demonstrated here the advantage of super-linear and super-cubic methods that converge contrarily or alternatingly, enabling us not only to confidently approach the root, but also to actually bound and bracket it as we progress.