Bounds for Polynomial’s Roots from Fiedler and Sparse Companion Matrices for Submultiplicative Matrix Norms

We use submultiplicative companion matrix norms to provide new bounds for the roots of a given polynomial P(X) over ℂ[X]. From an n × n Fiedler companion matrix C, sparse companion matrices and triangular Hessenberg matrices are introduced. Then we identify a special triangular Hessenberg matrix L_r, expected to provide a good estimate of the roots. By applying Gershgorin's theorems to this special matrix in the case of submultiplicative matrix norms, bounds for the roots are estimated. The obtained bounds are compared to known ones from the literature, precisely Cauchy's bounds, Montel's bounds and Carmichael-Mason's bounds. According to the defining formula of L_r, we see that the more coefficients close to zero with modulus less than 1 the polynomial has, the more useful the Sparse method is.


Introduction
Approximating the roots of polynomials has been the subject of several studies in mathematics and physics, especially in stability studies of dynamical systems and automatic control. There are various techniques for approximating the roots of a polynomial [1]. Some algorithms for determining the roots of a polynomial [2] rely on a good first approximation using the coefficients of the polynomial [3].
Other methods for finding the roots of a polynomial P(X) rely on companion matrices. Our method of providing a bound for the polynomial's roots consists in identifying a special form of Hessenberg matrix that could give an improved bound value in the case of submultiplicative matrix norms. In the third part, through illustrative examples, we compare the classical bounds to those obtained with the Sparse method.
Indeed, the Sparse method is well suited to estimating bounds: it makes it possible to obtain improved bounds for both large and small coefficient values, for polynomials of degree n = 5.

Bounds for Roots of Polynomials Based on Fiedler Companion Matrices

Generalities on Fiedler Companion Matrices
Let A be the companion matrix associated with P(X).
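As a concrete illustration (the polynomial below is a hypothetical example, not one from the paper), the Frobenius companion matrix of a monic P(X) can be built explicitly and its eigenvalues checked against the known roots of P:

```python
import numpy as np

def frobenius_companion(coeffs):
    """Frobenius companion matrix of the monic polynomial
    X^n + coeffs[0]*X^(n-1) + ... + coeffs[-1]."""
    n = len(coeffs)
    A = np.zeros((n, n))
    A[0, :] = -np.array(coeffs)    # first row carries -a_{n-1}, ..., -a_0
    A[1:, :-1] = np.eye(n - 1)     # subdiagonal of ones
    return A

# Hypothetical example: P(X) = X^3 - 6X^2 + 11X - 6 = (X-1)(X-2)(X-3)
A = frobenius_companion([-6.0, 11.0, -6.0])
eigs = np.sort(np.linalg.eigvals(A).real)
print(eigs)  # close to [1, 2, 3], the roots of P
```

The eigenvalues of A coincide with the roots of P, which is exactly the property the bounding method exploits.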

Principle of Providing Bounds for Polynomial Roots
The method of providing bounds for the roots of a polynomial P(X) consists in applying the following Gershgorin theorem to Fiedler companion matrices: for any λ ∈ ℂ, root of P(X), we determine nonnegative numbers U(P), depending on the coefficients of P(X), such that |λ| ≤ U(P) [7]. We formally state Gershgorin's theorems and their application to companion matrices, in order to introduce the 1-norm and the ∞-norm.
Theorem 1 (Gershgorin's theorem, the 1-norm and the ∞-norm) [6]. Let A = (a_ij) be an n-by-n complex matrix and λ ∈ ℂ an eigenvalue of A.

For each column j, let C_j = Σ_{i≠j} |a_ij| be the sum of the moduli of the non-diagonal elements of column j. All the eigenvalues λ of A are located in the union of the disks |z − a_jj| ≤ C_j, j = 1, …, n. For each eigenvalue λ we therefore have |λ| ≤ max_j (|a_jj| + C_j), which is exactly the 1-norm ‖A‖₁. In the same way, working with rows instead of columns, we obtain the ∞-norm ‖A‖∞.
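A minimal sketch of this equivalence, using the standard statement of Gershgorin's theorem and a hypothetical companion matrix: the column version of the bound coincides with ‖A‖₁ and dominates every eigenvalue.

```python
import numpy as np

def gershgorin_column_bound(A):
    """max over columns j of |a_jj| + C_j (off-diagonal column sums);
    for any matrix this equals the 1-norm of A."""
    n = A.shape[0]
    return max(abs(A[j, j]) + sum(abs(A[i, j]) for i in range(n) if i != j)
               for j in range(n))

# Hypothetical companion matrix of P(X) = X^3 - 6X^2 + 11X - 6
A = np.array([[6.0, -11.0, 6.0],
              [1.0,   0.0, 0.0],
              [0.0,   1.0, 0.0]])
bound = gershgorin_column_bound(A)
assert np.isclose(bound, np.linalg.norm(A, 1))        # same as the 1-norm
assert all(abs(l) <= bound for l in np.linalg.eigvals(A))  # bounds every root
```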

Classical Bounds for Complex Polynomials Based on Gershgorin's Theorems
The inequalities of Theorem 1 have been used to obtain the following classical bounds for a monic polynomial P(X) = X^n + a_{n−1}X^{n−1} + … + a_1X + a_0: Cauchy's bound |λ| ≤ 1 + max_i |a_i|, Montel's bound |λ| ≤ max(1, Σ_i |a_i|), and the Carmichael-Mason bound |λ| ≤ √(1 + Σ_i |a_i|²).
Since the value of the bound for the roots of a polynomial depends on the selected companion matrix, let us introduce types of companion matrices that can hopefully improve the bound's value.
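For reference, the three classical bounds compared later in the paper can be sketched as follows for a monic polynomial with coefficient list [a_{n−1}, …, a_0]; the example coefficients are hypothetical.

```python
import math

def cauchy_bound(coeffs):
    # Cauchy: |root| <= 1 + max |a_i|
    return 1 + max(abs(a) for a in coeffs)

def montel_bound(coeffs):
    # Montel: |root| <= max(1, sum |a_i|)
    return max(1.0, sum(abs(a) for a in coeffs))

def carmichael_mason_bound(coeffs):
    # Carmichael-Mason: |root| <= sqrt(1 + sum |a_i|^2)
    return math.sqrt(1 + sum(abs(a) ** 2 for a in coeffs))

# Hypothetical P(X) = X^3 - 6X^2 + 11X - 6, coeffs = [a_2, a_1, a_0]
coeffs = [-6.0, 11.0, -6.0]
print(cauchy_bound(coeffs))            # 12.0
print(montel_bound(coeffs))            # 23.0
print(carmichael_mason_bound(coeffs))  # sqrt(194) ~ 13.93
```

The true roots here are 1, 2, 3, so all three classical bounds are valid but far from sharp for this polynomial.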

Sparse Companion Matrices
With the Fiedler method developed in Section 2, bounds for roots of polynomials were determined by using the Fiedler companion matrices, built as products of the block diagonal matrices introduced by Fiedler. Here we will take advantage of particular matrices called Sparse matrices, in their special form, precisely the Hessenberg form of the Fiedler companion matrices.

Generalities on Sparse Companion Matrices
We have seen that the Frobenius matrix is itself a Fiedler matrix. Sparse companion matrices have recently been characterized, and they include the Fiedler matrices as a special case.
Definition 2: the Frobenius companion matrix has exactly n − 1 of its constant entries set to 1; the remaining (n − 1)² constant entries are zero. As noted in [1], while a companion matrix must have at least 2n − 1 nonzero entries, there are companion matrices that have more than 2n − 1 nonzero entries. As such, we will say that a companion matrix is Sparse if it has exactly 2n − 1 nonzero entries. Sparse companion matrices have also been called intercyclic companion matrices because of an associated digraph structure, as noted in [1].
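The count 2n − 1 can be checked directly on the Frobenius companion matrix, here for a hypothetical degree n = 5 with all coefficients nonzero:

```python
import numpy as np

def frobenius_companion(coeffs):
    """Frobenius companion matrix of X^n + coeffs[0]*X^(n-1) + ... + coeffs[-1]."""
    n = len(coeffs)
    A = np.zeros((n, n))
    A[0, :] = -np.array(coeffs)    # n coefficient entries
    A[1:, :-1] = np.eye(n - 1)     # n - 1 constant entries equal to 1
    return A

n = 5
A = frobenius_companion([1.0] * n)  # hypothetical degree-5 coefficients
nonzeros = np.count_nonzero(A)
assert nonzeros == 2 * n - 1        # sparse: exactly 2n - 1 nonzero entries
```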
This form of matrix will be used to determine new bounds for the roots of polynomials.

Improved Bounds for Roots of Polynomials Using Fiedler Companion Matrices in Hessenberg Form in the Case of Submultiplicative Matrix Norms
Due to the particular form of the unit lower Hessenberg matrix, which lends itself well to computation, it is worth looking for a matrix in this form that provides good bounds. Thus, Vander Meulen establishes in [2] that there is a particular Fiedler matrix which provides the best upper bound over all Fiedler matrices for the ∞-norm. This is obtained by using the following Hessenberg matrix L_b: L_b is a Fiedler companion matrix in Hessenberg form [1]. This matrix is expected to provide a good upper bound on the eigenvalues for the ∞-norm.
Theorem 4. Let P be a polynomial, F_b a Fiedler companion matrix of P in Hessenberg form with the coefficient a_b appearing in row n for some b, and L_b a Fiedler companion matrix in Hessenberg form.
Let r ∈ ℝ be such that (12) holds. Then, for every submultiplicative matrix norm, we have (13). Proof: Equations (12) and (13) are respectively from Theorems 4.1 and 4.2 in [1].
There are no other coefficients of P in the same row as a_g in L_b, but there may be more coefficients in the same row as a_g in F_b. Consequently we have |λ| ≤ ‖C‖ for every eigenvalue λ of matrix C [1]. Since the eigenvalues of a companion matrix C(P) are the same as the roots of P, this method can be used to find bounds for the roots of P.
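The final inequality |λ| ≤ ‖C‖ holds for any submultiplicative matrix norm, since such norms dominate the spectral radius; a quick check with a hypothetical companion matrix and the 1-, ∞- and Frobenius norms:

```python
import numpy as np

# Companion matrix of a hypothetical P(X) = X^3 - 6X^2 + 11X - 6
A = np.array([[6.0, -11.0, 6.0],
              [1.0,   0.0, 0.0],
              [0.0,   1.0, 0.0]])

rho = max(abs(np.linalg.eigvals(A)))     # spectral radius = max |root| of P
for ord_ in (1, np.inf, 'fro'):
    # each submultiplicative norm gives a valid bound on all roots
    assert rho <= np.linalg.norm(A, ord_)
print(rho)  # largest root modulus, here 3
```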


Through illustrative examples in the case of polynomials of degree n = 5, we can try to see how sharp this theorem is.

Determination of Bounds for Roots of Polynomials for n = 5
In this section we make a particular study of polynomials of degree n = 5, using Theorem 4 about sparse companion matrices. The results will be compared to the bounds from Cauchy's, Montel's and Carmichael-Mason's theorems [3].

Examples of Polynomials for n = 5
Example 2: The roots and their bounds for Example 2 are illustrated in Table 2, Table 3 and Figure 1, and in Table 4, Table 5 and Figure 2 [6]. The Sparse method gives the bound |λ| ≤ 1.0027462291. Table 5 lists the roots of the given polynomial P computed with Scilab.
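Since the example's coefficients are not reproduced here, the kind of computation behind these tables can be sketched with a hypothetical degree-5 polynomial with small coefficients, where a norm bound close to 1 is expected:

```python
import numpy as np

# Hypothetical degree-5 polynomial with small coefficients:
# P(X) = X^5 + 0.001*X + 0.001, coeffs = [a_4, a_3, a_2, a_1, a_0]
coeffs = [0.0, 0.0, 0.0, 0.001, 0.001]
n = len(coeffs)
C = np.zeros((n, n))
C[0, :] = -np.array(coeffs)
C[1:, :-1] = np.eye(n - 1)

roots = np.linalg.eigvals(C)     # roots of P = eigenvalues of C
bound = np.linalg.norm(C, 1)     # a submultiplicative-norm bound on |root|
print(bound)                     # 1.001, close to 1 as expected
assert all(abs(r) <= bound for r in roots)
```

With nearly all coefficients zero and the rest of modulus less than 1, the norm-based bound stays close to 1, which is the regime where the Sparse method performs best.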

Results
In this section, we discuss the Sparse method of providing improved bounds for the roots of polynomials. We compare the bounds of polynomial roots obtained by the Sparse method to those found in the literature. According to the defining formula given in (12), we saw that the more coefficients close to zero with modulus less than 1 the polynomial has, the more useful the Sparse method is. In the cases studied above, no single classical method could give the sharpest bounds for both large and small values of the given polynomial's coefficients: those methods are appropriate for large values or for small values of the coefficients, but not for both. By using the Sparse matrices, the obtained bounds are each time among the sharpest ones, for both large and small coefficient values of the given polynomials.

Conclusions
Explicit expressions of bounds for the roots of polynomials are obtained by using Sparse Fiedler companion matrices for the ∞-norm ‖·‖∞. Classical bounds obtained by using Gershgorin's theorem (Cauchy's bounds, Montel's bounds, Carmichael-Mason's bounds) and those obtained by using Sparse companion matrices are presented and compared. By using special Sparse matrices, the obtained bounds are each time among the sharpest ones, for both large and small coefficient values of the given polynomials, thanks to a good estimation provided by the unit lower Hessenberg matrices. Next steps should include polynomials of degrees n > 5.