A “Hard to Die” Series Expansion and Lucas Polynomials of the Second Kind

We show how to use the Lucas polynomials of the second kind in the solution of a homogeneous linear differential system with constant coefficients, avoiding the Jordan canonical form for the relevant matrix.


Introduction
It is well known that an analytic function f of an r × r matrix A, i.e. f(A), is the matrix polynomial obtained from the scalar polynomial interpolating the function f on the eigenvalues of A (see e.g. the book by Gantmacher [1]). However, in many books (see e.g. [2]) the series expansion

    exp(A) = Σ_{n=0}^∞ A^n / n!    (1.1)

is assumed for defining (and computing) exp(A). So, apparently, the series expansion for the exponential of a matrix is “hard to die”. Let

    f(A) = P_{r−1}(A),    (1.2)

where P_{r−1} denotes the interpolating polynomial of degree at most r − 1. If the eigenvalues are all distinct, P_{r−1} coincides with the Lagrange interpolation polynomial and (1.2) is the Lagrange–Sylvester formula. In the case of multiple eigenvalues, P_{r−1} is the Hermite interpolation polynomial, and (1.2) reduces to Arthur Buchheim's formula, which generalizes the preceding one. This avoids the use of higher powers of A in the Taylor expansion (1.1). In any case, the possibility of writing A^h (h ≥ 2) in an easy block form requires not only the knowledge of the spectrum, but even the Jordan canonical form of A: it is necessary to compute the eigenvectors and, if A is defective, the principal vectors as well, a known machinery which implies a lot of computations.
In the following, we show how to proceed in connection with the matrix exp(tA), which appears in the solution of the vectorial Cauchy problem

    z'(t) = A z(t),    z(0) = z_0.


We propose an alternative method, based on recursion, using the functions F_{k,n}, which are essentially linked to the Lucas polynomials of the second kind (i.e. the basic solutions of a homogeneous linear recurrence relation with constant coefficients [3] [4]), and to the multi-variable Chebyshev polynomials [5].
Another Taylor series will be used, involving only functions of the invariants of A, and not, explicitly, all the powers of A.
It is worth recalling that the knowledge of the eigenvalues is equivalent to that of the invariants, since the latter are the elementary symmetric functions of the former (with alternating signs).
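This equivalence can be checked numerically. The following sketch (the 3 × 3 matrix is an arbitrary illustrative choice) recovers the invariants u_1 = tr A, u_2 = sum of principal 2 × 2 minors, u_3 = det A from traces and the determinant, and compares them with the elementary symmetric functions of the eigenvalues:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 2.0]])   # arbitrary example matrix
lam = np.linalg.eigvals(A)

u1 = np.trace(A)                      # invariant u1 = trace
u2 = (u1**2 - np.trace(A @ A)) / 2    # u2 via Newton's identity
u3 = np.linalg.det(A)                 # invariant u3 = determinant

e1 = lam.sum()                                           # elementary symmetric e1
e2 = lam[0]*lam[1] + lam[0]*lam[2] + lam[1]*lam[2]       # elementary symmetric e2
e3 = lam.prod()                                          # elementary symmetric e3
print(np.allclose([u1, u2, u3], np.real([e1, e2, e3])))  # True
```

The real parts are taken only to discard rounding-level imaginary residues when complex-conjugate eigenvalue pairs occur.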
To the best of our knowledge, this is the first time that these polynomials are used to solve this kind of differential problem; furthermore, our method has the advantage of avoiding the computation of higher powers of the matrix A. The solution of the considered problem is given in terms of the F_{k,n} functions of the invariants of A. These functions are polynomials, independent of the Jordan canonical form of A, and can be computed recursively, avoiding the knowledge of eigenvectors and principal vectors. Moreover, if the matrix is real, the F_{k,n} functions are real as well, and possible complex eigenvalues do not affect the form of the solution.

Homogeneous Linear Differential Systems with Constant Coefficients
It is well known that a higher-order differential system can always be reduced to a first-order system; hence we limit ourselves to considering systems of this type.
For simplicity, we start with the 2 × 2 system

    x'(t) = a x(t) + b y(t),
    y'(t) = c x(t) + d y(t),    (2.1)

assuming bc ≠ 0 (otherwise the system is trivial), and put by definition

    u = tr A = a + d,    v = det A = ad − bc.

In the following, we consider the Cauchy problem with the initial conditions

    x(0) = x_0,    y(0) = y_0.    (2.2)

Looking at the first equation in (2.1), we note that, since the right-hand side is (real or) complex analytic, the solution is (real or) complex analytic as well. Differentiating side by side, we find

    x''(t) = u x'(t) − v x(t),

and, iterating the procedure, in general we find the same recursion satisfied by the powers of the matrix A [6], i.e.

    x^(n)(t) = F_{1,n−1}(u, v) x'(t) + F_{2,n−1}(u, v) x(t),
where the coefficients F_{k,n}(u, v), n ≥ −1, are essentially the Lucas polynomials of the second kind (see [3]-[5] [7]), defined by the recurrence relation

    F_{1,n}(u, v) = u F_{1,n−1}(u, v) − v F_{1,n−2}(u, v),

and the initial conditions

    F_{1,−1}(u, v) = 0,    F_{1,0}(u, v) = 1.

It is easily shown that the second function satisfies F_{2,n}(u, v) = −v F_{1,n−1}(u, v). Using the Taylor expansion, the solution of the Cauchy problem (2.1)-(2.2) can be found in the form

    x(t) = x_0 + Σ_{n=1}^∞ (t^n / n!) [F_{1,n−1}(u, v) x'(0) + F_{2,n−1}(u, v) x_0],    x'(0) = a x_0 + b y_0.

The above result can be put in vectorial form, in order to be generalized.
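A short numeric sketch (the matrix and exponent are arbitrary example choices) of the reduction A^n = F_{1,n−1}(u, v) A + F_{2,n−1}(u, v) I, with u = tr A, v = det A, F_{1,n} = u F_{1,n−1} − v F_{1,n−2}, F_{1,−1} = 0, F_{1,0} = 1 and F_{2,n} = −v F_{1,n−1}:

```python
import numpy as np

def F1(n, u, v):
    """Lucas polynomials of the second kind at (u, v):
    F_{1,n} = u F_{1,n-1} - v F_{1,n-2}, with F_{1,-1} = 0, F_{1,0} = 1."""
    if n == -1:
        return 0.0
    prev, curr = 0.0, 1.0              # F_{1,-1}, F_{1,0}
    for _ in range(n):
        prev, curr = curr, u * curr - v * prev
    return curr

A = np.array([[1.0, 2.0],
              [3.0, -1.0]])            # arbitrary example matrix
u, v = np.trace(A), np.linalg.det(A)   # invariants of A

n = 7
# A^n = F_{1,n-1} A + F_{2,n-1} I, with F_{2,n-1} = -v F_{1,n-2}
An = F1(n - 1, u, v) * A - v * F1(n - 2, u, v) * np.eye(2)
print(np.allclose(An, np.linalg.matrix_power(A, n)))   # True
```

Only scalar recursions are needed: no power of A beyond the first ever enters the computation.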
Introduce the matrix

    F_{n−1}(u, v) = F_{1,n−1}(u, v) A + F_{2,n−1}(u, v) I    (with F_{−1} = I);

then the solution in vectorial form reads

    z(t) = Σ_{n=0}^∞ (t^n / n!) F_{n−1}(u, v) z_0,    z = (x, y)^T.

Note that the convergence of the vectorial series on any compact set K of the (u, v) space is guaranteed, since the components of F_{n−1} are polynomials of weight not exceeding n − 1, and consequently are bounded on K.
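The vectorial series can be checked numerically; in the sketch below (matrix, initial data, time and truncation order are all illustrative choices) each term uses A^n = F_{1,n−1} A − v F_{1,n−2} I, and the truncated sum is compared with the eigen-decomposition solution:

```python
import numpy as np
from math import factorial

A = np.array([[1.0, 2.0],
              [3.0, -1.0]])                 # arbitrary example matrix
z0 = np.array([1.0, -2.0])                  # arbitrary initial data
u, v = np.trace(A), np.linalg.det(A)        # invariants of A
t = 0.5

# F1[k] stores F_{1,k-1}: F1[0] = F_{1,-1} = 0, F1[1] = F_{1,0} = 1
N = 40
F1 = [0.0, 1.0]
for _ in range(N):
    F1.append(u * F1[-1] - v * F1[-2])

# z(t) = sum_n t^n/n! A^n z0, with A^n = F_{1,n-1} A - v F_{1,n-2} I for n >= 1
I = np.eye(2)
z = np.zeros(2)
for n in range(N):
    An = I if n == 0 else F1[n] * A - v * F1[n - 1] * I
    z += t**n / factorial(n) * (An @ z0)

# reference solution via eigen-decomposition (the eigenvalues are distinct here)
lam, V = np.linalg.eig(A)
z_ref = np.real(V @ np.diag(np.exp(lam * t)) @ np.linalg.inv(V) @ z0)
print(np.allclose(z, z_ref))   # True
```

Note that the series route never diagonalizes A: only the two scalar invariants enter the recursion.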
In the following section, we will extend this solution to the general vectorial case.

Remark 2.1 Note that Equation (9) does not use all the powers of the matrix A, but only the F_{k,n} functions of its invariants, which are simply computed by recursion, a worthwhile improvement of the computing cost.

The 3D Case
Now, we consider also the case of a 3 × 3 system

    x'(t) = a_{1,1} x(t) + a_{1,2} y(t) + a_{1,3} z(t),
    y'(t) = a_{2,1} x(t) + a_{2,2} y(t) + a_{2,3} z(t),    (3.1)
    z'(t) = a_{3,1} x(t) + a_{3,2} y(t) + a_{3,3} z(t),

with coefficient matrix A = (a_{h,k}), assuming that the off-diagonal coefficients do not all vanish (so that the system cannot be reduced to a 2 × 2 or trivial system), and put by definition

    u_1 = tr A,    u_2 = sum of the principal 2 × 2 minors of A,    u_3 = det A.

We consider the Cauchy problem with the initial conditions

    x(0) = x_0,    y(0) = y_0,    z(0) = z_0.    (3.2)

By using the same technique as in the 2 × 2 case, and by iterating the procedure, in general we find the same recursion satisfied by the powers of the matrix A [6], i.e.

    x^(n)(t) = F_{1,n−1}(u_1, u_2, u_3) x''(t) + F_{2,n−1}(u_1, u_2, u_3) x'(t) + F_{3,n−1}(u_1, u_2, u_3) x(t),
where the coefficients F_{k,n}(u_1, u_2, u_3), n ≥ −1, are the generalized Lucas polynomials [4], defined by the recurrence relation

    F_{k,n} = u_1 F_{k,n−1} − u_2 F_{k,n−2} + u_3 F_{k,n−3},

and the initial conditions

    F_{1,−1} = 0,    F_{1,0} = 0,    F_{1,1} = 1.

The second and third functions can be expressed in terms of the first one:

    F_{2,n} = −u_2 F_{1,n−1} + u_3 F_{1,n−2},    F_{3,n} = u_3 F_{1,n−1}.

The above result can be put in the following vectorial form. Introduce the matrix

    F_{n−1}(u_1, u_2, u_3) = F_{1,n−1} A^2 + F_{2,n−1} A + F_{3,n−1} I    (with F_{−1} = I);

then the solution of the Cauchy problem (3.1)-(3.2) reads

    w(t) = Σ_{n=0}^∞ (t^n / n!) F_{n−1}(u_1, u_2, u_3) w_0,    w = (x, y, z)^T.
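The 3 × 3 reduction A^n = F_{1,n−1} A^2 + F_{2,n−1} A + F_{3,n−1} I can be checked with a short sketch (matrix and power are arbitrary example choices); the coefficient triple is propagated by one multiplication by A per step, reducing A^3 via the Cayley–Hamilton identity A^3 = u_1 A^2 − u_2 A + u_3 I:

```python
import numpy as np

def F_triple(n, u1, u2, u3):
    """Return (F_{1,n-1}, F_{2,n-1}, F_{3,n-1}) such that
    A^n = F_{1,n-1} A^2 + F_{2,n-1} A + F_{3,n-1} I."""
    f1, f2, f3 = 0.0, 0.0, 1.0         # n = 0: A^0 = I
    for _ in range(n):
        # multiply by A and reduce A^3 = u1 A^2 - u2 A + u3 I
        f1, f2, f3 = u1 * f1 + f2, -u2 * f1 + f3, u3 * f1
    return f1, f2, f3

A = np.array([[1.0, 2.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])        # arbitrary example matrix
u1 = np.trace(A)
u2 = (u1**2 - np.trace(A @ A)) / 2     # sum of principal 2x2 minors
u3 = np.linalg.det(A)

n = 6
f1, f2, f3 = F_triple(n, u1, u2, u3)
An = f1 * (A @ A) + f2 * A + f3 * np.eye(3)
print(np.allclose(An, np.linalg.matrix_power(A, n)))   # True
```

Each component of the triple individually satisfies the three-term recursion F_{k,n} = u_1 F_{k,n−1} − u_2 F_{k,n−2} + u_3 F_{k,n−3}.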

The General Result
In the general case of an r × r matrix A with invariants u_1, …, u_r (the coefficients of the characteristic polynomial, taken with alternating signs), the Cayley–Hamilton theorem gives

    A^r = u_1 A^{r−1} − u_2 A^{r−2} + ⋯ + (−1)^{r−1} u_r I,

so that every power A^n can be written as

    A^n = Σ_{k=1}^r F_{k,n−1}(u_1, …, u_r) A^{r−k},

where the F_{k,n} are the generalized Lucas polynomials in r variables, satisfying the same pattern of recurrence relations and initial conditions as in the 2 × 2 and 3 × 3 cases.

Remark 3.1
Even in this case, the considerations of Remark 2.1 still hold, showing a more convenient way of computing the solutions of the Cauchy problem (3.1)-(3.2), with respect to traditional methods, as reported e.g. in [2].

Theorem 4.1
Consider the Cauchy problem for a homogeneous linear differential system

    y'(t) = A y(t),    y(0) = y_0,

where A is an r × r matrix; then the derivatives y^(n)(t) of the solution satisfy the same recursion (2.5).
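As a closing numeric sketch (an illustration under the conventions used above, with randomly generated example data), the power-reduction coefficients for a general r × r matrix can be propagated by the same one-step update as in the 2D and 3D cases: multiply by A, then reduce A^r via the Cayley–Hamilton identity A^r = u_1 A^{r−1} − u_2 A^{r−2} + ⋯ + (−1)^{r−1} u_r I:

```python
import numpy as np

def power_coeffs(n, invariants):
    """Coefficients (F_{1,n-1}, ..., F_{r,n-1}) such that
    A^n = F_{1,n-1} A^{r-1} + ... + F_{r,n-1} I."""
    r = len(invariants)
    f = np.zeros(r)
    f[-1] = 1.0                              # n = 0: A^0 = I
    signs = np.array([(-1.0)**k for k in range(r)])   # +u1, -u2, +u3, ...
    u = np.array(invariants)
    for _ in range(n):
        top = f[0]                           # coefficient of A^{r-1}
        f = np.roll(f, -1)                   # multiply by A: shift degrees up
        f[-1] = 0.0
        f += signs * u * top                 # reduce A^r via Cayley-Hamilton
    return f

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))              # random example matrix, r = 4
cp = np.poly(A)                              # characteristic polynomial [1, -u1, u2, -u3, u4]
invariants = [(-1.0)**(k + 1) * cp[k + 1] for k in range(4)]

n = 5
f = power_coeffs(n, invariants)
An = sum(f[k] * np.linalg.matrix_power(A, 3 - k) for k in range(4))
print(np.allclose(An, np.linalg.matrix_power(A, n)))   # True
```

As in the theorem, only the invariants of A enter the recursion; no eigenvector, principal vector, or Jordan form computation is required.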