Left Eigenvector of a Stochastic Matrix



Abstract
We determine the left eigenvector of a stochastic matrix M associated to the eigenvalue 1, in the commutative and the noncommutative cases. In the commutative case, we see that the eigenvector associated to the eigenvalue 0 is (N_1, ..., N_n), where N_i is the i-th principal minor of N = M - I_n and I_n is the identity matrix of dimension n. In the noncommutative case, this eigenvector is (P_1^{-1}, ..., P_n^{-1}), where P_i is the sum, as a noncommutative series, of the labels of the nonempty paths starting from i and not passing through i in the complete directed graph associated to M.

Introduction
It is well known that 1 is one of the eigenvalues of a stochastic matrix (i.e. a matrix in which the sum of the elements of each row is equal to 1), and that its associated right eigenvector is the column vector (1, ..., 1)^T. But what is the left eigenvector associated to 1?


In the commutative case, this eigenvector is the vector (N_1, ..., N_n), where N_i is the i-th principal minor of N = M - I. This formula is known in probability theory; it amounts to finding the stationary distribution of the finite Markov chain whose transition matrix is the stochastic irreducible matrix.
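The commutative statement above can be checked directly. The following sketch (the 3-state matrix M is an arbitrary example, not taken from the paper) computes the principal minors of N = M - I and verifies that they form a left null vector of N, i.e. a left eigenvector of M for the eigenvalue 1:

```python
import numpy as np

# An arbitrary 3-state stochastic matrix (rows sum to 1).
M = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
N = M - np.eye(3)

# i-th principal minor of N: determinant of N with row i and column i removed.
def principal_minor(N, i):
    idx = [k for k in range(N.shape[0]) if k != i]
    return np.linalg.det(N[np.ix_(idx, idx)])

v = np.array([principal_minor(N, i) for i in range(3)])

# v is a left eigenvector of M for the eigenvalue 1, i.e. v N = 0.
print(np.allclose(v @ N, 0))  # True
# Normalizing v gives the stationary distribution.
print(v / v.sum())
```

Normalizing by v.sum() recovers the stationary distribution of the chain.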
In the noncommutative case, we must involve inverses of elements of a skew field, and as these may be undefined, we take a generic noncommutative stochastic matrix: this is the matrix M = (a_ij) of noncommuting variables a_ij subject only to the stochastic identities, i.e. the sum of each row equals 1.
We work in the free field generated by these variables (in the sense of Paul Cohn), which we call the stochastic free field. Considering the complete digraph on the set {1, ..., n}, let M_i be the set of paths from i to i, and let P_i be the set of nonempty paths starting from i and not passing through i again. We identify P_i with the noncommutative power series equal to the sum of all the words in P_i, and we still denote this series by P_i. Next we show that the elements P_i^{-1} can be evaluated in the stochastic free field and that the vector (P_1^{-1}, ..., P_n^{-1}) is fixed by our matrix; moreover, the sum of the P_i^{-1} is equal to 1, hence they form a kind of noncommutative limiting probability.
These results have been proved in [1], but the proofs proposed in this paper are completely different. Indeed, for the major part of the proofs in these two instances, we use elementary operations on the rows and the columns of the matrix.
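As a sanity check (not part of the paper's argument), the n = 2 instance of the fixed-vector claim can be tested by evaluating the free-field expressions at generic random matrices: a rational identity that holds in the free field survives any such evaluation in which every inverse exists. Here P_1 is read, per the definition above, as a11 + a12 a22* + a12 a22* a21, and symmetrically for P_2; this reading is our own interpretation for illustration:

```python
import numpy as np

# Random d x d matrices stand in for generic noncommuting variables.
rng = np.random.default_rng(0)
d = 4
A12 = rng.normal(size=(d, d))
A21 = rng.normal(size=(d, d))
I = np.eye(d)
# Stochastic identities: each row of S = [[a11, a12], [a21, a22]] sums to 1.
A11, A22 = I - A12, I - A21

inv = np.linalg.inv
# P_1 = sum of nonempty paths starting at 1 with 1 not an inner vertex:
# a11 + a12 a22* + a12 a22* a21, and symmetrically for P_2.
P1 = A11 + A12 @ inv(I - A22) @ (I + A21)
P2 = A22 + A21 @ inv(I - A11) @ (I + A12)
v1, v2 = inv(P1), inv(P2)

# The row vector (P_1^{-1}, P_2^{-1}) is fixed by S.
print(np.allclose(v1 @ A11 + v2 @ A21, v1))  # True
print(np.allclose(v1 @ A12 + v2 @ A22, v2))  # True
```

Under the stochastic identities, P_1 simplifies to 1 + a12 a21^{-1}, so v1 = a21 (a12 + a21)^{-1}; this matches the familiar commutative stationary distribution of a 2-state chain.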

Commutative Case
The commutative case is well known in probability theory. Indeed, we calculate the limiting probability of a finite Markov chain by replacing the stochastic matrix M by N = M - I, where I is the identity matrix of the appropriate dimension. For this, we use the Markov chain tree theorem, where this calculation is expressed in terms of spanning trees. The Markov chain tree theorem gives a formula for the stationary distribution of a finite Markov chain; equivalently, this formula gives a row vector fixed by a matrix fixing the column vector (1, ..., 1)^T. This theorem is attributed to Kirchhoff by Persi Diaconis, who gives a probabilistic proof of it (see [2], pp. 443-444). See also [3,4].
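The tree-theorem formula can be illustrated by brute-force enumeration on a small chain (the 3-state matrix M is an arbitrary example): the stationary weight of a state r is the sum, over all spanning trees rooted at r with edges directed toward r, of the products of the transition probabilities along the edges.

```python
import itertools
import numpy as np

# An arbitrary 3-state stochastic matrix.
M = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
n = 3

def tree_weight(root):
    """Sum over spanning trees rooted at `root` of the product of edge weights."""
    total = 0.0
    others = [v for v in range(n) if v != root]
    # Each non-root vertex chooses a parent; keep only the choices that
    # form a tree in which every vertex reaches the root.
    for parents in itertools.product(range(n), repeat=len(others)):
        parent = dict(zip(others, parents))
        if all(p != v for v, p in parent.items()):  # no self-loops
            def reaches_root(v, seen=()):
                if v == root:
                    return True
                if v in seen:  # cycle detected
                    return False
                return reaches_root(parent[v], seen + (v,))
            if all(reaches_root(v) for v in others):
                total += np.prod([M[v, p] for v, p in parent.items()])
    return total

w = np.array([tree_weight(r) for r in range(n)])
pi = w / w.sum()
print(np.allclose(pi @ M, pi))  # True: pi is the stationary distribution
```

For this M, the unnormalized weights coincide with the principal minors of M - I, exactly as the commutative result above predicts.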
The proof uses only the definition of the determinant, which involves the principal minors.
Proposition 1. Let M be a stochastic matrix and N = M - I. Then (N_1, ..., N_n) N = 0, where N_i is the i-th principal minor of N. If M is reducible, then this proposition may only be asserting that the zero vector is an eigenvector. If M is irreducible and stochastic, then the adjugate of N will be of the form in which every row equals the same vector v, where v is a left null vector of N. The proposition follows from the cofactor formula for the adjugate.
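The cofactor argument can be made concrete (again on an arbitrary 3-state example, not from the paper): since N @ adj(N) = det(N) I = 0 and the right null space of N is spanned by (1, ..., 1)^T when M is irreducible, every column of adj(N) is constant, so all rows of adj(N) coincide and each row is the vector of principal minors (N_1, ..., N_n).

```python
import numpy as np

M = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
N = M - np.eye(3)

def adjugate(A):
    """Adjugate = transpose of the cofactor matrix, computed entrywise."""
    n = A.shape[0]
    C = np.empty_like(A)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

adjN = adjugate(N)
print(np.allclose(adjN, adjN[0]))   # True: all rows coincide
print(np.allclose(adjN[0] @ N, 0))  # True: each row is a left null vector of N
```

The common row is precisely (N_1, N_2, N_3), the left eigenvector of Proposition 1.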

Noncommutative Case
In [1], the authors proved Theorem 9 in two ways that are actually similar. The first requires results on variable-length prefix codes; the second deals with general variable-length codes, not necessarily prefix. Moreover, in Appendix 2 of [1], we see how the theory of quasideterminants may be used to obtain these results on noncommutative matrices. The major part of the new proof of this theorem involves only elementary operations on the rows and on the columns of the stochastic matrix. We therefore need the following results.

Languages and Series
Let A be a finite alphabet and A* the free monoid generated by A. A language is a subset of A*. A language is rational if it is obtained from finite languages by the (so-called rational) operations of union, product (concatenation) and star. The product of two languages L_1 and L_2 is L_1 L_2 = {w_1 w_2 | w_1 in L_1, w_2 in L_2}, and the star of L is L* = {w_1 ... w_k | k >= 0, w_i in L}. We shall identify a language with its characteristic series. This is exposed in [5] or [6]. The free field is the noncommutative analogue of the field of fractions of the ring of commutative polynomials. There are several constructions of the free field: Amitsur [7], Bergman [8], Malcolmson [9], Cohn [10] and [11].
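The rational operations can be sketched on finite truncations (the language {ab, b} and the bound MAXLEN are arbitrary choices for illustration; truncating by length keeps the star finite):

```python
# A language is a set of words; we keep only words of length <= MAXLEN.
MAXLEN = 4

def product(L1, L2):
    """Concatenation L1 L2, truncated by length."""
    return {u + v for u in L1 for v in L2 if len(u + v) <= MAXLEN}

def star(L):
    """L* = union of L^k for k >= 0; the empty word "" is L^0."""
    result, layer = {""}, {""}
    while True:
        layer = product(layer, L) - result
        if not layer:
            return result
        result |= layer

W = star({"ab", "b"})
print(sorted(W, key=lambda w: (len(w), w)))
# ['', 'b', 'ab', 'bb', 'abb', 'bab', 'bbb', 'abab', 'abbb', 'babb', 'bbab', 'bbbb']
```

Replacing sets of words by dictionaries mapping words to coefficients gives the corresponding characteristic series.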
Proposition 6. Take ..., ... and ..., where D is a skew field. We suppose that M is invertible; let M^{-1} be its inverse. Then we have ... Hence ... From this proposition, we deduce the following result.
Corollary 7. Let ...; then ... is equivalent, by elementary operations on the rows and on the columns, to a hollow matrix. (Recall that an n x n matrix is called hollow if it contains an s x t block of zeros with s + t > n; a hollow matrix is not full, hence not invertible over the free field.)
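A minimal numerical illustration of hollowness, assuming the standard definition recalled above (an s x t zero block with s + t > n forces singularity):

```python
import numpy as np

# A 3 x 3 matrix with a 2 x 2 zero block (rows 0-1 meet columns 1-2),
# so s + t = 4 > 3: the matrix is hollow.
rng = np.random.default_rng(1)
F = rng.normal(size=(3, 3))
F[0:2, 1:3] = 0.0  # plant the zero block

# A hollow matrix is singular: its first two rows live in a 1-dimensional space.
print(abs(np.linalg.det(F)) < 1e-12, np.linalg.matrix_rank(F))
```

Over the free field the analogous statement is that a hollow matrix is not full, which is what the proofs below exploit.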

Let M = (a_ij), 1 <= i, j <= n, be a generic noncommutative matrix; i.e. the a_ij are noncommuting variables. We denote by F the corresponding free field. We associate to M the matrix S: it is exactly the same matrix, but its coefficients satisfy the stochastic identities. In other words, the sum of each row of S is equal to 1; hence S is a stochastic matrix. We call S a generic noncommutative stochastic matrix.
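A small sketch, for n = 2, of how the stochastic identities can be imposed concretely: eliminate the last entry of each row, so that each row sums to 1 by construction. SymPy symbols with commutative=False model the noncommuting variables (this realization is our illustration, not the paper's construction):

```python
import sympy as sp

# Noncommuting variables; a12 and a22 are eliminated via the identities
# a12 = 1 - a11 and a22 = 1 - a21.
a11, a21 = sp.symbols('a11 a21', commutative=False)

S = sp.Matrix([[a11, 1 - a11],
               [a21, 1 - a21]])

# Each row of S sums to 1 by construction.
sums = [sp.simplify(sum(S.row(i))) for i in range(2)]
print(sums)  # [1, 1]
```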

Paths
Consider the set of nonempty paths in the complete directed graph with set of vertices {1, ..., n}; each edge (i, j) is labelled by the variable a_ij, and a path is identified with the word obtained by reading its labels. We shall also consider rational expressions over any skew field D, and say that such an expression is evaluable in D if it can be evaluated without inversion of 0. If the elements of D appearing in the rational expression actually lie in a subring R of D, we say that the expression is over R.

From this system, we deduce the two following equations; hence Equation (3) is satisfied. From Equation (4), let S_i be the system composed of the corresponding equations. Multiplying each equation of S_i on the left by P_i^{-1}, we obtain a new system, which transforms into a further system; Equation (6) is thereby converted into Equation (7). From Equation (7) and these systems, we have the matrix representation. To conclude, we show that the matrix F is hollow; for this, we apply elementary operations on the rows and on the columns of F.
Step 1. We eliminate the first row and the last column of F. We obtain the square matrix F_1.
Step 2. We want to reduce F until it contains an s x t block of 0s such that s + t >= 2n + 3.
Hence, for each index, we calculate the corresponding entries and obtain the next matrix (Steps 3-6).
Here (P_1^{-1}, ..., P_n^{-1}) is the left eigenvector associated to 1, and L_ij is the language of the paths from i to j. Written in matrix form, and using the stochastic identities, we show that F is hollow.
We have a block of zeros with s + t >= 2n + 3; it follows that F is hollow.
Theorem 12. Let P_i be defined as above; then in the stochastic free field we have
P_1^{-1} + ... + P_n^{-1} = 1. (8)
Proof. Define
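Equation (8) can be sanity-checked for n = 2 by evaluating the path sums P_i at generic random matrices, a convenient numerical stand-in for the free field (this uses our reading of P_i and is not the paper's proof):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4
A12 = rng.normal(size=(d, d))
A21 = rng.normal(size=(d, d))
I = np.eye(d)
A11, A22 = I - A12, I - A21  # stochastic identities: rows of S sum to 1
inv = np.linalg.inv

# Nonempty paths starting at i with i not an inner vertex (n = 2):
# P_1 = a11 + a12 a22* + a12 a22* a21, and symmetrically for P_2.
P1 = A11 + A12 @ inv(I - A22) @ (I + A21)
P2 = A22 + A21 @ inv(I - A11) @ (I + A12)

print(np.allclose(inv(P1) + inv(P2), I))  # True: P_1^{-1} + P_2^{-1} = 1
```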

Conclusions
One of the contributions of this article is the use of elementary operations on the rows and the columns of a matrix. This method provides a new way to obtain some results in skew field theory with a minimum of knowledge of that theory. Moreover, Theorems 9 and 12 are proved in exactly the same way.

Since we again obtain Equation (5), it follows that Equation (4) is also satisfied. Proof of Theorem 9. Let M be the matrix of the automaton and let L_i be the language of the paths starting from i and not passing through i. We have,

block of 0s in the upper right corner of F_1. For this, we consider the j-th row; we add each one of these rows to the j-th row.

Step 3. We reduce the matrix F_3, removing the last row and column if we start from the end. Consider the rows of L_i, where i > n. We eliminate these last rows and the (n - 1) last columns of F_4 to obtain the matrix F_5.

where D_m' is the matrix obtained from D_m by removing the corresponding rows and columns of F. Finally, we want the first n elements of the first column to be all 0; for this, we apply the transformation c_1 <- c_1 + c_2 + ... + c_n. We obtain the matrix F_6.

Let L_ij be the language recognized by the automaton; we prove the next result.

We repeat each step of the proof of Theorem 9, except for the last operation of Step 4, which concerns the first column. Again, we show that F is hollow.