Existence of Periodic Solutions for an Output Hidden Feedback Elman Neural Network

We first recall the sufficient conditions for the existence of a periodic output of a modified Elman neural network with a periodic input found by using Mawhin’s continuation theorem of coincidence degree theory. Using this result, we obtain sufficient conditions for the existence of a periodic output for an output hidden feedback Elman neural network with a periodic input. Examples illustrating these sufficient conditions are given.


Introduction
Artificial neural networks are computational paradigms which implement simplified models of their biological counterparts, biological neural networks. Biological neural networks are the local assemblages of neurons and their dendritic connections that form the (human) brain. Accordingly, artificial neural networks are characterized by:
• local processing in artificial neurons (or processing elements);
• massively parallel processing, implemented by a rich connection pattern between processing elements;
• the ability to acquire knowledge via learning from experience;
• knowledge storage in a distributed memory, the synaptic processing element connections.
Neural networks process information in a way similar to the human brain. The network is composed of a large number of highly interconnected processing elements (neurons) working in parallel to solve a specific problem. Sufficient conditions are known for the existence of periodic solutions of the discrete-time counterpart of a complex-valued Hopfield neural network with time-varying delays and impulses. In [19], we proved the global exponential periodicity of a class of Hopfield neural networks with distributed delays and impulses. In our recent paper [20], we obtained sufficient conditions for the existence of a periodic output of a modified Elman neural network with a periodic input by using Mawhin's continuation theorem of coincidence degree theory [21].
In the present paper, we consider an OHF Elman neural network [4] with a periodic input. Using the result of [20], we find sufficient conditions for the existence of a periodic output of the neural network considered. Furthermore, for a subclass of these OHF Elman neural networks, we shall find the periodic output in a straightforward way using another sufficient condition. Examples illustrating these sufficient conditions are given. The calculations are done using MATLAB.

Preliminaries: Modified Elman Neural Network
Here we recall the results of our paper [20]. We consider a modified Elman neural network with r nodes in the input layer, n nodes in the hidden and context layers, respectively, and m nodes in the output layer, which adds a self-feedback factor α, 0 < α < 1, in the context nodes, based on the traditional Elman neural network [4] [10]. Its mathematical model is

x(k) = f(A x_C(k) + B u(k − 1)),  (1)
x_C(k) = x(k − 1) + α x_C(k − 1),  (2)
y(k) = g(C x(k)),  (3)

for k ∈ ℕ. Here ℕ is the set of all positive integers, the input u is an r-dimensional vector, the output x of the hidden layer and the output x_C of the context nodes are n-dimensional vectors, while the output y of the output layer is an m-dimensional vector. The weights a_ij, b_ij and c_ij of the context nodes, input nodes and hidden nodes, respectively, are the entries of the n × n-, n × r- and m × n-matrices A, B and C. Clearly, for a given input u(k), k ∈ ℕ, we can find the output y(k), k ∈ ℕ, from Equations (1)-(3). Now suppose that the input u(k) is N-periodic for some N ∈ ℕ, that is, u(k + N) = u(k). Further on, for convenience, we consider Equations (1), (2) for k ∈ ℤ, that is,

x(k) = f(A x_C(k) + B u(k − 1)),  (4)
x_C(k) = x(k − 1) + α x_C(k − 1),  (5)

where ℤ is the set of all integers. We assume that u(k + N) = u(k) for k ∈ ℤ. Sufficient conditions for the existence of an N-periodic solution x(k), k ∈ ℤ, of Equations (4), (5) are given below.
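As a quick numerical illustration, the recursion can be simulated directly. The sketch below assumes the standard modified Elman form x_C(k) = x(k − 1) + α x_C(k − 1), x(k) = f(A x_C(k) + B u(k − 1)), y(k) = g(C x(k)); the dimensions, weights and transfer functions are illustrative choices, not taken from the paper.

```python
import numpy as np

def modified_elman_step(x_prev, xc_prev, u_prev, A, B, C, alpha, f, g):
    """One step of the modified Elman recursion (illustrative form):
    x_C(k) = x(k-1) + alpha * x_C(k-1),
    x(k)   = f(A x_C(k) + B u(k-1)),
    y(k)   = g(C x(k)).
    """
    xc = x_prev + alpha * xc_prev
    x = f(A @ xc + B @ u_prev)
    y = g(C @ x)
    return x, xc, y

# Toy dimensions: r = 2 input nodes, n = 3 hidden/context nodes, m = 1 output node.
rng = np.random.default_rng(0)
A = np.full((3, 3), 0.5)          # context-node weights (illustrative)
B = rng.uniform(-1, 1, (3, 2))    # input-node weights (arbitrary)
C = rng.uniform(-1, 1, (1, 3))    # hidden-node weights (arbitrary)
alpha = 0.5                       # self-feedback factor, 0 < alpha < 1
f, g = np.tanh, np.tanh           # transfer functions (illustrative)

x, xc = np.zeros(3), np.zeros(3)
for k in range(6):                # drive the network with a 3-periodic input
    u_prev = np.array([np.cos(2 * np.pi * k / 3), np.sin(2 * np.pi * k / 3)])
    x, xc, y = modified_elman_step(x, xc, u_prev, A, B, C, alpha, f, g)
```

With an N-periodic input u, this is exactly the setting in which the theorem recalled below guarantees an N-periodic output, provided the assumptions hold.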
We make the following assumptions A1-A4; in particular, A3 requires that there exist a positive integer N such that u(k + N) = u(k) for k ∈ ℤ. Further, we introduce an n × n-matrix whose entries involve the Kronecker delta δ_ij, and assume that:
A5. This matrix is an M-matrix.
Assumption A5 implies that the matrix is nonsingular and its inverse has only nonnegative entries [22] [23].
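The M-matrix assumption can be verified numerically: a matrix with nonpositive off-diagonal entries (a Z-matrix) is a nonsingular M-matrix exactly when its inverse has only nonnegative entries [22] [23]. A minimal sketch of this check follows; the test matrices are illustrative, not the matrices from the paper.

```python
import numpy as np

def is_nonsingular_m_matrix(M, tol=1e-12):
    """Check whether a Z-matrix (nonpositive off-diagonal entries) is a
    nonsingular M-matrix, using the characterization that its inverse
    must have only nonnegative entries."""
    M = np.asarray(M, dtype=float)
    off_diag = M - np.diag(np.diag(M))
    if np.any(off_diag > tol):          # must be a Z-matrix
        return False
    try:
        inv = np.linalg.inv(M)
    except np.linalg.LinAlgError:       # singular: not an M-matrix
        return False
    return bool(np.all(inv >= -tol))    # inverse entrywise nonnegative

# Illustrative test matrices.
M_good = np.array([[2.0, -1.0],
                   [-1.0, 2.0]])        # strictly diagonally dominant Z-matrix
M_bad = np.array([[1.0, -2.0],
                  [-2.0, 1.0]])         # Z-matrix whose inverse has negative entries
```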
The main result of [20] is the following theorem.
where [ν] denotes the greatest integer in the real number ν.
Example 1. (a) Further on, let us assume that the weights a_ij, i, j = 1, …, 3, of the context nodes all equal 1/2, while the transfer functions, the weights b_ij of the input nodes and the weights c_ij of the hidden nodes are arbitrary.
Then assumption A2 is satisfied, assumption A4 is also satisfied, and the matrix from assumption A5 is an M-matrix.

V. Covachev, Z. Covacheva

Since all assumptions of Theorem 1 are satisfied, the modified Elman neural network under consideration has an N-periodic output. Let us assume that, moreover, N = 3. Then the system of Equations (4), (5) takes the form of Equations (15), (16). It suffices to find the initial values. Equations (15), (16) imply Equations (18). Thus, in order to satisfy Equations (18), the initial conditions must be chosen accordingly, and the system of Equations (15), (16) reduces to a system for these initial values. We have found (approximately) the initial values satisfying Equations (18). The first 4 values of the 3-periodic solution of Equations (15), (16) are presented in Table 1.
This solution can be found with arbitrarily high accuracy.
(b) Next, let us change the value of one of the weights in Equations (15), (16). Then assumption A2 is still satisfied, assumption A4 is also satisfied, and the corresponding matrix is an M-matrix. Since all assumptions of Theorem 1 are satisfied, the modified Elman neural network under consideration has an N-periodic output. Let us assume that, moreover, N = 3. Then the system of Equations (4), (5) takes the form of Equations (24)-(27). It suffices to find the initial values (see Table 2 for the first 4 values of the 3-periodic solution of Equations (24)-(27)).
The initial values in Example 1, (a) and (b) have been found after numerous experiments with different sets of possible initial values, using MATLAB. They can be found with an arbitrarily high accuracy after sufficiently many iterations.
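The claim that the accuracy improves with the number of iterations can be illustrated on a hypothetical contractive one-dimensional map: each application of the map shrinks the distance between successive iterates by a fixed factor, so the initial value is recovered to arbitrarily high accuracy.

```python
import numpy as np

# Hypothetical contractive map standing in for the N-step map of the system:
# the contraction factor is at most 0.25, so successive errors decay geometrically.
def n_step_map(z):
    return 0.25 * np.sin(z) + 1.0

errors = []
z = 0.0
for _ in range(30):
    z_next = n_step_map(z)
    errors.append(abs(z_next - z))
    z = z_next
```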

Output Hidden Feedback Elman Neural Networks: Main Results
The OHF Elman neural network achieves the ability to process dynamic data by adding feedback from the output layer to the hidden output context layer (second context layer), based on the Elman neural network. The mathematical model of an OHF Elman neural network is [4]:

x(k) = f(A x_C(k) + B u(k − 1)),  (28)
x_C(k) = x(k − 1) + α x_C(k − 1),  (29)
y(k) = g(C x(k) + D y_C(k)),  (30)
y_C(k) = y(k − 1) + β y_C(k − 1),  (31)

where Equations (28), (29) are the same as Equations (1), (2), the transfer functions g_i of the output layer are as in Equation (3), β, 0 < β < 1, is the gain factor of the self-feedback of the output layer, the d_ij are the connection weights of the second context layer nodes (the entries of the m × m-matrix D), and y_C is the output of the second context layer. An OHF Elman neural network with r = 2, n = 3 and m = 4 is depicted in Figure 2.

Table 2. A 3-periodic solution of system (1), (2) in Case (b).
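A direct simulation of the OHF model is straightforward. The sketch below assumes the form x(k) = f(A x_C(k) + B u(k − 1)), x_C(k) = x(k − 1) + α x_C(k − 1), y(k) = g(C x(k) + D y_C(k)), y_C(k) = y(k − 1) + β y_C(k − 1), which matches the verbal description of Equations (28)-(31); all weights and transfer functions are illustrative choices.

```python
import numpy as np

def ohf_elman_step(x_prev, xc_prev, y_prev, yc_prev, u_prev,
                   A, B, C, D, alpha, beta, f, g):
    """One step of the OHF Elman recursion (illustrative form): the hidden
    and context layers are as in the modified Elman network, and the output
    layer additionally receives feedback through a second context layer."""
    xc = x_prev + alpha * xc_prev       # hidden context layer
    x = f(A @ xc + B @ u_prev)          # hidden layer
    yc = y_prev + beta * yc_prev        # second context layer
    y = g(C @ x + D @ yc)               # output layer with output feedback
    return x, xc, y, yc

# Dimensions as in Figure 2: r = 2, n = 3, m = 4 (all weights illustrative).
rng = np.random.default_rng(1)
A = rng.uniform(-0.5, 0.5, (3, 3))
B = rng.uniform(-0.5, 0.5, (3, 2))
C = rng.uniform(-0.5, 0.5, (4, 3))
D = rng.uniform(-0.5, 0.5, (4, 4))
alpha, beta = 0.5, 0.5                  # self-feedback gain factors in (0, 1)
f, g = np.tanh, np.tanh

x, xc = np.zeros(3), np.zeros(3)
y, yc = np.zeros(4), np.zeros(4)
for k in range(6):
    u_prev = np.array([1.0, np.cos(2 * np.pi * k / 3)])
    x, xc, y, yc = ohf_elman_step(x, xc, y, yc, u_prev,
                                  A, B, C, D, alpha, beta, f, g)
```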
Clearly, for a given input u(k), k ∈ ℕ, we can find the output y(k), k ∈ ℕ, from Equations (28)-(31). Now suppose that the input u(k) is N-periodic for some N ∈ ℕ, that is, u(k + N) = u(k). In addition to assumptions A1-A5, we make the following assumptions:
A7. There exist positive constants L̂_i, i = 1, …, m, such that condition (32) holds.

Journal of Software Engineering and Applications

A8.
In order to formulate our main result, we introduce an m × m-matrix and assume that:
A9. This matrix is an M-matrix.
Now we can state our main result as the following theorem.
We successively obtain Equations (43) and (44). If the matrix in Equation (44) is nonsingular, then from Equation (44) we can determine the initial values. In the following example, the matrix C of the weights of the hidden layer is arbitrary.
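Once the problem reduces to a linear equation for the initial values, the nonsingularity check and the solve are routine. The sketch below uses an illustrative stand-in for Equation (44); the matrix M and right-hand side b are hypothetical, not the ones from the paper.

```python
import numpy as np

# Illustrative stand-in for Equation (44): a linear system (I - M) y0 = b
# determining the initial values y0, solvable when I - M is nonsingular.
M = np.array([[0.2, 0.1, 0.0],
              [0.0, 0.3, 0.1],
              [0.1, 0.0, 0.2]])
b = np.array([1.0, -0.5, 0.25])

K = np.eye(3) - M
assert abs(np.linalg.det(K)) > 1e-12   # nonsingularity check
y0 = np.linalg.solve(K, b)             # initial values of the periodic output
residual = np.linalg.norm(K @ y0 - b)
```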
Assumption A8 is satisfied. We give the entries of the 8 × 8 matrix and evaluate the right-hand side of Equation (50). Finally, we find the first four values of the approximate 3-periodic solution (see Table 4). We notice that Equations (43) are satisfied much more precisely than Equations (18) in Example 1. On the other hand, the application of the method becomes much more difficult for a greater number m of nodes in the output layer.

Conclusion
We presented sufficient conditions for the existence of a periodic output of modified and OHF Elman neural networks with a periodic input. Examples illustrating the obtained results were given. The models considered can be applied to quality-of-experience prediction for services.

Conflicts of Interest
The authors declare no conflicts of interest regarding the publication of this paper.