TITLE:
Some Features of Neural Networks as Nonlinearly Parameterized Models of Unknown Systems Using an Online Learning Algorithm
AUTHORS:
Leonid S. Zhiteckii, Valerii N. Azarskov, Sergey A. Nikolaienko, Klaudia Yu. Solovchuk
KEYWORDS:
Neural Network, Nonlinear Model, Online Learning Algorithm, Lyapunov Function, Probabilistic Convergence
JOURNAL NAME:
Journal of Applied Mathematics and Physics, Vol. 6, No. 1, January 29, 2018
ABSTRACT: This paper derives the properties of an updated neural network model that is exploited to identify an unknown nonlinear system via the standard gradient learning algorithm. The convergence of this algorithm for the online training of three-layer neural networks in a stochastic environment is studied. A special case is considered in which the unknown nonlinearity can be approximated exactly by some neural network with a nonlinear activation function in its output layer. To analyze the asymptotic behavior of the learning process, the so-called Lyapunov-like approach is utilized. The expected value of the squared approximation error, regarded as a function of the network parameters, is chosen as the Lyapunov function. Within this approach, sufficient conditions guaranteeing the convergence of the learning algorithm with probability 1 are derived. Simulation results are presented to support the theoretical analysis.
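To make the setting concrete, the following is a minimal sketch (not the authors' exact algorithm or architecture) of online gradient learning for a three-layer network with a nonlinear (tanh) output activation, identifying a hypothetical unknown scalar nonlinearity `f_unknown` from noisy samples. The network sizes, noise level, and decreasing step-size schedule are illustrative assumptions; the Monte Carlo estimate of the expected squared error plays the role of the Lyapunov-like quantity discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_unknown(x):
    # Hypothetical unknown nonlinear system to be identified.
    return np.tanh(1.5 * x - 0.3)

# Three-layer network: input -> hidden (tanh) -> output (tanh).
n_hidden = 8
W1 = rng.normal(scale=0.5, size=n_hidden)
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=n_hidden)
b2 = 0.0

def forward(x):
    h = np.tanh(W1 * x + b1)          # hidden-layer activations
    y = np.tanh(W2 @ h + b2)          # nonlinear output activation
    return h, y

def expected_sq_error(n=2000):
    # Monte Carlo estimate of E[(f(x) - net(x))^2], the Lyapunov-like quantity.
    xs = rng.uniform(-1, 1, n)
    return float(np.mean([(f_unknown(x) - forward(x)[1]) ** 2 for x in xs]))

V_before = expected_sq_error()

for t in range(1, 20001):
    x = rng.uniform(-1, 1)
    d = f_unknown(x) + 0.01 * rng.normal()   # noisy observation of the system
    h, y = forward(x)
    e = y - d
    eta = 0.5 / (1.0 + 1e-4 * t)             # decreasing step size (illustrative)
    # Standard backpropagation through both tanh layers.
    delta_out = e * (1.0 - y ** 2)
    delta_hid = W2 * delta_out * (1.0 - h ** 2)
    W2 -= eta * delta_out * h
    b2 -= eta * delta_out
    W1 -= eta * delta_hid * x
    b1 -= eta * delta_hid

V_after = expected_sq_error()
print(V_before, V_after)   # the expected squared error should decrease
```

Under these assumptions the estimated expected squared error shrinks as training proceeds, which is the qualitative behavior the paper's probability-1 convergence conditions are meant to guarantee.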