In this paper, we give a smoothing neural network algorithm for absolute value equations (AVE). By using a smoothing function, we reformulate the AVE as a differentiable unconstrained optimization problem and establish a steepest descent method to solve it. We prove the stability of the neural network and show that its equilibrium states are solutions of the AVE. Numerical tests show the efficiency of the proposed algorithm.

Consider the following absolute value equation (AVE) [ ]:

$$Ax - |x| = b, \qquad (1)$$

where $A \in \mathbb{R}^{n \times n}$, $b \in \mathbb{R}^{n}$, and $|x|$ denotes the vector whose components are the absolute values $|x_i|$, $i = 1, \dots, n$.

The AVE is closely related to several important problems, for example linear programming, quadratic programming and the bimatrix game. These problems can be transformed into the linear complementarity problem (LCP), and the LCP in turn can be transformed into an absolute value equation. Due to its simple, special structure and its application value, the AVE has drawn the attention of many researchers; Mangasarian [ ], among others, has studied it extensively.

In this paper, we present a smoothing-function-based neural network method to solve the AVE. By using a smooth approximation of $|x|$, we reformulate the AVE as a differentiable unconstrained optimization problem, which is then solved by a steepest descent neural network.

The absolute value Equation (1) is equivalent to the nonlinear equations

$$F(x) := Ax - |x| - b = 0, \qquad (2)$$

where $F : \mathbb{R}^{n} \to \mathbb{R}^{n}$.

Definition 1.1 (Smoothing approximation function). Given a function $g : \mathbb{R}^{n} \to \mathbb{R}$, we call $g(\cdot, \mu) : \mathbb{R}^{n} \to \mathbb{R}$ a smoothing approximation function of $g$ if $g(\cdot, \mu)$ is continuously differentiable for every $\mu > 0$ and

$$\lim_{\mu \downarrow 0} g(x, \mu) = g(x) \quad \text{for all } x \in \mathbb{R}^{n},$$

where $\mu > 0$ is the smoothing parameter.

In this paper, we use the aggregate function [ ] to smooth the absolute value $|t| = \max\{t, -t\}$.

Let

$$p(t, \mu) = \mu \ln\left( e^{t/\mu} + e^{-t/\mu} \right), \qquad t \in \mathbb{R},\ \mu > 0.$$

For any $\mu > 0$, $p(\cdot, \mu)$ is continuously differentiable and

$$0 < p(t, \mu) - |t| = \mu \ln\left( 1 + e^{-2|t|/\mu} \right) \le \mu \ln 2.$$

So the absolute value function $|x|$ can be approximated componentwise by $\Phi(x, \mu) = \left( p(x_1, \mu), \dots, p(x_n, \mu) \right)^{T}$.

Thus the absolute value equation is transformed into the following smooth equations

$$F_{\mu}(x) := Ax - \Phi(x, \mu) - b = 0, \qquad (3)$$

where $\mu > 0$ is the smoothing parameter and $\Phi(x, \mu)$ is the componentwise smoothing of $|x|$ defined above.

We define the function as follows:

$$f(x) = \frac{1}{2} \left\| F_{\mu}(x) \right\|^{2} = \frac{1}{2} \left\| Ax - \Phi(x, \mu) - b \right\|^{2},$$

where $f$ serves as the energy function of the neural network. Thus, finding an approximate solution of the absolute value equation is transformed into finding the global optimal solution of an optimization problem.

Consider the following unconstrained optimization problem:

$$\min_{x \in \mathbb{R}^{n}} f(x) = \frac{1}{2} \left\| Ax - \Phi(x, \mu) - b \right\|^{2}. \qquad (4)$$

The gradient can be calculated by the following formula:

$$\nabla f(x) = \left( A - D(x, \mu) \right)^{T} \left( Ax - \Phi(x, \mu) - b \right),$$

where $D(x, \mu) = \mathrm{diag}\left( \tanh(x_1/\mu), \dots, \tanh(x_n/\mu) \right)$ collects the partial derivatives $\partial p(x_i, \mu)/\partial x_i = \tanh(x_i/\mu)$.
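
The analytic gradient $\nabla f(x) = (A - D(x, \mu))^{T}(Ax - \Phi(x, \mu) - b)$ can be checked against finite differences; the sketch below (a small random instance with data and parameters of our choosing, not from the paper) does exactly that:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu = 4, 0.5
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

def Phi(x):
    # componentwise smoothing of |x| via the aggregate function
    return mu * np.logaddexp(x / mu, -x / mu)

def f(x):
    # f(x) = 0.5 * ||A x - Phi(x) - b||^2
    r = A @ x - Phi(x) - b
    return 0.5 * r @ r

def grad_f(x):
    # grad f(x) = (A - D(x, mu))^T (A x - Phi(x) - b), D = diag(tanh(x_i/mu))
    D = np.diag(np.tanh(x / mu))
    return (A - D).T @ (A @ x - Phi(x) - b)

x = rng.standard_normal(n)
# central finite differences, one coordinate direction at a time
fd = np.array([(f(x + 1e-6 * e) - f(x - 1e-6 * e)) / 2e-6 for e in np.eye(n)])
print(np.max(np.abs(fd - grad_f(x))))  # agreement up to finite-difference error
```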

Now, we can give a neural network model for solving the absolute value equation, based on the steepest descent direction for (4):

$$\frac{dx(t)}{dt} = -\rho \nabla f(x(t)), \qquad x(t_0) = x_0, \qquad (5)$$

where $\rho > 0$ is a scaling parameter and $x_0$ is the initial state of the network.
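
The flow (5) can be simulated by discretizing it, e.g. with an explicit Euler scheme. The sketch below (the matrix, step size and iteration count are illustrative choices of ours, not taken from the paper) integrates $dx/dt = -\rho \nabla f(x)$ on a small AVE constructed to have the solution $(1, -2)^{T}$:

```python
import numpy as np

mu, rho, h = 0.1, 1.0, 0.005              # smoothing, scaling, Euler step (our choices)
A = np.array([[4.0, 1.0], [1.0, 5.0]])    # small AVE: A x - |x| = b
x_true = np.array([1.0, -2.0])
b = A @ x_true - np.abs(x_true)           # constructed so x_true solves the AVE

def grad_f(x):
    # grad f(x) = (A - D(x, mu))^T (A x - Phi(x, mu) - b)
    Phi = mu * np.logaddexp(x / mu, -x / mu)
    D = np.diag(np.tanh(x / mu))
    return (A - D).T @ (A @ x - Phi - b)

x = np.zeros(2)                           # initial state x(t0)
for _ in range(10000):                    # Euler steps for dx/dt = -rho * grad f(x)
    x = x - h * rho * grad_f(x)
print(x)                                  # converges to approximately (1, -2)
```

Here the smallest singular value of $A$ exceeds 1, so $A - D(x, \mu)$ is nonsingular for every $x$ and the trajectory converges to the unique solution; in general the step size plays the role of the continuous-time scaling $\rho$.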

Next, we recall some material about first order ordinary differential equations (ODE) [ ] of the form

$$\frac{dx(t)}{dt} = g(x(t)), \qquad x(t_0) = x_0,$$

where $g : \mathbb{R}^{n} \to \mathbb{R}^{n}$ is a continuous mapping.

Definition 3.1 A point $x^{*}$ is called an equilibrium point of (5)

if $\nabla f(x^{*}) = 0$; if there is a neighborhood $\Omega^{*}$ of $x^{*}$ containing no other equilibrium point,

then $x^{*}$ is called an isolated equilibrium point.

Lemma 3.1 Assume that $g$ is continuous. Then for any $x_0 \in \mathbb{R}^{n}$ there exists a local solution $x(t)$, $t \in [t_0, \tau)$, of the above ODE for some $\tau > t_0$; if, in addition, $g$ is locally Lipschitz continuous at $x_0$, then this solution is unique.

Definition 3.2 (Asymptotic stability). An isolated equilibrium point $x^{*}$ is said to be asymptotically stable if,

in addition to being Lyapunov stable, it has the property that $x(t) \to x^{*}$ as $t \to \infty$ whenever $\| x(t_0) - x^{*} \|$ is sufficiently small.

Definition 3.3 (Lyapunov stability). Let $x(t)$ be the solution of (5) with initial state $x(t_0) = x_0$. An isolated equilibrium point $x^{*}$ is Lyapunov stable if, for any $\varepsilon > 0$, there exists $\delta > 0$

such that $\| x(t_0) - x^{*} \| < \delta$ implies $\| x(t) - x^{*} \| < \varepsilon$ for all $t \ge t_0$.

Definition 3.4 (Lyapunov function). Let $\Omega \subseteq \mathbb{R}^{n}$ be an open neighborhood of $\bar{x}$. A continuously differentiable function $V : \mathbb{R}^{n} \to \mathbb{R}$ is said to be a Lyapunov function at the state $\bar{x}$ over $\Omega$ for (5) if $V(\bar{x}) = 0$, $V(x) > 0$ for all $x \in \Omega \setminus \{\bar{x}\}$, and $\frac{dV(x(t))}{dt} \le 0$ for all $x \in \Omega$.

Lemma 3.2 a) An isolated equilibrium point $x^{*}$ is Lyapunov stable if there exists a Lyapunov function over some neighborhood $\Omega^{*}$ of $x^{*}$.

b) An isolated equilibrium point $x^{*}$ is asymptotically stable if there exists a Lyapunov function over some

neighborhood $\Omega^{*}$ of $x^{*}$ such that $\frac{dV(x(t))}{dt} < 0$ for all $x \in \Omega^{*} \setminus \{x^{*}\}$.

Lemma 3.3 [

Theorem 3.1 Let $x^{*}$ be an isolated equilibrium point of the neural network (5) that solves the smooth Equation (3). Then $V(x) := f(x)$ is a Lyapunov function at $x^{*}$ over some neighborhood $\Omega^{*}$ for (5).

Proof. Let $V(x) = f(x)$ and let $\Omega^{*}$ be a neighborhood of $x^{*}$ containing no other equilibrium point.

1) The function $V$ is continuously differentiable, since $\Phi(\cdot, \mu)$ is smooth.

2) Since $x^{*}$ solves (3), $V(x^{*}) = \frac{1}{2}\|F_{\mu}(x^{*})\|^{2} = 0$; moreover $V(x) > 0$ for all $x \in \Omega^{*} \setminus \{x^{*}\}$, because $f(x) = 0$ would force $\nabla f(x) = 0$ and give a further equilibrium point in $\Omega^{*}$.

3) If $x(t)$ is any solution of (5), then

$$\frac{dV(x(t))}{dt} = \nabla f(x(t))^{T} \frac{dx(t)}{dt} = -\rho \left\| \nabla f(x(t)) \right\|^{2} \le 0.$$

So, by the Definition 3.4 we know that $V$ is a Lyapunov function at $x^{*}$ over $\Omega^{*}$ for (5). ∎

Theorem 3.2 Each solution of the absolute value equation is the equilibrium point of the neural network (5).

Conversely, if $x^{*}$ is an equilibrium point of (5) and the matrix $A - D(x^{*}, \mu)$ is nonsingular, then $x^{*}$ is a solution of the smooth Equation (3), and hence an approximate solution of the absolute value equation.

Proof. Assume that $x^{*}$ is a solution of the absolute value equation, i.e. $Ax^{*} - |x^{*}| - b = 0$. Since each component of $\Phi(x^{*}, \mu)$ differs from the corresponding component of $|x^{*}|$ by at most $\mu \ln 2$, we have $F_{\mu}(x^{*}) \to 0$ as $\mu \downarrow 0$.

Obviously, we get $\nabla f(x^{*}) = (A - D(x^{*}, \mu))^{T} F_{\mu}(x^{*}) \to 0$, so $x^{*}$ is an equilibrium point of (5) in the limit. On the other

hand, if $x^{*}$ is an equilibrium point of (5), then $(A - D(x^{*}, \mu))^{T} F_{\mu}(x^{*}) = 0$.

Next, we can prove that $x^{*}$ solves (3): since $A - D(x^{*}, \mu)$ is nonsingular, it follows that $F_{\mu}(x^{*}) = 0$. ∎

Theorem 3.3 Let $x^{*}$ be an isolated equilibrium point of the neural network (5) such that $A - D(x^{*}, \mu)$ is nonsingular. Then $x^{*}$ is asymptotically stable.

Proof. Since $x^{*}$ is an equilibrium point of (5) with $A - D(x^{*}, \mu)$ nonsingular, the fact that $x^{*}$ solves the smooth equation is known by the Theorem 3.2. Therefore, $f(x^{*}) = 0$. Moreover, $x^{*}$ is an isolated equilibrium

point, so $\nabla f(x) \ne 0$ and hence $f(x) > 0$ for all $x$ in some neighborhood $\Omega^{*} \setminus \{x^{*}\}$. By

Theorem 3.1 we know that $V = f$ is a Lyapunov function at $x^{*}$ over $\Omega^{*}$, and for $x \in \Omega^{*} \setminus \{x^{*}\}$,

$$\frac{dV(x(t))}{dt} = -\rho \left\| \nabla f(x(t)) \right\|^{2} < 0.$$

Consequently, we have by Lemma 3.2 b) that $x^{*}$ is asymptotically stable. ∎

In this section we give some numerical tests of the neural network algorithm. Since the linear complementarity problem can be transformed into an absolute value equation, we take linear complementarity problems, rewritten as equivalent absolute value equations, as test cases.

For a given matrix $M \in \mathbb{R}^{n \times n}$ and vector $q \in \mathbb{R}^{n}$, the linear complementarity problem LCP$(q, M)$ is to find $z \in \mathbb{R}^{n}$ such that

$$z \ge 0, \qquad Mz + q \ge 0, \qquad z^{T}(Mz + q) = 0.$$

From the Theorem 2 in the literature [ ], provided that 1 is not an eigenvalue of $M$, the LCP can be reduced to the absolute value equation $Ax - |x| = b$,

where $A = (M - I)^{-1}(M + I)$ and $b = (M - I)^{-1} q$; a solution $x$ of the AVE then yields the LCP solution $z = |x| - x$.
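
This reduction can be verified numerically. In the sketch below (the matrix $M$ and the complementary pair $(z, w)$ are illustrative data of ours), we build an LCP with a known solution, form $A = (M - I)^{-1}(M + I)$ and $b = (M - I)^{-1} q$, and check that $x = (w - z)/2$ solves the AVE and recovers $z = |x| - x$:

```python
import numpy as np

# Illustrative LCP data (ours, not from the paper): find z >= 0 with
# w = M z + q >= 0 and z^T w = 0.  Eigenvalues of M are 3 and 2, so 1 is
# not an eigenvalue and the reduction applies.
M = np.array([[3.0, 1.0], [0.0, 2.0]])
z = np.array([2.0, 0.0])                  # chosen complementary solution
w = np.array([0.0, 5.0])                  # z >= 0, w >= 0, z^T w = 0
q = w - M @ z                             # makes (z, w) solve LCP(q, M)

I = np.eye(2)
A = np.linalg.solve(M - I, M + I)         # A = (M - I)^{-1} (M + I)
b = np.linalg.solve(M - I, q)             # b = (M - I)^{-1} q

x = (w - z) / 2                           # then |x| = (w + z) / 2
print(np.allclose(A @ x - np.abs(x), b))  # x solves the AVE
print(np.allclose(np.abs(x) - x, z))      # z is recovered as |x| - x
```

The change of variables behind this check is $w = |x| + x$, $z = |x| - x$, which is valid exactly because $z$ and $w$ are nonnegative and componentwise complementary.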

Example 1 [

Since 1 is not included in the eigenvalues of

where

We can find that

Using the neural network model, with the initial point generated by x0 = rand(n,1), the program was run under MATLAB 7.11.0. The following two figures show the behavior of the network on this example.

The corresponding LCP solution is then recovered as $z^{*} = |x^{*}| - x^{*}$.

Example 2 [

Through calculation, we can get one eigenvalue of

Set

And the absolute value equation is

Thus, we can get one solution of the absolute value equation, which is

The corresponding LCP solution is then recovered as $z^{*} = |x^{*}| - x^{*}$.

Example 3. Consider the following linear complementary problem:

Through calculation we can get one eigenvalue of

And the absolute value equation is:

Thus, we can get the solution of the absolute value equation which is

Since $z^{*} = |x^{*}| - x^{*}$, the corresponding solution of the linear complementarity problem is $z^{*} = (2.5, 2.5, 0, 2.5)^{T}$.

This paper adopted the aggregate function method to smooth the absolute value equation, turning it into a differentiable unconstrained optimization problem. To obtain an approximate solution of the original problem, we used the proposed neural network model to solve this unconstrained optimization problem. At the same time, the proposed neural network is based on a different energy function: through the transformation between the linear complementarity problem and the absolute value equation, it can also be used to solve the linear complementarity problem, and compared with the traditional energy functions based on NCP functions, it avoids a large amount of matrix computation. Numerical examples show that the algorithm is effective for solving this kind of absolute value equation, and the accuracy of the solution can be controlled entirely by the smoothing parameter. In view of the fact that the absolute value equation is relatively difficult to solve, the method proposed in this paper can be applied to absolute value problems effectively.

This work is supported by National Natural Science Foundation of China (No.11171221) and Innovation Program of Shanghai Municipal Education Commission (No.14YZ094).

Feiran Wang, Zhensheng Yu and Chang Gao (2015) A Smoothing Neural Network Algorithm for Absolute Value Equations. Engineering, 7, 567-576. doi: 10.4236/eng.2015.79052