Journal of Applied Mathematics and Physics, 2016, 4, 1024-1031
ISSN 2327-4352. Scientific Research Publishing. DOI: 10.4236/jamp.2016.46107

A New Conjugate Gradient Projection Method for Solving Stochastic Generalized Linear Complementarity Problems

Zhimin Liu, Shouqiang Du, Ruiying Wang
College of Mathematics and Statistics, Qingdao University, Qingdao, China

Received 12 May 2016; accepted 10 June 2016; published 13 June 2016

Copyright © 2016 by authors and Scientific Research Publishing Inc. This work is licensed under the Creative Commons Attribution International License (CC BY). http://creativecommons.org/licenses/by/4.0/

In this paper, a class of the stochastic generalized linear complementarity problems with finitely many elements is proposed for the first time. Based on the Fischer-Burmeister function, a new conjugate gradient projection method is given for solving the stochastic generalized linear complementarity problems. The global convergence of the conjugate gradient projection method is proved and the related numerical results are also reported.

Keywords: Stochastic Generalized Linear Complementarity Problems; Fischer-Burmeister Function; Conjugate Gradient Projection Method; Global Convergence
1. Introduction

Suppose $(\Omega, \mathcal{F}, P)$ is a probability space, where $P$ is a known probability distribution. The stochastic generalized linear complementarity problem (denoted by SGLCP) is to find $x \in \mathbb{R}^{n}$ such that

$$M(\omega)x + q(\omega) \ge 0, \quad N(\omega)x + p(\omega) \ge 0, \quad \big(M(\omega)x + q(\omega)\big)^{T}\big(N(\omega)x + p(\omega)\big) = 0, \quad \omega \in \Omega, \qquad (1)$$

where $M(\omega), N(\omega) \in \mathbb{R}^{n \times n}$ and $q(\omega), p(\omega) \in \mathbb{R}^{n}$ for $\omega \in \Omega$ are random matrices and vectors. When $N(\omega) = I$ and $p(\omega) = 0$, the stochastic generalized linear complementarity problem reduces to the classic stochastic linear complementarity problem (SLCP), which has been studied in [1]-[7]. Generally, the expected value (EV) method and the expected residual minimization (ERM) method are applied to solve this kind of problem.

If $\Omega$ contains only a single realization $\omega_{0}$, then (1) reduces to the following standard generalized linear complementarity problem (GLCP), which is to find a vector $x \in \mathbb{R}^{n}$ such that

$$Mx + q \ge 0, \quad Nx + p \ge 0, \quad (Mx + q)^{T}(Nx + p) = 0,$$

where $M = M(\omega_{0})$, $N = N(\omega_{0})$, $q = q(\omega_{0})$ and $p = p(\omega_{0})$.
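As a concrete illustration, the GLCP conditions can be checked numerically. The small instance below is made up for illustration only (the matrices $M$, $N$ and vectors $q$, $p$ are not taken from this paper):

```python
import numpy as np

def is_glcp_solution(M, N, q, p, x, tol=1e-10):
    """Check the GLCP conditions: Mx + q >= 0, Nx + p >= 0, (Mx+q)^T (Nx+p) = 0."""
    u = M @ x + q
    v = N @ x + p
    return bool(np.all(u >= -tol) and np.all(v >= -tol) and abs(u @ v) <= tol)

# Hypothetical 2x2 data chosen so that x = (1, 1) is a solution:
# u = (1, 0) and v = (0, 1) are nonnegative and orthogonal.
M = np.eye(2); q = np.array([0.0, -1.0])
N = np.eye(2); p = np.array([-1.0, 0.0])
x = np.array([1.0, 1.0])
print(is_glcp_solution(M, N, q, p, x))  # True
```

For $x = (2, 2)$ the same data gives $u^{T}v = 4 \ne 0$, so the complementarity condition fails even though both inequalities hold.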

In this paper, we consider the following stochastic generalized linear complementarity problem. Denote $\Omega = \{\omega_{1}, \omega_{2}, \ldots, \omega_{m}\}$; the problem is to find an $x \in \mathbb{R}^{n}$ such that

$$M(\omega_{i})x + q(\omega_{i}) \ge 0, \quad N(\omega_{i})x + p(\omega_{i}) \ge 0, \quad \big(M(\omega_{i})x + q(\omega_{i})\big)^{T}\big(N(\omega_{i})x + p(\omega_{i})\big) = 0, \quad i = 1, \ldots, m. \qquad (2)$$

With the stacked notation for the $m$ realizations, (2) is equivalent to the system (3) and (4).

In the remainder of this paper, we give a new conjugate gradient projection method for solving (2). The method is based on a suitable reformulation. By the Fischer-Burmeister function, $x$ is a solution of (3) if and only if $\Phi(x) = 0$, where $\Phi$ applies

$$\phi(a, b) = \sqrt{a^{2} + b^{2}} - a - b$$

componentwise to the complementary pairs in (3).

Define

$$f(x) = \frac{1}{2}\|\Phi(x)\|^{2}.$$

Then solving (3) is equivalent to finding a global solution of the minimization problem

$$\min_{x} f(x).$$
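The key property of the Fischer-Burmeister function is that it vanishes exactly on the complementary pairs, which is what makes the equation reformulation valid. A minimal sketch:

```python
import math

def fb(a, b):
    """Fischer-Burmeister function: phi(a, b) = sqrt(a^2 + b^2) - a - b.
    phi(a, b) = 0  if and only if  a >= 0, b >= 0 and a * b = 0."""
    return math.hypot(a, b) - a - b

# Complementary pairs vanish:
print(fb(0.0, 2.0))   # 0.0
print(fb(3.0, 0.0))   # 0.0
# A pair violating complementarity (here a < 0) gives a nonzero value:
print(fb(-1.0, 2.0))  # sqrt(5) + 1 - 2, approximately 1.236
```

Stacking `fb` over all complementary pairs of (3) yields the residual map $\Phi$, and summing the squared components gives the merit value $f(x)$.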

So, (3) and (4) can be rewritten as the system of equations

$$H(t) = 0, \qquad (5)$$

where $t = (x^{T}, y^{T})^{T}$, $x \in \mathbb{R}^{n}$, and $y$ is a slack variable with $y \ge 0$. Then we know that $H(t) = 0$ has as many equations as variables.

Let $\Omega = \{t = (x^{T}, y^{T})^{T} : y \ge 0\}$ and define a merit function of (5) by

$$\theta(t) = \frac{1}{2}\|H(t)\|^{2}.$$

If (2) has a solution, then solving (5) is equivalent to finding a global solution of the following minimization problem

$$\min_{t \in \Omega} \theta(t), \qquad (6)$$

where $\Omega$ is the feasible set defined above.

2. Preliminaries

In this section, we give some lemmas, which are taken from [8]-[11].

Lemma 1. Let $P_{\Omega}$ denote the projection onto $\Omega$. Then:

1) $\big(z - P_{\Omega}(z)\big)^{T}\big(v - P_{\Omega}(z)\big) \le 0$ for all $z \in \mathbb{R}^{n}$ and $v \in \Omega$.

2) $P_{\Omega}$ is a non-expansive operator, that is, $\|P_{\Omega}(z) - P_{\Omega}(u)\| \le \|z - u\|$ for all $z, u \in \mathbb{R}^{n}$.

3) $t \in \Omega$ if and only if $P_{\Omega}(t) = t$.

Lemma 2. Let $\nabla_{\Omega}\theta(t) = P_{\Omega}\big(t - \nabla\theta(t)\big) - t$ be the projected gradient of $\theta$ at $t \in \Omega$. Then:

1) $-\nabla\theta(t)^{T}\nabla_{\Omega}\theta(t) \ge \|\nabla_{\Omega}\theta(t)\|^{2}$.

2) The mapping $t \mapsto \|\nabla_{\Omega}\theta(t)\|$ is lower semicontinuous on $\Omega$, that is, if $t^{k} \to t^{*}$, then

$$\|\nabla_{\Omega}\theta(t^{*})\| \le \liminf_{k \to \infty} \|\nabla_{\Omega}\theta(t^{k})\|.$$

3) The point $t^{*} \in \Omega$ is a stationary point of problem (6) $\Leftrightarrow$ $\nabla_{\Omega}\theta(t^{*}) = 0$.
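Both lemmas can be checked numerically when $\Omega$ is the nonnegative orthant, the case induced by the slack constraint $y \ge 0$, where the projection is the componentwise positive part. The vectors below are arbitrary illustrative values, and the gradient `g` is a stand-in rather than the gradient of the paper's merit function:

```python
import numpy as np

def project(z):
    """Projection onto the nonnegative orthant: componentwise max(z, 0)."""
    return np.maximum(z, 0.0)

# Lemma 1 2): the projection is non-expansive.
z = np.array([1.5, -2.0, 0.3])
u = np.array([-0.5, 4.0, -1.0])
print(np.linalg.norm(project(z) - project(u)) <= np.linalg.norm(z - u))  # True

# Lemma 2 1): with d = P(t - g) - t, we have -g^T d >= ||d||^2,
# so d is a descent direction whenever it is nonzero.
t = np.array([0.5, 1.0, 0.0])   # a feasible point (t >= 0)
g = np.array([2.0, -1.0, 0.5])  # a stand-in gradient value
d = project(t - g) - t
print(-g @ d >= d @ d - 1e-12)  # True
```

Here $d = (-0.5, 1.0, 0)$, $\|d\|^{2} = 1.25$ and $-g^{T}d = 2$, so the inequality of Lemma 2 1) holds strictly.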

3. The Conjugate Gradient Projection Method and Its Convergence Analysis

Given an iterate $t^{k}$, we let the next iterate be given by (7), where $\alpha_{k} > 0$ is the stepsize. Inspired by the literature [8]-[11], we take the conjugate gradient parameter as in (8), and the search direction $d^{k}$ is defined by (9).

Method 1. Conjugate Gradient Projection Method (CGPM)

Step 0: Choose an initial point $t^{0} \in \Omega$ and the parameters of the method; set $k := 0$.

Step 1: Compute the search direction $d^{k}$ by (9) and a stepsize $\alpha_{k}$ satisfying the line search conditions. Set $t^{k+1} = t^{k} + \alpha_{k} d^{k}$.

Step 2: If $\|\nabla_{\Omega}\theta(t^{k+1})\| \le \varepsilon$, stop and output $t^{k+1}$.

Step 3: Let $k := k + 1$, and go to Step 1.
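The steps above can be sketched in code. This simplified version uses the projected-gradient direction in place of the conjugate direction (9) and an Armijo backtracking line search; the parameter values, test function and stopping rule are illustrative assumptions, not the exact choices of the paper:

```python
import numpy as np

def project(z):
    # Omega = nonnegative orthant, as induced by the slack variables y >= 0.
    return np.maximum(z, 0.0)

def cgpm_sketch(theta, grad, t0, sigma=0.1, rho=0.5, eps=1e-8, max_iter=1000):
    t = project(np.asarray(t0, dtype=float))       # Step 0: feasible start
    for _ in range(max_iter):
        g = grad(t)
        d = project(t - g) - t                     # projected-gradient direction
        if np.linalg.norm(d) <= eps:               # Step 2: stationary point reached
            break
        # Step 1: Armijo backtracking; g @ d < 0 whenever d != 0 (Lemma 2 1)),
        # so the loop terminates for a smooth theta.
        alpha = 1.0
        while theta(t + alpha * d) > theta(t) + sigma * alpha * (g @ d):
            alpha *= rho
        # Step 3: update; t + alpha*d stays in Omega since it is a convex
        # combination of the feasible points t and project(t - g).
        t = t + alpha * d
    return t

# Illustrative problem: theta(t) = 0.5 * ||t - c||^2 over the nonnegative orthant;
# the minimizer is the projection of c onto the orthant.
c = np.array([1.0, -2.0])
t_star = cgpm_sketch(lambda t: 0.5 * np.dot(t - c, t - c), lambda t: t - c, [5.0, 5.0])
print(t_star)  # approximately [1., 0.]
```

The convex-combination argument in Step 3 is why the iterates remain feasible without an extra projection of $t^{k+1}$: with $\alpha \in (0, 1]$, $t + \alpha d = (1 - \alpha)t + \alpha P_{\Omega}(t - g)$ lies in the convex set $\Omega$.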

In order to prove the global convergence of Method 1, we make the following assumptions.

Assumptions 1

1) $\theta(t)$ is bounded below on the level set $L_{0} = \{t \in \Omega : \theta(t) \le \theta(t^{0})\}$, where $t^{0}$ is the initial point.

2) $\theta$ is continuously differentiable on $L_{0}$, and its gradient is Lipschitz continuous; that is, there exists a positive constant $L$ such that

$$\|\nabla\theta(u) - \nabla\theta(v)\| \le L\|u - v\|, \quad \forall u, v \in L_{0}.$$

Lemma 3. If $t^{k}$ is not a stationary point of (6), then the search direction $d^{k}$ generated by (9) is a descent direction, that is, $\nabla\theta(t^{k})^{T} d^{k} < 0$.

Proof. This follows from (7), Lemma 1, and (8).

Lemma 4. Suppose that Assumptions 1 hold, $\theta$ is continuously differentiable and bounded below on $\Omega$, $\nabla\theta$ is uniformly continuous on $\Omega$, and $\{d^{k}\}$ is bounded. Then the sequences generated by Method 1 satisfy the limit conditions (10) and (11).

Theorem 1. Let $\theta$ be continuously differentiable and bounded below on $\Omega$, and let $\nabla\theta$ be uniformly continuous on $\Omega$. If $\{t^{k}\}$ is the sequence generated by Method 1, then $\lim_{k \to \infty} \|\nabla_{\Omega}\theta(t^{k})\| = 0$, and any accumulation point of $\{t^{k}\}$ is a stationary point of (6).

Proof. By Lemma 2, the quantities $t^{k}$, $d^{k}$ and $\alpha_{k}$ generated by Method 1 satisfy the projection inequality of Lemma 1 1). For each $k$, by Lemma 1, we know that $t^{k} \in \Omega$, and we obtain (11). Letting $k \to \infty$ in (11) gives (12). By the above formula, (8) and Lemma 1, we get (13). Taking limits on both sides and applying Lemma 4, we obtain (14). By (12), (13), (14) and the uniform continuity of $\nabla\theta$ on $\Omega$, we get (15). By (10) and (15), together with Lemma 2 and the lower semicontinuity of $\|\nabla_{\Omega}\theta(\cdot)\|$, we have $\lim_{k \to \infty} \|\nabla_{\Omega}\theta(t^{k})\| = 0$. From Lemma 2 3), any accumulation point of $\{t^{k}\}$ is a stationary point of (6).

4. Numerical Results

In this section, we report numerical results of the conjugate gradient projection method for the following test problems, which are given here for the first time. We use different initial points $t^{0}$, which indicates that Method 1 is globally convergent.

Throughout the computational experiments, the parameters of Method 1 are set to fixed values. The stopping criterion for the method is $\|\nabla_{\Omega}\theta(t^{k})\| \le \varepsilon$ or the number of iterations reaching 100,000.

In the table of test results, $t^{0}$ denotes the initial point, $t^{*}$ denotes the computed solution, val denotes the final value of $\theta(t)$, and Itr denotes the number of iterations.

Example 1. Consider the SGLCP (2) with given matrices $M(\omega_{i})$, $N(\omega_{i})$ and vectors $q(\omega_{i})$, $p(\omega_{i})$.

The test results using different initial points are listed in Table 1.

Table 1. Results of the numerical Examples 1-2 using Method 1.

Problem      val          Itr
Example 1    3.3 × 10−3   1465
             3.3 × 10−3   1701
             3.3 × 10−3   2670
             3.3 × 10−3   3261
             3.3 × 10−3   3847
             3.3 × 10−3   4704
Example 2    0.7299       62,788
             0.7299       65,528
             0.7299       66,962
             0.7299       100,000
             0.7299       100,000
             0.7299       100,000

Example 2. Consider the SGLCP (2) with given matrices $M(\omega_{i})$, $N(\omega_{i})$ and vectors $q(\omega_{i})$, $p(\omega_{i})$.

The test results using different initial points are listed in Table 1.

5. Conclusion

In this paper, we have presented a new conjugate gradient projection method for solving stochastic generalized linear complementarity problems. The global convergence of the method is analyzed, and the numerical results show that Method 1 is effective. In future work, large-scale stochastic generalized linear complementarity problems remain to be studied.

Acknowledgements

This work is supported by the National Natural Science Foundation of China (Nos. 11101231, 11401331), the Natural Science Foundation of Shandong Province (No. ZR2015AQ013) and the Key Issues of Statistical Research of Shandong Province (KT15173).

Cite this paper

Liu, Z., Du, S. and Wang, R. (2016) A New Conjugate Gradient Projection Method for Solving Stochastic Generalized Linear Complementarity Problems. Journal of Applied Mathematics and Physics, 4, 1024-1031. doi: 10.4236/jamp.2016.46107

References

[1] Chen, X. and Fukushima, M. (2005) Expected Residual Minimization Method for Stochastic Linear Complementarity Problems. Mathematics of Operations Research, 30, 1022-1038. http://dx.doi.org/10.1287/moor.1050.0160
[2] Chen, X., Zhang, C. and Fukushima, M. (2009) Robust Solution of Monotone Stochastic Linear Complementarity Problems. Mathematical Programming, 117, 51-80. http://dx.doi.org/10.1007/s10107-007-0163-z
[3] Lin, G.H. and Fukushima, M. (2006) New Reformulations for Stochastic Nonlinear Complementarity Problems. Optimization Methods and Software, 21, 551-564. http://dx.doi.org/10.1080/10556780600627610
[4] Lin, G.H., Chen, X. and Fukushima, M. (2010) New Restricted NCP Functions and Their Applications to Stochastic NCP and Stochastic MPEC. Optimization, 56, 641-653. http://dx.doi.org/10.1080/02331930701617320
[5] Ling, C., Qi, L., Zhou, G. and Caccetta, L. (2008) The SC1 Property of an Expected Residual Function Arising from Stochastic Complementarity Problems. Operations Research Letters, 36, 456-460. http://dx.doi.org/10.1016/j.orl.2008.01.010
[6] Fang, H.T., Chen, X.J. and Fukushima, M. (2007) Stochastic Matrix Linear Complementarity Problems. SIAM Journal on Optimization, 18, 482-506. http://dx.doi.org/10.1137/050630805
[7] Gürkan, G., Özge, A.Y. and Robinson, S.M. (1999) Sample-Path Solution of Stochastic Variational Inequalities. Mathematical Programming, 84, 313-333. http://dx.doi.org/10.1007/s101070050024
[8] Sun, Q.Y., Wang, C.Y. and Shi, Z.J. (2006) Global Convergence of a Modified Gradient Projection Method for Convex Constrained Problems. Acta Mathematicae Applicatae Sinica, 22, 227-242. http://dx.doi.org/10.1007/s10255-006-0299-2
[9] Wang, C.Y. and Qu, B. (2002) Convergence of the Gradient Projection Method with a New Stepsize Rule. Operations Research Transactions, 6, 36-44.
[10] Sun, Q.Y., Gao, B., Jian, L. and Wang, C.Y. (2010) Modified Conjugate Gradient Projection Method for Nonlinear Constrained Optimization. Acta Mathematicae Applicatae Sinica, 33, 640-651. http://d.g.wanfangdata.com.cn/Periodical_yysxxb201004008.aspx
[11] Jing, S.J. and Zhao, H.Y. (2014) Conjugate Gradient Projection Method of Constrained Optimization Problems with Wolfe Stepsize Rule. Journal of Mathematics, 34, 1193-1199. http://qikan.cqvip.com/article/detail.aspx?id=662962703