Hybrid Support Vector Regression with Parallel Co-Evolution Algorithm Based on GA and PSO for Forecasting Monthly Rainfall

Jiansheng Wu, Yongsheng Xie

Department of Mathematics and Computer, Guangxi Science & Technology Normal University, Laibin, Guangxi, China.

**DOI:** 10.4236/jsea.2019.1212032


Accurate and timely monthly rainfall forecasting is a major challenge for the scientific community in hydrological research, for example in river management projects and the design of flood warning systems. Support Vector Regression (SVR) is a useful precipitation prediction model. In this paper, a novel parallel co-evolution algorithm that hybridizes a Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), namely SVRGAPSO, is presented to determine appropriate SVR parameters for monthly rainfall prediction. The framework iterates a GA population and a PSO population simultaneously, with a mechanism for information exchange between the two populations that helps the search escape premature convergence to local optima. The proposed technique is applied to rainfall forecasting to test its generalization capability and to make comparative evaluations against several competing techniques, namely SVRPSO (SVR with PSO), SVRGA (SVR with GA), and a pure SVR model. The empirical results indicate that SVRGAPSO has superior generalization capability, with the lowest prediction error values in rainfall forecasting, and can significantly improve forecasting accuracy. The SVRGAPSO model is therefore a promising alternative for rainfall forecasting.

Keywords

Genetic Algorithm, Particle Swarm Optimization, Rainfall Forecasting, Parallel Co-Evolution

Share and Cite:

Wu, J. and Xie, Y. (2019) Hybrid Support Vector Regression with Parallel Co-Evolution Algorithm Based on GA and PSO for Forecasting Monthly Rainfall. *Journal of Software Engineering and Applications*, **12**, 524-539. doi: 10.4236/jsea.2019.1212032.

1. Introduction

Monthly rainfall time series exhibit non-stationary characteristics; they can be described as time series whose statistical distributions change over time. Structural changes in monthly rainfall may be caused by various atmospheric physical processes, such as the temperature field, pressure field and sea temperature field. Accurate and timely monthly rainfall forecasting is therefore one of the most difficult tasks in the hydrological cycle, for both water quantity and quality management [1] [2] [3]. Several recent studies have developed monthly rainfall forecasts based on atmospheric physics models; however, quantitative rainfall forecasting remains extremely difficult because it involves many nonlinear variables that are interconnected in very complicated ways, and because of the volume of computation involved [4] [5] [6]. The support vector machine (SVM), developed by Vapnik and his colleagues, is an important machine learning tool based on statistical learning theory that uses the principle of structural risk minimization. With the introduction of Vapnik’s insensitive loss function, the regression variant of the SVM, called support vector regression (SVR), has also received increasing attention for solving nonlinear estimation problems [7] [8]. Because SVR is a specific type of learning algorithm, characterized by capacity control of the decision function, the use of kernel functions and the sparsity of the solution, it has been used for regression estimation, including monthly rainfall forecasting. These unique characteristics make SVR a promising alternative to traditional regression estimation approaches.

Although SVRs are a comparatively recent technique for machine learning problems, the literature on them is vast and growing. When using SVR for regression estimation, many important research questions remain, such as how to choose the optimal SVR parameters. Appropriate kernel parameters improve the accuracy of the SVR estimate, whereas inappropriate parameters lead to over-fitting or under-fitting in practical precipitation prediction. SVR hyper-parameters are often obtained through trial and error, so the quality of an SVR application depends strongly on the operator’s experience [7] [8] [9]. If the user is not careful, model over-fitting occurs easily: such a model may predict past events well but fail on future events [10] [11]. Most studies depend on a cross-validation set to tune the kernel parameters. It is therefore worthwhile to develop a parameter selection method that makes SVR less dependent on the skill of the experimenter.

Recently, several studies have optimized the parameters of the Gaussian kernel function by evolutionary optimization, such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) [7] [8] [9] [10] [11], and have achieved good results. In this paper, a novel co-evolution algorithm is presented as an efficient training algorithm for the SVR kernel parameters, combining the standard GA with the PSO algorithm in order to overcome their individual shortcomings: both easily fall into local optima and optimize slowly. We use “GA-PSO” co-evolution, iterating the GA and PSO populations together so that the two populations co-evolve and can search out high-quality solutions in a high-dimensional space. The investigation presented in this paper is motivated by a desire to improve on the inefficiencies of the search algorithms mentioned above when determining the parameters of the SVR model, by means of parallel co-evolution based on GA and PSO for monthly rainfall forecasting.

The present study proposes a novel parallel co-evolution algorithm combining GA with PSO to optimize the SVR parameters, namely SVRGAPSO, based on a mechanism of information interaction between the GA and PSO populations as they iterate. Our approach determines the optimal kernel parameter values for the SVR model in monthly rainfall forecasting. The rainfall data of Nanning, Guangxi, China, are predicted as a case study, and this actual forecasting case illustrates the improvement in predictive accuracy and generalization capability achieved by the proposed SVRGAPSO model. Among the many monthly rainfall forecasting models applying different approaches, the SVRGAPSO model achieves better generalization performance than other regression estimation approaches. The rest of this study is organized as follows. Section 2 describes the SVRGAPSO methodology, its ideas and procedures. For further illustration, different models are employed for rainfall forecasting analysis in Section 3, and conclusions are drawn in the final section.

2. SVRGAPSO Methodology

2.1. Support Vector Regression

The brief ideas of SVR for the case of regression are introduced here. Suppose we are given training data ${\left({x}_{i},{y}_{i}\right)}_{i=1}^{N}$, where ${x}_{i}$ is the input vector, ${y}_{i}$ is the output value and N is the total number of data points [12]. The linear regression function is formulated as follows:

$f\left(x\right)={\omega}^{\text{T}}\varnothing \left(x\right)+b$ (1)

where x denotes the input vector; $\varnothing \left(x\right)$ denotes the mapping into a high-dimensional feature space, non-linearly mapped from the input space; $\omega $ is the weight vector and b the bias, both adjustable. The coefficients $\omega $ and b can be estimated by minimizing the regularized risk function:

${\mathrm{min}}_{\omega ,b,{\xi}^{*},\xi}{R}_{\epsilon}\left(\omega ,{\xi}^{*},\xi \right)=\frac{1}{2}{\omega}^{\text{T}}\omega +C{\displaystyle {\sum}_{i=1}^{N}\left({\xi}_{i}^{*}+{\xi}_{i}\right)}$ (2)

Subject to

$\{\begin{array}{l}{y}_{i}-\omega \cdot \varnothing \left({x}_{i}\right)-b\le \epsilon +{\xi}_{i}^{*}\\ -{y}_{i}+\omega \cdot \varnothing \left({x}_{i}\right)+b\le \epsilon +{\xi}_{i}\\ {\xi}_{i}^{*}\ge 0\\ {\xi}_{i}\ge 0\\ i=1,2,3,\cdots ,N\end{array}$ (3)

Therefore, the objective of SVR is to include the training patterns inside an $\epsilon $ -insensitive tube while keeping the norm $\Vert \omega \Vert $ as small as possible. The parameter $\epsilon $ defines the tolerated difference between actual values and values calculated from the regression function; this tolerance can be viewed as a tube around the regression function. C is a cost parameter on the empirical risk, determining the trade-off between the empirical risk and the model flatness. After the quadratic optimization problem with inequality constraints is solved, the SVR is given by:

$f\left(x,{\alpha}_{i},{\alpha}_{i}^{*}\right)={\displaystyle {\sum}_{i=1}^{N}\left({\alpha}_{i}-{\alpha}_{i}^{*}\right)K\left(x,{x}_{i}\right)}+b$ (4)

where ${\alpha}_{i}$ and ${\alpha}_{i}^{*}$ are the Lagrangian multipliers associated with the constraints and $K\left(x,{x}_{i}\right)$ is the kernel function. As the kernel function defines the feature space in which the decision function is constructed, exploring useful kernel functions constitutes a significant topic in SVR applications. The most commonly used kernel function is the Gaussian radial basis function (RBF) with parameter $\sigma $ :

$K\left({x}_{i},{x}_{j}\right)=\mathrm{exp}\left(-\frac{{\Vert {x}_{i}-{x}_{j}\Vert}^{2}}{2{\sigma}^{2}}\right)$ (5)

By using kernel functions, SVR can efficiently and effectively construct many types of nonlinear functions, computing the dot product in feature space for regression estimation. The Gaussian RBF kernel is not only easy to implement but also capable of non-linearly mapping the training data into an infinite-dimensional space, so it is well suited to nonlinear relationships. Therefore, the Gaussian RBF kernel function is adopted in this study.
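As a concrete illustration (our own sketch, not the authors' MATLAB code), an RBF-kernel SVR corresponding to Equations (1)-(5) can be fitted with scikit-learn, which parametrizes the kernel width as gamma = 1/(2σ²); the data below are synthetic placeholders.

```python
# Sketch: fitting an RBF-kernel SVR as in Equations (1)-(5).
# scikit-learn writes the Gaussian kernel as exp(-gamma*||x - x'||^2),
# so gamma = 1/(2*sigma^2) matches Equation (5). Synthetic data only.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=200)

C, eps, sigma = 10.0, 0.05, 0.5          # the three SVR hyper-parameters
model = SVR(kernel="rbf", C=C, epsilon=eps, gamma=1.0 / (2 * sigma**2))
model.fit(X, y)
n_sv = len(model.support_)               # support vectors retained by the fit
```

A larger eps would shrink n_sv, since more training points fall inside the insensitive tube.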

The SVR based on the radial basis kernel function has three parameters to be determined: C trades off model flatness against the degree of training error, ε is the width of the insensitive loss function, and σ is the bandwidth of the Gaussian kernel. For example, if C is too large (approaching infinity), the objective minimizes the empirical risk only. The parameter ε controls the width of the ε-insensitive zone and hence the number of support vectors (SVs) employed in the regression: a larger ε implies fewer SVs and thus a simpler regression function [8] [13] [14] [15]. It is well known that the forecasting accuracy of the SVR model depends on a good setting of the hyper-parameters C, ε and the kernel parameter σ, so the selection of all three is an important issue. These parameters are often difficult to determine directly because of conceptual constraints. Proper choices of the SVR kernel function type and its parameters improve the regression accuracy, whereas inappropriate parameters lead to over-fitting or under-fitting [14] [16].

As mentioned above, there is no structured method for efficiently setting the SVR parameters. Recently, authors have applied a series of search algorithms to test their potential and suitability for selecting the parameters of an SVR model. However, the evolutionary algorithms employed mostly lack knowledge-memory functions, which makes them time-consuming and prone to premature convergence to a local optimum when searching for suitable SVR parameters. Therefore, the GAPSO algorithm is used in the proposed SVR model to optimize the parameter selection.

2.2. Genetic Algorithm and Particle Swarm Optimization

The genetic algorithm is an adaptive optimization technique developed by Holland, based on natural evolution and survival of the fittest, that works on a population of individuals [17]. GA has been successfully applied to many optimization problems in science and engineering because of its versatility and robustness. However, GA has two major shortcomings, slow convergence and entrapment in local optima, which are mainly caused by a reduction of population diversity [18] [19].

PSO is a stochastic, population-based optimization algorithm introduced by James Kennedy and Russell C. Eberhart [20]. It has been used to solve real-time problems and has aroused researchers’ interest because of its flexibility and efficiency, gaining much attention and wide application in solving continuous nonlinear optimization problems. PSO has many advantages: easy exchange and storage of information among particles, a simple structure, quick convergence and easy implementation. However, the PSO algorithm depends greatly on its initial values, and swarm diversity drops rapidly as the number of iterations increases, which traps the swarm in a local optimum, i.e., premature convergence, and degrades the global search capacity. For high-dimensional multi-modal problems in particular, premature convergence appears easily [21] [22] [23].

2.3. Parallel Co-Evolution Algorithm

The concept of co-evolution was first proposed by Ehrlich and Raven, who discussed the evolution between plants and herbivorous insects [24] [25] [26]. Its core idea is that the interaction of populations is an indispensable condition for each other’s survival: in a long-term evolutionary process they are interdependent and coordinated, improving both individual and overall performance. A co-evolution algorithm uses multiple populations instead of the traditional single-population search for optimal solutions, which can avoid the defects of dimension-reduction methods, local optima and premature convergence. In this paper, a novel parallel co-evolution scheme is presented for the parameter optimization problem, combining GA with PSO and iterating the two together. With the co-evolution concept, the two populations evolve jointly in order to search out high-quality solutions in a high-dimensional space. To realize this idea, this paper designs an information exchange mechanism, named interoperability, that lets information pass between the two populations to achieve co-evolution.

2.4. The Developed SVRGAPSO Approach

In this paper, the real-valued chromosome of GAPSO handles the parameters directly, which saves much computation time. The chromosome comprises two parts: SVR parameters and kernel parameters. The real values $\left\{{C}_{i},{\epsilon}_{i}\right\}$ represent the penalty parameter and the width of the insensitive loss function, respectively, and $\left\{{\sigma}_{i}\right\}$ represents the kernel parameter. A fitness function assessing the performance of each individual must be designed before searching for the optimal values of the SVR parameters. The performance of a parameter set is measured by the mean absolute percentage error (MAPE) on the held-out subset. Averaging the MAPE over the N samples gives an estimate of the expected generalization error, given by Equation (6):

${F}_{fitness}\left({x}_{1},{x}_{2},\cdots ,{x}_{N}\right)=\frac{1}{\left[1+\frac{1}{N}{{\displaystyle \sum}}_{i=1}^{N}\frac{\left|{y}_{i}-{\stackrel{^}{y}}_{i}\right|}{{y}_{i}}\ast 100\%\right]}$ (6)
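Equation (6) maps a lower MAPE to a higher fitness and can be implemented directly; the following is a minimal sketch (variable names are ours).

```python
import numpy as np

def fitness(y_true, y_pred):
    """Fitness of a candidate parameter set {C, eps, sigma}: the reciprocal
    of 1 + MAPE, so a lower forecast error yields a higher fitness
    (Equation (6)). Assumes strictly positive observed values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mape = np.mean(np.abs(y_true - y_pred) / y_true) * 100.0
    return 1.0 / (1.0 + mape)
```

A perfect forecast gives a fitness of 1; increasing MAPE drives the fitness toward 0.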

where ${x}_{i}$ is a training sample, N is the number of training samples, ${y}_{i}$ is the actual value, and ${\stackrel{^}{y}}_{i}$ the predicted value. The optimal parameter setting is critical to the prediction performance of the SVR model. In this paper, a parallel co-evolutionary algorithm based on GA combined with PSO is employed to optimize the SVR parameters and the kernel parameter simultaneously, namely SVRGAPSO. Figure 1 illustrates the process of the SVRGAPSO algorithm for SVR optimization in rainfall modelling. The details of the proposed SVRGAPSO are as follows:

Step 1: Generate the initial populations. Two initial populations are randomly generated from the target data. POP_{1} and POP_{2} use the search strategies of PSO and GA, respectively. The two populations use the same coding rules, fitness function, population size and maximum number of generations. This paper uses real-valued coding, in which each element of an array of real numbers corresponds to one parameter and the element’s value is that parameter’s value.

Step 2: Initialize the two populations with the GA and PSO parameters: number of iterations, crossover probability, mutation probability, particle velocities and particle positions.

Step 3: Input the training data and calculate the fitness, determining G_{best} and P_{best} by a simple comparison of fitness values according to Equation (6). We compare the fitness of the global best individual G_{pso} in POP_{1} with that of the best individual G_{ga} in POP_{2}; the individual with the larger fitness replaces the best individual of the other population, as the basis for the next generation of evolution. The adjustment strategy for the crossover probability is shown in Equation (7):

Figure 1. Flowchart of the GAPSO optimization SVR for rainfall modelling.

${P}_{c}=\{\begin{array}{ll}\frac{{P}_{c\mathrm{max}}-{P}_{c\mathrm{min}}}{1+\mathrm{exp}\left(2\left({f}^{\prime}-\stackrel{\xaf}{f}\right)/\left({f}_{\mathrm{max}}-\stackrel{\xaf}{f}\right)\right)},&{f}^{\prime}\ge \stackrel{\xaf}{f}\\ {P}_{c\mathrm{max}},&{f}^{\prime}<\stackrel{\xaf}{f}\end{array}$ (7)

where ${P}_{c\mathrm{max}}=0.9$ and ${P}_{c\mathrm{min}}=0.3$ respectively denote the upper and lower limits of the crossover probability ${P}_{c}$, ${f}_{\mathrm{max}}$ is the maximum fitness value of individuals in the current population, $\stackrel{\xaf}{f}$ is the average fitness value of the current population, and ${f}^{\prime}$ is the larger fitness value of the two crossing individuals. In this paper, the mutation probability is related to the iteration number. The adjustment strategy for the mutation probability is shown in Equation (8):

${P}_{m}=\{\begin{array}{ll}{P}_{m\mathrm{min}}+\frac{t}{{T}_{\mathrm{max}}},&0\le \frac{t}{{T}_{\mathrm{max}}}\le {P}_{m\mathrm{max}}-{P}_{m\mathrm{min}}\\ \frac{{P}_{m\mathrm{max}}}{{P}_{m\mathrm{max}}-{P}_{m\mathrm{min}}-1}\cdot \frac{t}{{T}_{\mathrm{max}}}+\frac{{P}_{m\mathrm{max}}}{{P}_{m\mathrm{max}}-{P}_{m\mathrm{min}}+1},&{P}_{m\mathrm{max}}-{P}_{m\mathrm{min}}<\frac{t}{{T}_{\mathrm{max}}}\le 1\end{array}$ (8)

where ${P}_{m\mathrm{max}}=0.1$ and ${P}_{m\mathrm{min}}=0.001$ are the upper and lower limits of ${P}_{m}$, ${T}_{\mathrm{max}}$ is the maximum number of iterations, and $t$ is the current iteration. In this paper, the standard PSO algorithm is used for the optimization operations over the continuous search space; see [18] for details.
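The adaptive crossover rule of Equation (7) reads directly as code. The sketch below is ours (variable names are assumptions), and it adds a guard for the degenerate case where all individuals have equal fitness, which the paper does not discuss.

```python
import math

def crossover_prob(f_prime, f_bar, f_max, pc_max=0.9, pc_min=0.3):
    """Adaptive crossover probability of Equation (7): below-average pairs
    always cross over at pc_max; fitter pairs cross over less often."""
    if f_prime < f_bar:
        return pc_max
    if f_max == f_bar:          # degenerate population: all fitness equal
        return pc_min
    return (pc_max - pc_min) / (
        1.0 + math.exp(2.0 * (f_prime - f_bar) / (f_max - f_bar)))
```

At f' = f̄ this evaluates to (0.9 − 0.3)/2 = 0.3, i.e. pc_min, and it decreases further as f' approaches f_max.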

Step 4: Check the termination condition. If the number of iterations has reached the maximum, go to Step 5 and finish; otherwise continue with the next iteration.

Step 5: The velocities and positions of POP_{1} are updated according to PSO, and POP_{2} according to GA, producing the next generation. Once the termination condition is met, the best solution is output, giving the optimal parameter setting for the SVR model; the test samples are then input to evaluate the prediction performance of the SVR model.
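Steps 1-5 can be sketched end to end as follows. This is an illustrative toy of our own, not the authors' implementation: the expensive SVR fitness over (C, ε, σ) is replaced by a cheap surrogate with a known optimum, and all operator settings are nominal.

```python
# Toy sketch of the parallel GA-PSO co-evolution (Steps 1-5).
import numpy as np

rng = np.random.default_rng(1)
DIM, SIZE, GENS = 3, 20, 60        # 3 parameters standing in for (C, eps, sigma)
LO, HI = 0.0, 5.0

def fit(x):                        # surrogate fitness with a known peak at (1, 2, 3)
    return 1.0 / (1.0 + np.sum((x - np.array([1.0, 2.0, 3.0])) ** 2))

# Step 1-2: two populations over the same real-valued coding
pos = rng.uniform(LO, HI, (SIZE, DIM))   # POP1: PSO particle positions
vel = np.zeros((SIZE, DIM))
pbest = pos.copy()
pop = rng.uniform(LO, HI, (SIZE, DIM))   # POP2: GA chromosomes

for t in range(GENS):
    # POP1: standard PSO update (inertia 0.7, learning rates 2.0)
    f_pos = np.apply_along_axis(fit, 1, pos)
    f_pb = np.apply_along_axis(fit, 1, pbest)
    pbest[f_pos > f_pb] = pos[f_pos > f_pb]
    gbest = pbest[np.argmax(np.apply_along_axis(fit, 1, pbest))]
    r1, r2 = rng.random((SIZE, DIM)), rng.random((SIZE, DIM))
    vel = 0.7 * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LO, HI)

    # POP2: GA update (tournament selection, arithmetic crossover, mutation)
    f_pop = np.apply_along_axis(fit, 1, pop)
    parents = pop[[max(rng.integers(SIZE, size=2), key=lambda i: f_pop[i])
                   for _ in range(SIZE)]]
    alpha = rng.random((SIZE, DIM))
    children = alpha * parents + (1 - alpha) * parents[::-1]
    mutate = rng.random((SIZE, DIM)) < 0.05
    children[mutate] = rng.uniform(LO, HI, np.count_nonzero(mutate))
    pop = np.clip(children, LO, HI)

    # Step 3 interoperability: each population's best replaces the other's worst
    g_pso = pos[np.argmax(np.apply_along_axis(fit, 1, pos))].copy()
    g_ga = pop[np.argmax(np.apply_along_axis(fit, 1, pop))].copy()
    pop[np.argmin(np.apply_along_axis(fit, 1, pop))] = g_pso
    pos[np.argmin(np.apply_along_axis(fit, 1, pos))] = g_ga

best = max(np.vstack([pos, pop, pbest]), key=fit)   # should approach (1, 2, 3)
```

In the real model, fit(x) would train an SVR with parameters x on the training folds and return the reciprocal-MAPE fitness of Equation (6).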

3. Application and Experiments Analysis

The platform used to develop the SVRGAPSO approach is a PC with the following features: Intel Core i7-8550U 1.80 GHz CPU, 32.0 GB RAM, the Windows 10 operating system and the MATLAB R2019a development environment. In this paper, the GA and PSO parameters are set as follows: the number of iterations is 100, the population size is 40, the crossover probability is 0.80, the mutation probability is 0.05, the minimum inertia weight is 0.1, the maximum inertia weight is 0.9 and the learning rate is 2.0.

3.1. Empirical Data

Real ground-based monthly rainfall data from January 1952 to December 2017 were obtained from the Guangxi Meteorological Bureau for Nanning, Guangxi, China. The data set contains 792 data points: the training set contains 480 points (1952-1991) for modelling, the validation set contains 240 points (1992-2011) for model validation, and the remaining 72 points (2012-2017) are used to test the predictive performance of the model. Figure 2 shows the actual rainfall from January 1952 to December 2017, covering 66 years in Nanning, Guangxi, China. It can be seen from Figure 2 that precipitation in the flood season is relatively large; the annual flood season runs from April to September, with an average precipitation of over 300 mm.

3.2. Independent Variables of the Monthly Rainfall Model

Selecting the independent variables is very important for a rainfall forecasting model. This paper adopts the variable selection method most commonly used in operational meteorology to choose the predictive factors. First, the candidate forecasting factors are selected from the numerical forecast products based on the 96 h forecast field, which includes the 17 conventional meteorological elements and physical elements from the T213 numerical products of the China Meteorological Administration. The data cover latitudes from 15°N to 30°N and longitudes from 100°E to 120°E, with 1° × 1° resolution, giving 336 grid points in total. From these, 76 variables are obtained as the main forecasting factors.

Figure 2. The actual rainfall from January 1952 to December 2017 in Nanning, Guangxi, China.

This paper uses principal component analysis to reduce these factors to 12 variables, which serve as the SVR inputs. The observed rainfall is used as the real output.
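Under the stated setup, the dimensionality reduction might be sketched as follows (our sketch; the 76 candidate factors here are random placeholders, since the T213 data are not reproduced in the paper):

```python
# Sketch: reducing 76 candidate forecast factors to 12 SVR inputs with PCA.
# The factor matrix is a random stand-in for the real T213-derived predictors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
factors = rng.normal(size=(480, 76))      # 480 training months x 76 factors

X = StandardScaler().fit_transform(factors)   # standardize before PCA
inputs = PCA(n_components=12).fit_transform(X)  # shape (480, 12)
```

Standardizing first keeps factors with large physical units from dominating the principal components.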

3.3. Criteria for Evaluating Model Performance

This paper uses the following evaluation metrics to measure the performance of the proposed model: root mean square error (RMSE), mean absolute percentage error (MAPE) and coefficient of efficiency (CE), which can be found in many papers [8]. For comparison with the same 12 input variables, three other monthly rainfall forecasting models are also built: a pure SVR model, SVR with PSO-evolved parameters (SVRPSO), and SVR with GA-evolved parameters (SVRGA). The SVR rainfall forecasting model is built with the LIBSVM package of Chang and Lin [27], with the RBF kernel and all parameters set by the trial-and-error method. The parameters yielding the minimum testing and validation RMSE are taken as optimal: $\left(C=53.6910,\epsilon =0.06147,\sigma =0.2983\right)$.
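The three evaluation metrics can be computed as follows. CE is taken here to be the Nash-Sutcliffe coefficient of efficiency commonly used in hydrology; this is our assumption, since the paper only cites [8] for the definitions.

```python
import numpy as np

def rmse(y, yhat):
    """Root mean square error."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mape(y, yhat):
    """Mean absolute percentage error; assumes strictly positive observations."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(np.mean(np.abs(y - yhat) / y) * 100.0)

def ce(y, yhat):
    """Nash-Sutcliffe coefficient of efficiency: 1 is a perfect fit,
    0 means no better than predicting the observed mean."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2))
```

Lower RMSE and MAPE and higher CE indicate better forecasts, which is the ordering used in Table 1.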

For building the SVRPSO and SVRGA rainfall forecasting models, PSO is used to search for the optimal SVR parameter values as presented by Chen K., et al. [28], and GA is used as presented by Li W. M., et al. [15]. The monthly precipitation forecasting models are established by evolutionary selection of the optimal SVR parameters based on the best testing and validation result (minimum RMSE) with GA and PSO, respectively. These results are compared with those of the co-evolutionary SVRGAPSO to illustrate the performance of the different evolutionary algorithms.

3.4. Results Analysis

Figure 3 shows how the fitness value changes during the evolution stage of SVRGAPSO, SVRPSO and SVRGA as the number of evolutionary iterations increases. It can be seen from Figure 3 that the parallel co-evolutionary algorithm performs significantly better than GA and PSO in convergence speed and population stability. As the number of iterations increases, GA is caught in premature convergence and cannot escape after 50 iterations. Compared with GA, the individual quality of PSO is improved, but PSO also fails to jump out of the local optimum. The parallel co-evolutionary algorithm likewise shows premature convergence during the iterative process; however, there is a clear inflection point at 60 iterations, showing that GAPSO can avoid premature convergence, escape from local optima and gradually stabilize. These results show that the parallel co-evolutionary algorithm can guide individuals that have fallen into a local optimum away from the original local minima.

Figure 4 shows a graphical representation of the validation data results using

Figure 3. The tendency of fitness value in the evolution stage of SVRGAPSO, SVRPSO and SVRGA.

Figure 4. Comparison of actual and validation for rainfall models by different models.

four different models in Nanning, Guangxi from January 1992 to December 2011. Figure 5 shows a graphical representation of the testing results for precipitation using the four models in Nanning, Guangxi from January 2012 to December 2017. Table 1 shows the training, validation and testing performance of the four models for monthly rainfall with the same input factors. These results indicate that the deviations of the SVRGAPSO model between calculated and actual values are the smallest for the training, validation and testing data alike. The performance of the SVRGAPSO model is superior to that of the other three models.

From the graphs and tables, we can see that the SVRGAPSO algorithm enables the solution to jump out of local optima and reduces oscillation near the end of the search through information exchange between the GA and PSO populations. The forecasting results are very promising for the monthly rainfall under study, whether performance is measured by goodness of fit such as RMSE or by MAPE (refer to Table 1), showing the forecasting performance of the different models from different perspectives.

Clearly, RMSE is not the only criterion for measuring prediction accuracy: goodness of fit is only one important criterion, while MAPE measures the relative performance of a model for monthly rainfall forecasting. The training, validation and forecasting performance of the various models in terms of RMSE, MAPE and CE is reported in Table 1. As shown in Table 1, for the training data, the RMSE of SVR is 26.324, of SVRGA 22.587 and of SVRPSO 21.110, while the RMSE of SVRGAPSO reaches 3.755; we can see that

Figure 5. Comparison of actual and testing for rainfall models by different models.

Table 1. Training, validation and testing performance of the four models in terms of RMSE, MAPE and CE.

the RMSE of SVRGAPSO is the smallest of all the models. Focusing on the RMSE indicator on the testing samples, our proposed SVR based on the parallel co-evolutionary algorithm performs best in all cases, followed by the SVRPSO and SVR techniques; SVRGA is the worst from a general point of view.

Similarly, for the validation data, the RMSE of SVR is 28.677, of SVRGA 23.795 and of SVRPSO 21.580, while the RMSE of SVRGAPSO reaches 3.782, the smallest of all. On the testing samples we reach the same conclusion: the squared error sum of SVRGAPSO is the smallest of all the models. The main reason is that GA alone is trapped in a local optimum and cannot find the optimal SVR parameters.

Focusing on the MAPE indicator for the training, validation and testing data, the MAPE of the SVRGAPSO model is also lower than that of the SVR, SVRGA and SVRPSO models: the deviations of the SVRGAPSO model between observed and forecast values are the smallest. However, a low RMSE does not necessarily mean a high hit rate in predicting the direction of monthly rainfall movement, so the CE comparison is necessary. The CE indicator is in this sense more important than RMSE and MAPE, because CE reflects the trend of the model and is mainly used to judge whether the trend of the forecast agrees with the actual precipitation trend. The CE of the SVRGAPSO model is the largest of the models at all stages. These results show that the SVRGAPSO model has a higher correlation with the observed rainfall values, implying that it is able to capture the average change tendency of the monthly rainfall data. To summarize, the SVRGAPSO model is superior to the other three models in terms of RMSE, MAPE and CE for rainfall prediction under the same inputs.

The main reason is that GA and PSO alone easily fall into local optima and cannot evolve to the optimal parameters, and an SVR tuned by cross-validation is also prone to over-fitting, resulting in poor prediction. In the iterative process of GAPSO, the GA and PSO populations exchange information; the co-evolution algorithm is not only superior in solution quality but also has a significant advantage in its ability to jump out of local optima. We therefore obtain the global optimum for the SVR parameters with greater probability.

From the experiments presented in this paper we can draw the following conclusions. First, the SVRGAPSO monthly rainfall forecasting model is superior to the pure SVR model, the SVRGA model, and the SVRPSO model for the training, validation, and test cases of monthly rainfall in terms of RMSE, MAPE, and CE, as can be seen from Table 1. Second, the SVRGAPSO model likewise outperforms the other three models in terms of RMSE, as can be seen from Figures 3-5 and Table 1; the co-evolution algorithm based on GAPSO significantly improves the forecasting accuracy of the SVR, in other words, the SVRGAPSO forecasting model performs better than the other forecasting models in terms of RMSE, MAPE, and CE. This leads to the third conclusion: in view of the empirical results, the co-evolution algorithm can be used as an alternative tool to reach the global optimum for the SVR parameters, achieving greater forecasting accuracy and further improving prediction quality.

4. Conclusion

The rainfall system is one of the most active dynamic weather systems. This paper presents a parallel co-evolution algorithm in which GA and PSO exchange information between their two populations during the evolutionary iterations, in order to tune the parameters of an SVR for rainfall forecasting. In terms of empirical results across different models, evaluation criteria, and test cases of monthly rainfall, the proposed SVRGAPSO forecasting technique performs best. In all testing cases, the RMSE of the proposed technique is the lowest and its CE is the highest, indicating that SVRGAPSO can be used as a viable solution to monthly rainfall time series forecasting.

Acknowledgements

The authors would like to express their sincere thanks to the editor and the anonymous reviewers for their comments and suggestions toward the improvement of this paper. This work was supported in part by the Natural Science Foundation of China under Grants No. 41575051 and 41565005; by the Science and Technology Foundation of Guangxi Province under Grants No. AD16450003 and 2018AB14003; by the Guangxi Education Department under Grants 2019KY0863, 2017KY0896, and KY2016YB554; and by the Key Disciplines for Operational Research and Cybernetics of the Education Department of Guangxi Province.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.



Copyright © 2023 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.