The optimization of discrete problems is widely encountered in engineering and information domains. Solving these problems with a continuous-variable approach and then converting the continuous variables to discrete ones does not guarantee the global optimal solution. Evolutionary Algorithms (EAs) have been applied successfully to combinatorial discrete optimization. Here, the mathematical basics of the real-coding Genetic Algorithm are presented alongside those of three other Evolutionary Algorithms: Particle Swarm Optimization (PSO), Ant Colony Optimization Algorithm (ACOA) and Harmony Search (HS). The EAs are presented in notation as unified as possible in order to facilitate understanding and comparison. Our combinatorial discrete example is the famous benchmark case of the New York Water Supply System (WSS) network. The mathematical construction and the results obtained by the authors' real-coding GA applied to this case study are compared with those of the three other algorithms available in the literature. The real representation of the GA, with its two operators, mutation and crossover, functions significantly faster than binary and other codings and illustrates its potential as a substitute for traditional optimization methods in water system design and planning. The real (actual) representation is very effective and provides two near-optimal feasible solutions to the New York tunnels problem. We found that all four EAs are capable of providing hydraulically feasible solutions at reasonable cost, but our real-coding GA takes more evaluations to reach the optimal or near-optimal solutions than the other EAs, namely HS. The HS approach explores the search space efficiently because of the random generation of solutions in every iteration and the ability to choose neighboring values of solution elements (changing a pipe diameter to the next greater or smaller commercial diameter), besides keeping good current solutions.
A promising direction we propose for improving the performance of the GA is to introduce completely new individuals in every generation using a new "immigration" operator, besides "mutation" and "crossover".
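A hypothetical sketch of such an immigration operator is shown below; the function name, the population layout (a list of chromosomes, each a list of commercial diameters) and the ranking-by-cost scheme are our assumptions, not a specification from the paper.

```python
import random

def immigrate(population, cost, n_immigrants, diameters, n_pipes, rng=None):
    """Sketch of the proposed 'immigration' operator: keep the best
    individuals and replace the worst n_immigrants with completely new
    random chromosomes (hypothetical implementation)."""
    rng = rng or random.Random()
    ranked = sorted(population, key=cost)           # best (lowest cost) first
    survivors = ranked[:len(population) - n_immigrants]
    immigrants = [[rng.choice(diameters) for _ in range(n_pipes)]
                  for _ in range(n_immigrants)]
    return survivors + immigrants
```

Applied once per generation, the operator injects fresh genetic material while the sort guarantees that the current best solutions always survive.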

Global optimization is the task of finding the absolutely best set of admissible conditions to achieve an objective under given constraints, assuming that both are formulated in mathematical terms [

We present in this paper the WSS optimization as an example of combinatorial discrete optimization problems. The optimization algorithms used to solve this type of problem are divided into two categories: the classical algorithms and the Evolutionary Algorithms (EAs), which are more popular. We highlight the main differences between the two categories, and then we concentrate on the latter. We demonstrate the mathematical basics of four EAs that have been used successfully to solve WSS optimization problems: Genetic Algorithms (GAs), Particle Swarm Optimization (PSO), the Ant Colony Optimization Algorithm (ACOA) and Harmony Search (HS). We adopt the real-coding GA, as it is one of the fundamental evolutionary algorithms and the most widely used nowadays. We compare the real GA, in terms of mathematical construction, results and efficiency, with the three more recent algorithms, taking the New York City WSS as a case study.

The complexity of water supply systems comes from multiple pumps, valves and reservoirs, head losses, large variations in pressure values, several demand loads, etc. In such systems, finding the global optimal solution is a difficult task using conventional trial-and-error methods, while the incomplete-mathematics methods, namely the innovative optimization algorithms, are more widely explored for the optimization of water supply systems [

Design optimization problems in WSS consist of searching for the system characteristics which minimize the total system cost without affecting the proper operation of the hydraulic system and the supply to consumers. This means that the system must be both economic and reliable. However, an increase in reliability can imply higher costs [

In most works dealing with optimization design of WSS, some benchmark networks have been constantly used for the comparison between distinct developed methodologies, from which we mention the New York City Tunnels firstly presented by Schaake and Lai 1969 [

As a common practice, due to the nature of WSS, one uses an objective function which only takes into account the pipeline costs and penalty costs, if not complying with minimal values for the pressure at each node of the network and the objective function can be written as follows [

where i is the pipe number (1 ≤ i ≤ S, S being the number of pipes, i.e. the number of variables) and L_{i} is the length of pipe i. The penalty Pe only applies when the pressure at any node is less than a predetermined minimal value. In order to retain feasible solutions only, a hydraulic simulation is required to check the pressure constraints and to obtain the flow velocity in each pipe; if the resulting velocity in a pipe exceeds the determined threshold velocity, that pipe's diameter is increased to the next larger commercial size. Hydraulic simulators are successfully coupled with the optimization algorithms to evaluate each solution.
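The structure of this objective can be sketched as follows. The data shapes and the `unit_cost` table (cost per unit length for each commercial diameter) are our assumptions; in practice the nodal pressures come from a hydraulic simulator run on the candidate design.

```python
def network_cost(diameters, lengths, unit_cost, node_pressures, p_min, pm):
    """Objective function in the form of Eq. (1), sketched: pipeline cost
    plus a penalty Pe that is active only when some nodal pressure falls
    below the minimum p_min."""
    pipe_cost = sum(unit_cost[d] * length
                    for d, length in zip(diameters, lengths))
    # Maximum pressure deficit over all nodes (zero when all constraints hold).
    deficit = max((p_min - p for p in node_pressures), default=0.0)
    return pipe_cost + pm * max(0.0, deficit)
```

With all nodal pressures above `p_min` the deficit term vanishes and only the pipeline cost remains, matching the description that Pe applies only to constraint violations.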

Global optimization algorithms can be distinguished according to two general classifications: 1) classical algorithms, based essentially on the computation of the objective function gradient and/or function evaluations, and 2) Evolutionary/Meta-heuristic algorithms, consisting essentially of exploratory search and generally based on phenomena that occur in nature, or even on artificial intelligence.

The classical algorithms used in Water Supply Systems as classified in [

On the other hand, there are the incomplete-mathematics methods represented by the Evolutionary/Meta-heuristic algorithms. These innovative techniques have shown great potential and good prospects for the solution of various optimization problems. Due to their exploratory nature, the probability of finding global optimal solutions with these advanced techniques is higher, and they offer the advantages of requiring no derivative calculations and not depending on the initial decision variables; their main disadvantage is the higher computational effort [

Likewise, evolutionary algorithms are usually the most used for solving multi-objective problems. They deal with a set of solutions during the search procedure, allowing a set of optimal solutions to be obtained in a single run, while the classical complete methods only lead to a single solution and cannot guarantee the generation of different points around the global optimal solution.

Evolutionary algorithms (EAs) have shown considerable success in solving global optimization problems; thus, they have attracted much attention in the past few years. Recently, a large number of EAs, such as genetic algorithms (GAs), particle swarm optimization (PSO), the Ant Colony Optimization Algorithm (ACOA) and Harmony Search (HS), have been proposed.

Herein, we state the most promising incomplete-mathematics algorithms used for solving global optimization problems such as those of water supply systems:

From the group of heuristic algorithms, it is usual to find works mostly applying the GA. The theory behind GAs was proposed by Holland (1975) [

There are many variations of GAs, but the following general description encompasses most of the important features. The analogy with nature requires the creation by computer of a set of solutions called a population. Each individual in a population is represented by a set of parameter values that completely defines a solution. These are encoded into chromosomes, which are originally sets of character strings, analogous to the chromosomes found in DNA. The GA search, sometimes with modification, has been proven to perform efficiently in a large number of applications. The selection of an appropriate chromosome representation of candidate solutions to the problem at hand is the foundation for the application of genetic algorithms to a specific problem. With a sufficiently large population of chromosomes, adequate representation will be achieved.

The typical size of the population for a GA can range from 30 to 1000. In addition, GAs use probabilistic (not deterministic) transition rules, and they also use objective function information, not derivatives or other auxiliary knowledge [

In the WSS optimization problem, every pipe diameter is considered as a gene that has as many options as there are available commercial diameters, while the whole network is considered as the chromosome, or "individual", that should "fit" the environment represented by the objective function. The survival chances for the possible solutions, or individuals, in the population are greater for those that propose the least cost (
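This encoding and the least-cost survival rule can be sketched as follows; the diameter set shown is illustrative, and the tournament scheme is one common way (an assumption on our part) of realizing cost-biased selection.

```python
import random

# Illustrative commercial diameter set (inches). The gene for each pipe holds
# one of these values; a chromosome is one complete network design.
DIAMETERS = [36, 48, 60, 72, 84, 96, 108, 120]

def random_chromosome(n_pipes, rng):
    """One individual: a diameter choice for every pipe of the network."""
    return [rng.choice(DIAMETERS) for _ in range(n_pipes)]

def tournament_select(population, cost, rng, k=2):
    """Survival chances are greater for least-cost designs: return the
    cheapest of k randomly drawn individuals (sketch)."""
    return min((rng.choice(population) for _ in range(k)), key=cost)
```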

Returning to the objective function in Equation (1), the penalty is applied to infeasible solutions, which fail to meet the minimum nodal pressure requirements. This pressure is 30 m above ground level in the case of the New York City WSS. However, the infeasible solutions are not removed from the population. Instead, they are allowed to join the population and help guide the search, but with the penalty term incorporated in the objective function. The penalty activation reduces the strength of an infeasible solution relative to the other solutions in the population, consequently degrading its fitness. The penalty is a function of the distance from feasibility: the bigger the difference between the nodal pressure and the required nodal pressure, the greater the penalty. The maximum violation of the pressure constraint is multiplied by the penalty multiplier Pm, which is a function of the generation number, allowing a gradual increase in the penalty term:

The penalty multiplier Pm should take a value that does not allow the best infeasible solution to be better than any feasible solution in the population. As we note from Equation (2), when the generation number approaches the maximum number of generations Gmax, the penalty value increases. The appropriate values selected for κ and δ in Equation (2) are 0.8 and 100,000 respectively.
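Equation (2) itself is not reproduced in this text; purely as an illustration, the following assumed functional form reproduces the stated behaviour (Pm grows monotonically as the generation g approaches Gmax, with κ = 0.8 and δ = 100,000). The exact expression should be taken from the paper's Eq. (2).

```python
def penalty_multiplier(g, g_max, kappa=0.8, delta=100_000):
    """Generation-dependent penalty multiplier Pm (assumed form, not the
    paper's exact Eq. (2)): grows monotonically toward delta as the
    generation number g approaches g_max."""
    return delta * (g / g_max) ** kappa
```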

The application of real GA on the New-York network, hereafter, gives more details about the method with comparison to other evolutionary algorithms.

PSO is a relatively new category of stochastic, population-based optimization algorithms that draw inspiration from the collective behavior and emergent intelligence that arise in socially organized populations. Within the PSO algorithm, every solution is a bird of the flock and is referred to as a particle: in this framework the birds, besides having individual intelligence, also develop some social behavior and coordinate their movement towards a destination [

For nodes with pressures larger than this minimal value, the associated individual penalties vanish, and one uses the usual Heaviside step function H in the explicit expression for Pe:

The factor p represents a fixed value that becomes effective when

Initially, the process of finding the optimal solution starts from a swarm of particles (with particle indices 1 ≤ n ≤ N; N = 100 for example), each of which contains a solution to be improved by iteration. The n-th particle is associated with a position in an S-dimensional space, where S is the number of variables involved in the problem (the number of pipes in the case of WSS). The values taken by each of the S variables (the diameter j of each pipe in the case of WSS) represent a possible solution of the optimization problem. Each particle n is completely determined by three vectors: its current position, in iteration t,

Current position

which reveals the diameter j of each pipe i, proposed by particle n in iteration t. So, in every iteration we have N particles and thus N solutions.

Best previous position

Flight velocity

The flight velocity depends on the position of the current “solution” comparing to the best solution of all the particles in the same iteration and the best previous solution as explained in Equation (7).

This algorithm simulates a flock of birds which communicate during flight. Each bird examines the search space from its current local position, and this process repeats until the bird possibly reaches the desired position. The birds learn through their own experience (local search) and the experience of their peers (global search).

In every iteration, one identifies the particle which has the best instantaneous solution to the problem; the position of this particle subsequently enters into the computation of the new position for other particles in the flock. This heuristic calculation is carried out according to:

where the primes denote new values for the variables, and the new velocity is given by:

Here, c_{1} and c_{2} are two positive constants called learning factors; c_{1} = 3, c_{2} = 2, in [

w is a factor of inertia suggested by Shi and Eberhart [

where t is the iteration number.

Finally, Y^{*} is the best present solution among all Y_{i}.

Once the current position is calculated, the particle directs itself towards a new position.

To control any change in the particle velocities, we introduce the respective upper and lower limits:

where Vsup = 50% of the total variable range and Vinf = −Vsup.

This algorithm can be considered the standard PSO algorithm, which is applicable to continuous systems and cannot be used directly for discrete problems. In a discrete problem, the initial position vectors are generated with integer values, so if the flight velocity vector components are integer values then the new position vectors will be integer-valued as well. According to this scheme, Equation (7) turns out to be:

where

Thus, the PSO algorithm, after some modifications, can be adapted to problems with discrete variables. The approaches presented in the works of Montalvo et al. 2008, 2010 [
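One discrete PSO step, combining the velocity update, the integer rounding just described, and the velocity limits of Equation (11), can be sketched as below. Positions are taken as integer indices into the commercial diameter set, which is one possible discrete representation (an assumption, not necessarily the exact scheme of the cited works).

```python
import random

def pso_step(pos, vel, pbest, gbest, w, c1, c2, v_max, n_options):
    """One discrete-PSO update in the spirit of Eqs. (7)-(11) (sketch):
    the new velocity mixes inertia, the personal best and the global best,
    is clamped to [-v_max, v_max] and rounded so that positions remain
    integer indices into the commercial diameter set."""
    new_pos, new_vel = [], []
    for x, v, pb, gb in zip(pos, vel, pbest, gbest):
        r1, r2 = random.random(), random.random()
        v_new = w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
        v_new = int(round(max(-v_max, min(v_max, v_new))))  # clamp + integer
        x_new = max(0, min(n_options - 1, x + v_new))       # stay in range
        new_pos.append(x_new)
        new_vel.append(v_new)
    return new_pos, new_vel
```

Here `v_max` plays the role of Vsup (50% of the variable range, i.e. 8 for a 16-diameter set) and `-v_max` that of Vinf.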

However, the choice of the PSO algorithm’s parameters (such as the group’s inertia and acceleration coefficients) seems to be very important for the speed, convergence and efficiency of the algorithm [

Ant colony optimization algorithms are inspired by the fact that ants are able to find the shortest route between their nest and a food source. Ants deposit pheromone trails whenever they travel. The path taken by individual ants from the nest in search for a food source is essentially random [

In order to apply ACOAs to WSS optimization problems, the combinatorial optimization problem under consideration must be capable of being mapped onto a graph

Once the problem has been defined in the terms set out above, the ACOA can be applied. The main steps in the algorithm are as follows:

1) Trial solutions are constructed incrementally as artificial ants move from one decision point to the next until all decision points have been covered.

2) The cost of the generated trial solution is calculated. The generation of one complete trial solution and the calculation of the corresponding cost is referred to as a cycle n. Every ant draws one cycle per iteration; this is why we keep the same notation, n for the individual's number and N for the total number of individuals, in both PSO and ACOA.

3) Pheromone is updated after the completion of one iteration t, which consists of N cycles, where N is the number of ants used. In other words, the construction of N trial solutions means that we have N ants that made N tours or cycles, each of which presents a solution.

where

When one iteration is completed, N trial solutions have been constructed, and the pheromone trails are updated in a way that reinforces good solutions. The general form of the pheromone update equation is as follows [

where

where n^{*} is the cycle number that results in the best solution during iteration t.

where R is the pheromone reward factor, P_{pher} is the pheromone penalty factor and ΔH_{max} is the maximum pressure deficit in the WSS, which is obtained using a hydraulic simulator for each trial solution. Equation (14) clarifies that the amount of pheromone added to each of the options chosen during a cycle is a function of the cost of the trial solution obtained: the better the trial solution, and hence the lower the cost, the larger the amount of pheromone added, and thus the more likely the concerned solution options will be chosen by other ants in subsequent iterations [
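The option choice and the evaporation-plus-reward update can be sketched in the spirit of Equations (12)-(14); the data layout (`tau[pipe][option]`) and the exact deposit rule `reward / cost` are simplifying assumptions.

```python
import random

def choose_option(tau_row, eta_row, alpha, beta, rng):
    """Pick a diameter option for one pipe with probability proportional to
    tau^alpha * eta^beta (pheromone intensity times heuristic desirability),
    as in the general ACOA decision rule (sketch)."""
    weights = [(t ** alpha) * (e ** beta) for t, e in zip(tau_row, eta_row)]
    return rng.choices(range(len(weights)), weights=weights)[0]

def update_pheromone(tau, best_options, rho, reward, best_cost):
    """Pheromone update sketch: every trail evaporates by factor rho, then
    the options used by the iteration-best solution receive a deposit
    proportional to reward / cost, so cheaper solutions deposit more."""
    new_tau = [[rho * t for t in row] for row in tau]
    for pipe, option in enumerate(best_options):
        new_tau[pipe][option] += reward / best_cost
    return new_tau
```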

The harmony search algorithm was conceptualized from the musical process of searching for a “perfect state” of harmony, such as jazz improvisation, just as the optimization algorithm seeks a best state (global optimum) determined by evaluating the objective function. The harmony quality is enhanced practice after practice, just as the solution quality is enhanced iteration by iteration [

As in other optimization processes in the water network design, the objective function is the pipe cost function; the pipe diameter is the decision variable; the number of decision variables is the number of pipes in the network; the set of decision variable values is the range of possible candidate diameters.

The steps in the procedure of harmony search are as follows [

Step 1. Initialize the problem and algorithm parameters. The HS algorithm parameters are also specified in this step. These are the harmony memory size (HMS), i.e. the number of solution vectors in the harmony memory, which is equivalent to the number of particles N in the previous algorithms; the harmony memory considering rate (HMCR); the pitch adjusting rate (PAR); and the number of improvisations (NI), or stopping criterion. The harmony memory (HM) is a memory location where all the solution vectors (sets of decision variables) and the corresponding objective function values are stored. The function values are used to evaluate the quality of the solution vectors. The HM is similar to the genetic pool in the GA. In the study of Geem 2006 [

Step 2. Initialize the harmony memory.

In Step 2, the HM matrix is filled with randomly generated solution vectors. Each solution undergoes a hydraulic analysis to verify that it satisfies the minimum pressure requirements at each node. As in the other evolutionary algorithms, the penalty in the objective function makes the infeasible solutions that violate the minimum requirements uneconomic, but they still have a chance to be included in the HM, in the hope of forcing the search towards the feasible solution area.

Step 3. Improvise a new harmony. A new harmony vector,

Every component obtained by the memory consideration is then examined to determine whether it should be pitch-adjusted. This operation uses the PAR parameter. If the pitch adjustment decision for

In this step, the application of HM consideration, pitch adjustment, or random selection to each variable of the new harmony vector in turn helps the algorithm find globally and locally improved solutions.
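The three improvisation rules of Step 3 can be sketched as below; the 50/50 choice between stepping to the next greater or smaller commercial size is an assumption about how the pitch adjustment is realized.

```python
import random

def improvise(harmony_memory, diameters, hmcr=0.9, par=0.1):
    """Improvise one new harmony (network design), per Step 3 (sketch):
    each pipe's diameter is taken from memory with probability HMCR, then
    pitch-adjusted to the next greater or smaller commercial size with
    probability PAR; otherwise it is drawn at random from the full set."""
    new = []
    for i in range(len(harmony_memory[0])):
        if random.random() < hmcr:                       # memory consideration
            d = random.choice(harmony_memory)[i]
            if random.random() < par:                    # pitch adjustment
                j = diameters.index(d)
                step = 1 if random.random() < 0.5 else -1
                d = diameters[max(0, min(len(diameters) - 1, j + step))]
        else:                                            # random selection
            d = random.choice(diameters)
        new.append(d)
    return new
```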

Step 4. Update the harmony memory. If the new harmony vector

harmony vector in the HM, judged in terms of the objective function value, and no identical harmony vector is stored in the HM, the new harmony is included in the HM and the existing worst harmony is excluded from the HM. For the water network optimization problem, the new solution vector also undergoes a hydraulic analysis, just the same as in Step 2.

Step 5. Steps 3 and 4 are repeated until the number of improvisations (NI) is reached.

However, other techniques have also been investigated in this context, such as: Tabu Search (TS), Simulated Annealing (SA) and Shuffled Complex Evolution (SCE).

Given an existing network like the New-York City network shown in

For each of these pipes, choose a diameter within the available diameter set [

Because of age and increased demands, the existing gravity-flow tunnels in New York City have been found to be inadequate to meet the pressure requirements at some nodes (nodes 16, 17, 18, 19 and 20 shown in the network figure). With 16 possible diameters for each of the 21 decision pipes, there are 16^{21} = 1.93 × 10^{25} possible network designs.

In order to solve this optimization problem with the real-coding GA, a real string made of 21 substrings (genes) is used to code the 21 decision variables (21 pipes, each of which can take one of 16 discrete diameters). The fitness of the 21-gene chromosome is computed through the objective (evaluation) function, which determines the cost of a solution by summing the costs of the pipes making up the network. Simulation of the network flows and pressure heads is then carried out with the EPANET computer program to assess the feasibility of solutions [
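The overall real-coded GA loop described above can be sketched as follows. This is a minimal, hypothetical implementation: tournament selection and one-point crossover are assumed operator choices, and the EPANET feasibility check with its penalty is abstracted into the `cost` callable.

```python
import random

def real_ga(cost, diameters, n_pipes=21, pop_size=100, generations=50,
            p_cross=0.8, p_mut=0.1, rng=None):
    """Minimal real-coded GA sketch for the tunnels problem: chromosomes of
    commercial diameters, tournament selection, one-point crossover (p=0.8)
    and random-resetting mutation (p=0.1)."""
    rng = rng or random.Random()
    pop = [[rng.choice(diameters) for _ in range(n_pipes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # Tournament selection: the cheaper of two random designs.
            p1 = min(rng.sample(pop, 2), key=cost)
            p2 = min(rng.sample(pop, 2), key=cost)
            c1, c2 = p1[:], p2[:]
            if rng.random() < p_cross:                   # one-point crossover
                cut = rng.randrange(1, n_pipes)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                       # mutation
                for i in range(n_pipes):
                    if rng.random() < p_mut:
                        child[i] = rng.choice(diameters)
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return min(pop, key=cost)
```

In the actual study each `cost` evaluation would run an EPANET simulation and add the Pm-scaled pressure-deficit penalty of Equations (1)-(2).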

Many optimization studies have addressed the New York City Tunnels problem in the literature. Based on the literature review by Coelho and Andrade-Campos [

| Method and ref. | Research space (N) | Parameters | Notations in the text | Iterations (t) |
|---|---|---|---|---|
| Real GA | 100 "individual" | Probability of crossover = 0.8, probability of mutation = 0.1, κ = 0.8, δ = 100,000 | | Number of generations allowed = 2500 |
| PSO [ | 100 "particle/bird" | c_{1} = 3, c_{2} = 2 | Equations (8) and (11) | Number of flights = 100 |
| ACOA [ | 100 "ant" | α = 3.5, β = 0.5, ρ = 0.95, P_{pher} = 0.1, R = 15,000,000 | Equations (12) and (15) | Number of iterations = 500 |
| HS [ | 50 "musical harmony" | HMCR = 0.9, PAR = 0.1 | | Number of improvisations = 20,000 |

GA: Genetic Algorithms; PSO: Particle Swarm Optimization; ACOA: Ant Colony Optimization Algorithm; HS: Harmony Search.

Diameters of parallel pipes, inches:

| Pipe number | ACOA [ | HS [ | Real GA 1 (Design III) | Real GA 2 (Design IV) |
|---|---|---|---|---|
| 1 | 0 | 0 | 0 | 0 |
| 2 | 0 | 0 | 0 | 0 |
| 3 | 0 | 0 | 0 | 0 |
| 4 | 0 | 0 | 0 | 0 |
| 5 | 0 | 0 | 0 | 0 |
| 6 | 0 | 0 | 0 | 0 |
| 7 | 144 | 96 | 0 | 0 |
| 8 | 0 | 0 | 0 | 0 |
| 9 | 0 | 0 | 0 | 0 |
| 10 | 0 | 0 | 0 | 0 |
| 11 | 0 | 0 | 0 | 0 |
| 12 | 0 | 0 | 0 | 0 |
| 13 | 0 | 0 | 0 | 0 |
| 14 | 0 | 0 | 0 | 0 |
| 15 | 0 | 0 | 120 | 96 |
| 16 | 96 | 96 | 84 | 96 |
| 17 | 96 | 96 | 96 | 96 |
| 18 | 84 | 84 | 96 | 84 |
| 19 | 72 | 72 | 84 | 72 |
| 20 | 0 | 0 | 72 | 0 |
| 21 | 72 | 72 | 72 | 72 |
| Cost ($M) | 38.64 | 36.66 | 38.8 | 38.5 |
| Diameter design | Discrete | Discrete | Discrete | Discrete |
| Number of evaluations | 13,928 | 6000 | 45,904 | 116,805 |

Head difference from minimum (ft):

| Critical node | Minimum Allowable Head (ft) | ACOA | HS | Real GA1 | Real GA2 |
|---|---|---|---|---|---|
| 16 | 260.00 | 0.08 | 0.15 | 0.59 | 0.02 |
| 17 | 272.80 | 0.07 | 0.00 | 0.11 | 0.00 |
| 19 | 255.00 | 0.05 | 0.22 | 0.78 | 0.17 |
| Used Hydraulic Simulator | | WADISO/EPANET | EPANET | EPANET | |
| Feasibility | | F | F | F | F |

There are other evolutionary algorithms in the available literature that have proposed the same cost as Design I (38.64 M$): PSO [

Regarding the four designs proposed by the evolutionary algorithms, the real GA affords two feasible designs (the first after 45,904 evaluations and the second after 116,805 evaluations) with costs comparable to those of Designs I and II. The availability of alternatives with nearly equivalent costs is a big advantage in practice, because it enables engineers to take other terrain factors into account when making a choice on the adopted design.

Comparing the performance of the three algorithms in

Evolutionary algorithms (EAs) have shown considerable success in solving global optimization of discrete problems and have proved to be a valuable tool in the search for alternatives, and not only a unique solution.

The real-coding Genetic Algorithm (GA) is based on the phenomenon of natural selection: the possible solutions are seen as individuals ("chromosomes") and are evaluated based on their fit to the objective function. The conventional GA uses two main operators (mutation and crossover), applied to the initially generated population and the subsequent generations. The real GA approach has considerable flexibility in application to nonlinear problems and to complex systems. In addition, a real GA will generate several near-optimal solutions.

The second presented evolutionary algorithm is Particle Swarm Optimization (PSO). Within the PSO algorithm, every solution is a bird of the flock, a "particle". The birds develop some social behavior and coordinate their movement towards a destination defined by the objective function. The flight velocity depends on two terms with random portions: the position of the current solution compared to the best solution of all the particles in the same iteration, and the best previous solution.

The third presented evolutionary algorithm is the Ant Colony Optimization Algorithm (ACOA). This algorithm is inspired by the fact that ants are able to find the shortest route between their nest and a food source. Ants have a tendency to choose paths that other ants have crossed; they recognize these paths by the intensity of the pheromone deposited by the traveling ants. In the ACOA, the pheromone concentration is an auxiliary parameter used to help the artificial ants make their choices.

The fourth presented evolutionary algorithm is Harmony Search (HS). HS is conceptualized from the musical process of searching for a "perfect state" of harmony, such as jazz improvisation. The harmony memory (HM) is first filled with solution vectors. A new harmony is then improvised based on three rules: 1) memory consideration, 2) pitch adjustment, and 3) random selection.

Considerable work has been done in this paper to unify the symbols and notation in the mathematical description of the four mentioned EAs, in order to make them comparable and easy to understand.

The New York City tunnels network is taken as an example to compare the real GA with PSO, ACOA and HS, whose results are stated in the literature. First, the comparison made while preparing each algorithm for application to a specific case study reveals that the ACOA needs the largest number of parameters to be calibrated (5 parameters), compared with 4 parameters in the GA and 2 parameters each in PSO and HS. Four distinguished designs are offered by the EAs. However, the detailed results of PSO are not included in the published work of Montalvo et al. 2008 [

The superiority of Harmony search in results and performance is probably due to the competent search process illustrated in

Finally, we conclude that the four proposed EAs were able to find four near-optimal designs, with close costs, for the same discrete optimization problem. The existence of these alternatives is a major advantage in practice, where other factors can be taken into account. The time needed by each EA to reach the optimal or near-optimal global solution varies considerably depending on its mathematical construction and the choice of the parameters involved. The development of EAs nowadays is moving towards improving their performance and calculation time, since their robustness has already been proven.

Awad, A.R. and Chiban, S.O. (2015) Recent Advances in Global Optimization for Combinatorial Discrete Problems. Applied Mathematics, 6, 1842-1856. doi: 10.4236/am.2015.611162