On Optimal Non-Overlapping Segmentation and Solutions of Three-Dimensional Linear Programming Problems through the Super Convergent Line Series

Solutions of Linear Programming Problems were obtained by segmenting the cuboidal response surface through the Super Convergent Line Series methodologies. The cuboidal response surface was segmented into up to four segments and explored. It was verified that the number of segments, S, for which optimal solutions are obtained is two (S = 2). Illustrative examples and a real-life problem were also given and solved.


Introduction
Linear Programming (LP) problems belong to a class of constrained convex optimization problems which have been widely discussed by several authors: see [1] [2] [3]. The commonly used algorithms for solving Linear Programming problems are the Simplex method, which requires the use of artificial variables and surplus or slack variables, and the active set method, which requires the use of artificial constraints and variables. Over the years, a variety of line search algorithms have been employed in locating the local optimizer of response surface methodology (RSM) problems: see [4] and [5]. Similarly, the active set and simplex methods available for solving linear programming problems also belong to the class of line search exchange algorithms.

T. Ugbe, P. Chigbu
The line search algorithm built around the concept of super convergence has several points of departure from the classical, gradient-based line series. These gradient-based line series often fail to converge to the optimum, but the Super Convergent Line Series (SCLS), which are also gradient-based techniques, locate the global optimum of response surfaces with certainty. The Super Convergent Line Series (SCLS) was introduced by [6], and later used by [7] and [8]. [9] modified the Super Convergent Line Series (SCLS) and used it to solve Linear Programming Problems; [10] applied the Quick Convergent Inflow Algorithm to solve Constrained Linear Programming Problems on a segmented region; and [11] modified the "Quick Convergent Inflow Algorithm" and used it to solve Linear Programming Problems based on the variance of predicted response. In [12], it was verified and established that, for non-overlapping segmentation of the response surface, the best number of segments is two (S = 2) for Linear Programming Problems, four (S = 4) for Quadratic Programming Problems, and eight (S = 8) for Cubic Programming Problems. The above algorithms compared favourably with other line search algorithms that utilize the principles of experimental design.
Other recent studies on line search algorithms for optimization problems include [13], in which a modified version of line search for global optimization was proposed. The line search there uses a technique for the determination of randomly generated values for the direction and step-length of the search. Some numerical experiments were performed using popular optimization functions involving fifty dimensions; comparisons with standard line search, genetic algorithms and differential evolution were performed. Empirical results illustrate that the modified line search algorithm performs better than the standard line search and the other techniques for three or four of the test functions considered. [14] focused on line search algorithms for solving large-scale unconstrained optimization problems, such as quasi-Newton methods, truncated Newton and conjugate gradient methods. [15] proposed a line search algorithm based on the Majorize-Minimize principle; there, a tangent majorant function is built to approximate a scalar criterion containing a barrier function, which leads to a simple line search ensuring the convergence of several classical descent optimization strategies, including the most classical variants of non-linear conjugate gradient. [16] presented the fundamental ideas, concepts and theorems of a basic line search algorithm for solving linear programming problems, which can be regarded as an extension of the Simplex method; the basic line search algorithm can be used to find an optimal solution with only one iteration. [17] presented the performance of a one-dimensional search algorithm for solving general high-dimensional optimization problems, which uses line search algorithms as subroutines.
In all the aforementioned works, none has gone beyond solving problems in two-dimensional spaces with segmentation. This paper focuses on obtaining optimal solutions of Linear Programming Problems, with segmentation, in the three-dimensional space of a cuboidal region.

Three Dimensional Non-Overlapping Segmentation of the Response Surface
The space, X̄ (the shape of a cube), is partitioned into subspaces called segments. These segments are non-overlapping with common boundaries. The space, X̄, is partitioned into S non-overlapping segments as follows: in Figure 1(a), the cube (experimental space) is partitioned into two segments, S1 and S2, while Figure 1(b) and Figure 1(c) show the three- and four-segment partitions. The number of support points in the kth segment, N_k, as given by [6], must be at least n + 1, where n is the number of variables in the model, so that each segment's information matrix is non-singular. The support points per segment are arbitrarily chosen, provided they satisfy the constraint equations and do not lie outside the feasible region.
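As an illustrative sketch (the function below is not from the paper), slicing the unit cube into S equal, non-overlapping slabs along one axis can be expressed as:

```python
def segment_of(point, s=2):
    """1-based index of the segment containing `point` when the unit cube
    is split into s equal, non-overlapping slabs along the x1-axis."""
    x1 = point[0]
    k = min(int(x1 * s), s - 1)   # clamp x1 == 1.0 into the last slab
    return k + 1

# With S = 2 the plane x1 = 0.5 separates the two segments:
print(segment_of((0.2, 0.9, 0.4)))   # segment 1
print(segment_of((0.7, 0.1, 0.3)))   # segment 2
```

Points on the common boundary x1 = 0.5 are assigned to the second slab here; any consistent tie-breaking rule would serve, since the segments share boundaries but do not overlap in interior.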

Rationale of the Segmentation
Design matrices are formed from the support points obtained from each of the segments created above. The segmentation of the response surface, according to [6], is a rapid way of improving the average information matrix and obtaining the optimum direction vector. This is achieved by taking a linear combination of the information matrices from the different segments. The improved average information matrix (the resultant matrix) is used to compute the optimum direction vector, which locates the optimum direction and the optimizer in a very short time or with one iteration. Without segmentation, the information leading to the optimizer would be obtained from only a fraction of the entire response surface.
With segmentation, more support points are available at the boundary of the feasible region. [18] [19] [20] have shown that a design formed with support points taken at the boundary of the feasible region is better than any design with support points taken in the interior of the feasible region.
Theorem: The average information matrix resulting from pooling the segments using the matrices of coefficients of convex combination is M(ζ_N) = Σ_{k=1}^{S} H_k M_k(ζ).

Definitions and Preliminaries
The Super Convergent Line Series (SCLS) is defined by [6] through the line search equation X = X* − ρ*d*, where X is the vector of the optimal values and X* is the vector of the optimal starting points. The direction vector d* is obtained using the rows of M^{-1}(ζ_N), the inverse of the average information matrix. ρ is the step-length, defined as ρ = (b_i − C_i^T X)/(C_i^T d), where d is the direction vector, C_i^T is the vector which represents the parameters of the ith linear inequality, X is the starting point and b_i is a scalar of the linear inequalities. ζ_N is an N-point design measure whose support points may or may not have equal weights. Support points are points marked on the boundary and in the interior of the partitioned space which are picked to form design matrices. X̄ is the experimental space of the response surface, which can be partitioned into segments such that every pair of support points in a segment is a subset of that segment; |M(ζ)| is the determinant of the information matrix. H_i is the matrix of the coefficients of convex combination, defined for each segment in terms of the variances of the inverse information matrices, where V_111, V_122, V_133 are the variances of the inverse information matrix of segment 1 and V_211, V_222, V_233 are the variances of the inverse information matrix of segment 2, respectively.
The average information matrix, M(ζ_N), is the sum of the products of the k information matrices and the k matrices of the coefficients of convex combinations, thus M(ζ_N) = Σ_{k=1}^{S} H_k M_k(ζ). Segmentation is the partitioning of the experimental space, X̄, into segments.
Segmentation can be non-overlapping or overlapping, and support points are selected from each segment to form design matrices.
An unbiased response function is defined by the first-order model f(x) = a_0 + a_1x_1 + a_2x_2 + a_3x_3 in the three variables.

Algorithm for Super Convergent Line Series
The algorithm follows this sequence of steps: 1) Partition the experimental space (the cube) into S segments and select N_k support points from the kth segment, hence making up an N-point design, ζ_N. 2) Form the design matrix of each segment and compute the segment information matrices M_k(ζ) and the matrices of coefficients of convex combination, H_k; hence obtain the average information matrix, M(ζ_N). 3) Compute the normalized direction vector, d*, and the optimal starting point, X*. 4) Compute the optimal step-length, ρ*, and hence the point X = X* − ρ*d*; test whether the optimal solution has been reached. Yes: stop; No: go back to 1) above until the optimal solution is obtained.
5) Identify the segment in which the optimal solution is obtained.
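The loop structure of the steps above can be outlined numerically. The sketch below is illustrative, not the paper's computation: for a linear objective f(x) = c^T x the direction derived from the pooled information matrix is parallel to c, so c is used directly here, and d* is oriented uphill so the move is X* + ρ*d* (the paper writes X* − ρ*d*; the sign depends only on how d* is oriented). The data are placeholders.

```python
import numpy as np

# Illustrative outline of one SCLS-style move on the unit cube [0,1]^3
# with constraints C x <= b (placeholder data, not from the paper).
c = np.array([3.0, 5.0, 4.0])                 # linear objective coefficients
C = np.vstack([np.eye(3), -np.eye(3)])        # x_i <= 1 and -x_i <= 0
b = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])

x = np.array([0.1, 0.1, 0.1])                 # steps 1-3: starting point X*
d = c / np.linalg.norm(c)                     # normalized direction, d*^T d* = 1

# Step 4: step-length to the nearest constraint boundary along d.
den = C @ d
rho = np.min((b - C @ x)[den > 0] / den[den > 0])
x_new = x + rho * d                           # the updated point on the boundary
print(np.round(x_new, 3))
```

The update lands exactly on the first face the line meets (here x2 = 1), after which the optimality test of step 4) decides whether another pass is needed.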

The Average Information Matrix, the Direction Vector, the Starting Point and the Step-Length

The Average Information Matrix
The average information matrix, M(ζ_N), is the sum of the products of the k information matrices and the k matrices of the coefficients of convex combinations, given by M(ζ_N) = Σ_{k=1}^{S} H_k M_k(ζ); for two segments, the average information matrix is M(ζ_N) = H_1 M_1(ζ) + H_2 M_2(ζ).
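For two segments, the pooling can be sketched numerically. The matrices below are illustrative placeholders, not the paper's data, and the convex-combination matrices are taken as diagonal weights with H1 + H2 = I:

```python
import numpy as np

# Placeholder 2x2 segment information matrices (illustrative only).
M1 = np.array([[4.0, 1.0],
               [1.0, 3.0]])
M2 = np.array([[2.0, 0.5],
               [0.5, 5.0]])

# Placeholder matrices of coefficients of convex combination:
# diagonal weights chosen so that H1 + H2 = I.
H1 = np.diag([0.6, 0.4])
H2 = np.eye(2) - H1

M_avg = H1 @ M1 + H2 @ M2     # average information matrix for S = 2
M_inv = np.linalg.inv(M_avg)  # its inverse feeds the direction vector
print(M_avg)
```

In the paper the weights come from the variances of the segments' inverse information matrices; fixed weights are used here only to keep the sketch self-contained.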

The Direction Vector
The direction vector defined in Section 3.1.1 is computed as follows: if f(x) is the response function, then the response vector, Z, collects the values of f(x) at the support points, and the direction vector is obtained from the product of the inverse of the average information matrix and the response vector. By normalizing such that d*^T d* = 1, we obtain d*, where the intercept component d_0 = 1 is discarded.

Optimal Starting Point
The optimal starting point is obtained from the design matrices of the segments considered. The optimal starting point defined in Section 3.1.1 is obtained as follows, using a 4-point design matrix from each segment.

The Step-Length
The step-length is defined by ρ* = min_i {(b_i − C_i^T X*)/(C_i^T d*)}, where ρ* is the optimal step-length, d* is the normalized direction vector, C_i is the vector which represents the parameters of the ith linear inequality, X* is the starting point and b_i is a scalar of the linear inequalities, with x_1, x_2, x_3 ≥ 0. Support points are picked from the boundaries of the partitioned segments (Figure 2), provided they do not violate the constraint equations.
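The minimum in the step-length formula runs over the constraints whose denominator C_i^T d* is positive, i.e. the constraints the line actually moves toward. A minimal sketch with illustrative data (not the paper's):

```python
import numpy as np

# rho* = min_i (b_i - C_i^T X*) / (C_i^T d*), over rows with C_i^T d* > 0.
C = np.array([[2.0, 3.0, 0.0],
              [0.0, 2.0, 5.0],
              [3.0, 2.0, 4.0]])        # constraint coefficient vectors C_i^T
b = np.array([8.0, 10.0, 15.0])        # right-hand sides b_i
x_star = np.array([0.5, 0.5, 0.5])     # assumed feasible starting point X*
d_star = np.array([0.3, 0.5, 0.4])
d_star = d_star / np.linalg.norm(d_star)   # normalize so d*^T d* = 1

num = b - C @ x_star                   # remaining slack in each constraint
den = C @ d_star                       # rate at which the move consumes it
rho_star = np.min(num[den > 0] / den[den > 0])
print(rho_star)
```

Taking the minimum over positive denominators guarantees the move X* + ρ*d* stops exactly at the first constraint boundary it reaches, keeping the new point feasible.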

Results and Discussion
where X_1 and X_2 are obtained from S_1 and S_2, respectively.

Illustrative Problem and Application
A producer of leather shoes makes three types of shoes, X, Y and Z, which are processed on three machines, K_1, K_2 and K_3. The daily capacities of the machines are given in Table 1.
The profit gained from shoe X is ₦3 per unit, from shoe Y ₦5 per unit, and from shoe Z ₦4 per unit. What is the maximum profit for the three types of shoes produced?
Solution: Let X_1 be the number of units of type X, X_2 the number of units of type Y and X_3 the number of units of type Z, and maximize Z = 3X_1 + 5X_2 + 4X_3 subject to the machine-capacity constraints of Table 1. In a similar manner, the design and inverse matrices are obtained [from Figure 4]. Therefore, the maximum value of Z is ₦18.88. This value, obtained in one iteration, is close to the optimum value obtained using the simplex method (in three iterations). When 3 and 4 segments were used, the maximum values of Z for this problem were ₦21.25, with corresponding values (X_1, X_2, X_3) = (1.37, 2.08, 1.68), and ₦21.03, with corresponding values (X_1, X_2, X_3) = (1.43, 2.01, 1.67). These values are not optimal because they do not compare favourably with the simplex method solution, which is Max Z = ₦18.66.
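The capacities of Table 1 are not reproduced in the text, so the sketch below assumes the standard textbook data for this problem (2X_1 + 3X_2 ≤ 8, 2X_2 + 5X_3 ≤ 10, 3X_1 + 2X_2 + 4X_3 ≤ 15), which reproduces the quoted simplex optimum Max Z = ₦18.66; these constraint coefficients are an assumption, not taken from the paper. Since an LP optimum lies at a vertex of the feasible polytope, enumerating the vertices verifies the value:

```python
import itertools
import numpy as np

# Assumed machine-capacity constraints (standard textbook data, see lead-in):
#   2x1 + 3x2       <= 8    (K1)
#        2x2 + 5x3  <= 10   (K2)
#   3x1 + 2x2 + 4x3 <= 15   (K3),  x1, x2, x3 >= 0
c = np.array([3.0, 5.0, 4.0])                 # profit Z = 3x1 + 5x2 + 4x3
A = np.array([[2.0, 3.0, 0.0],
              [0.0, 2.0, 5.0],
              [3.0, 2.0, 4.0],
              [-1.0, 0.0, 0.0],               # -x1 <= 0, i.e. x1 >= 0
              [0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0]])
b = np.array([8.0, 10.0, 15.0, 0.0, 0.0, 0.0])

best_z, best_x = -np.inf, None
# A vertex is the intersection of 3 active constraint planes: try all triples,
# keep the feasible intersections, and take the most profitable one.
for rows in itertools.combinations(range(len(b)), 3):
    Asub, bsub = A[list(rows)], b[list(rows)]
    if abs(np.linalg.det(Asub)) < 1e-9:
        continue                              # planes do not meet in a point
    x = np.linalg.solve(Asub, bsub)
    if np.all(A @ x <= b + 1e-9):             # feasibility check
        z = c @ x
        if z > best_z:
            best_z, best_x = z, x

print(round(best_z, 2))                       # maximum profit
print(np.round(best_x, 2))                    # optimal production plan
```

Under these assumed capacities the optimum is Z = 765/41 ≈ ₦18.66 at (X_1, X_2, X_3) = (89/41, 50/41, 62/41) ≈ (2.17, 1.22, 1.51), matching the simplex value quoted above.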

Conclusion
Three-dimensional Linear Programming problems have been solved using the line search equation, X = X* − ρ*d*, of the Super Convergent Line Series, by segmenting the cuboidal response surface into 2, 3 and 4 segments. A real-life problem was also used to achieve the desired result. It was found that the optimal solution is attained with 2 segments (S = 2) and in one iteration or move, even though up to 4 segments (S = 4) were considered. The simplex method, by comparison, obtained a close result in 2 and 3 iterations. Hence, as the name implies, the Super Convergent Line Series (SCLS) locates the optimizer in one iteration, and better still with segmentation.

Figure 1 .
Figure 1. (a): A vertical line, Ƨ, drawn through the middle of a cube [two segments (S = 2)]. (b): A vertical line, Ʈ, and a horizontal line, ƥ, drawn through the middle of a cube [three segments (S = 3)]. (c): A vertical line, Δ, and a horizontal line, Ԓ, drawn through the middle of a cube [four segments (S = 4)].

Table 1 .
The daily capacity of the machines.