1. Introduction
The traveling salesman problem (TSP) has its origins in an 1832 handbook and in the work of Hamilton. It is NP-hard, and its decision version is NP-complete; so far only some special cases have been found to be solvable in polynomial time. I propose a method that solves the general case at a cost polynomial in the size of the input. To date I have read [1], an annealing algorithm for the TSP [2] and for the bottleneck TSP, but I have also had the opportunity to read [3] about another NP-complete problem, job shop scheduling, against which I gauge the scale of NP problems. This problem and its solution are of real importance for industry.
Problem Definition and Notations
The TSP can be defined as the problem of a salesman who must go through the cities a, b, c, ∙∙∙, v, z, visiting each of them exactly once and returning to the starting city, so that the travel distance, the sum of the weights of the arcs joining the visited cities, is the shortest.
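As a small illustration of this objective (a minimal sketch: the 4-city weight matrix, the city indices and the function name tour_length are assumptions introduced only for the example), the cost of a tour is the sum of the weights of the arcs along the visiting order, including the closing arc back to the starting city.

# Minimal sketch of the TSP objective: the cost of a tour is the sum of the
# arc weights along the visiting order, including the return to the start.
# The 4-city weight matrix below is illustrative, not data from the paper.
def tour_length(weights, order):
    n = len(order)
    return sum(weights[order[i]][order[(i + 1) % n]] for i in range(n))

weights = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(tour_length(weights, [0, 1, 3, 2]))   # 2 + 4 + 3 + 9 = 18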
2. Method
Consider a polyhedron. The cities are placed on coordinate axes, so a, b, c, ∙∙∙, z are the positions of the cities on their axes, measured from a common origin in space (a center), and therefore the cities lie on the vertices of the polyhedron (the limit of 26 letters does not mean a limit of 26 cities; this holds for all the functions involved). The distance between two cities is the weight of the arc joining them. A cycle on the polyhedron that passes once through each city is a flower, with the corolla at the origin (center) and the coordinate axes of the cities as petals. To find a, b, ∙∙∙, x, z we must solve the linear system, for each variable, that defines its position on its axis:
The equation for node a, and analogously for every other node, gives N equations in N variables; solving this system gives easy access to the positions.
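Since the node equations are not reproduced in this extract, the following is only a minimal sketch under the assumption of a generic system of N equations in N unknowns, Ax = b, whose solution gives the positions; the matrix A and the vector b are placeholder values, not data from the paper.

import numpy as np

# Minimal sketch: a generic N x N linear system A x = b solved directly.
# A and b are placeholders standing in for the node equations of the method.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])

positions = np.linalg.solve(A, b)      # Gaussian-elimination-based direct solve
print(positions)                       # one position per node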
2.1. Theorem
Let the set of petals be given, and let the minimum petal be identified; then the optimal sequence for the petals is given by

(1)
2.2. Proof
In this section we will consider the petals a, b, c, ∙∙∙ as variables.
(A)
Consider an inequality on functions for the swap of two elements
(B)
with equality in the limiting case; the value is higher at the lower one, therefore the swap is convenient when the condition holds. Further, consider that it must be satisfied from one value to another, and therefore it is a stronger relation together with the previous ones. So the minimum, with the maximum and with the condition completely satisfied, is the lowest. This condition can be extended to more than two elements so that, for example, the minimum is obtained if
(Optimality Condition OC 1).
2.2.1. Case of Four Dimensions
So, if we have four elements, we start from the maximum, so that condition OC 1 is satisfied. The swap of the remaining pair is indifferent because there are no prevailing constraints, so there are two solutions. These are symmetric with respect to the minimum.
2.2.2. Case of More Dimensions
So suppose we have more than four elements. In such sets we start from the maximum so that, for condition OC 1, the largest elements are placed first. Here we will find only one solution, up to symmetry. Further, for condition OC 1 and from the comparison with the other elements not yet sequenced, we have to choose the order for the next pair; but notice that we have an even number of elements, two at higher adjacent values and two at lower adjacent values; the behaviour of the lower level is symmetric and specular to the choice at the upper level; the lower level can make a good move symmetrically equivalent to the upper-level choice, whereas the upper level depends on the choice itself, so it is convenient to make a good choice at the upper level (Optimality Condition OC 2). Further, for condition OC 1 one arrangement would be admissible but, for condition OC 2, the other is chosen. So the sequence is completed, with the minimum in place.
So this order is the best solution. Therefore, the solution in the n-dimensional case places the elements to the right and to the left of the minimum, and by symmetry it is the same as its mirror image. Then (1) is true.
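As an illustration (assuming the valley-shaped arrangement described above, since the explicit form of (1) is not reproduced in this extract): for five petals with values a < b < c < d < e, the optimal sequence would be e, c, a, b, d, with the values decreasing towards the minimum and then increasing again; its mirror image d, b, a, c, e is the symmetric, equivalent solution.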
2.3. Algorithm
Node: a list of nodes
Solve the linear system for a, b, ∙∙∙, z to obtain the node list
While do
    ;
    for
        if then
            ;
            ;
        end if
    end for
    if then result.add(position(last), next);
    else if then
        result.add(position(init), next);
    else if
end while
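The conditions inside the loop are not reproduced here, so the following is only a minimal sketch of one plausible reading, in Python: the growing result list accepts each node either after its last element (position(last)) or before its first element (position(init)), which reproduces the valley-shaped ordering around the minimum discussed in Section 2.2. The function name build_sequence and the comparison of the two endpoints are assumptions.

# Minimal, self-contained sketch of one plausible reading of the loop above:
# grow `result` by adding each next node either after the last element or
# before the first one.  The endpoint comparison used as the missing
# condition is an assumption; it yields the valley-shaped order of Section 2.2.
def build_sequence(values):
    ordered = sorted(values)               # ascending: the minimum is placed first
    result = [ordered[0]]
    for next_value in ordered[1:]:
        if result[-1] <= result[0]:
            result.append(next_value)      # result.add(position(last), next)
        else:
            result.insert(0, next_value)   # result.add(position(init), next)
    return result

print(build_sequence([5, 1, 4, 2, 3]))     # [5, 3, 1, 2, 4]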
Test
The algorithm has been tested on several 9 × 9 instances, and in all of them the best solution was found in a few seconds.
3. Complexity
The linear system can be solved in O(n^3) with the Gauss method [4], whereas the cycle over the nodes is of lower order, so the method is polynomial in the size of the input. Then P = NP.
4. Discussion
The complexity can also be reduced by changing the criteria used to achieve the objective.
Appendix
Method to Solve the Linear System, Jacobi
Write the system as Ax = b, with x the vector of solutions and b the vector of constant terms. Split A = P - N, choosing P so that it is easy to invert. Iterate x^(k+1) = P^(-1)(N x^(k) + b), with a stopping condition dependent on a tolerance that must be tuned, for example on the minimum component of x [5].
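A minimal sketch of this iteration follows, assuming the classical Jacobi choice P = diag(A) (so that N = P - A) and a simple tolerance test on successive iterates; the tolerance value, the test matrix and the function name jacobi are assumptions introduced for illustration.

import numpy as np

# Minimal sketch of the splitting iteration: A = P - N with P = diag(A),
# iterate x_{k+1} = P^{-1} (N x_k + b) until successive iterates are close.
# The tolerance and the test data below are assumptions, not from the paper.
def jacobi(A, b, tol=1e-10, max_iter=1000):
    P_diag = np.diag(A)                # the easy-to-invert part of A
    N = np.diag(P_diag) - A            # so that A = P - N
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = (N @ x + b) / P_diag   # apply P^{-1}, i.e. divide by diag(A)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])
print(jacobi(A, b))                    # agrees with np.linalg.solve(A, b)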