On the Convergence of the Dual-Pivot Quicksort Process

Sorting an array of objects such as integers, bytes, floats, etc. is considered one of the most important problems in Computer Science. Quicksort is an effective and widely studied sorting algorithm that sorts an array of n distinct elements using a single pivot. Recently, a modified version of the classical Quicksort, due to Vladimir Yaroslavskiy, was chosen as the standard sorting algorithm for Oracle's Java 7 runtime library. The purpose of this paper is to present the different behavior in complexity of the classical Quicksort and the Dual-pivot Quicksort. In particular, we discuss the convergence of the Dual-pivot Quicksort process by using the contraction method. Moreover, we show that the distribution of the number of comparisons made by the duality process converges to a unique fixed point.


Introduction
Quicksort is one of the most important sorting algorithms. Hoare [1] proposed an algorithm based on selecting an arbitrary element, called the pivot, from the array. The Quicksort algorithm uses this pivot to partition the array into two sub-arrays: the elements smaller than the pivot and the elements larger than the pivot [2].
After that, Quicksort recursively sorts the two sub-arrays. Later, Sedgewick studied several variants. Régnier [3] studied the limiting distribution of the number of comparisons done by the Quicksort algorithm when suitably normalized: it converges, but the limit is unknown in closed form. The first accounts were computed by Hennequin, who proved that this distribution is not a normal distribution. The limiting distribution is characterized by a stochastic fixed point equation [4] [5]. The cost of the Quicksort algorithm depends on the position of the selected pivot. There are many ways to choose the pivot element, and the worst case, the best case and the average case express the performance of the algorithm. We discuss some of them here; for more details, we refer to Ragab [6] and [7]. The worst case occurs when the pivot is the smallest (or largest) element at partitioning on an array of size n, yielding one empty sub-array, one element (the pivot) in the correct place and one sub-array of size n − 1. The two sub-arrays are then completely lopsided, which defines the worst case [8]. In this case the recursion depth is n − 1 levels and the complexity of Quicksort is $O(n^2)$. The best case occurs when the pivot is the median at each partition step, i.e. after each partitioning an array of size n yields two sub-arrays of approximately equal size, and placing the pivot element in the middle position takes n data comparisons [9]. There are various methods to choose a good pivot, like choosing the first element, the last element, or the median of three elements (select three elements and find their median), and so on. Alternatively, the algorithm can select a pivot by random selection each time; this choice reduces the probability that the worst case ever occurs. The other method, which essentially prevents the worst case from ever occurring, picks the pivot as the median of the array each time. Once the pivot is chosen, we compare all other elements to it, so n − 1 comparisons are needed to divide the array. The choice of the pivot divides the array into one sub-array of size 0 and one of size n − 1, or into a sub-array of size 1 and one of size n − 2, and so on up to a sub-array of size n − 1 and one of size 0. There are n possible positions, each with equal probability 1/n. Hennequin studied the comparisons made by Quicksort with r pivots: when r = 2, one partitioning stage makes the same comparisons as classical Quicksort, while for r > 2 he found that the problem becomes complicated. Yaroslavskiy [10] introduced a new implementation of Dual-pivot Quicksort in Java 7's runtime library. In 2012, Wild and Nebel derived exact numbers of swaps and comparisons for Yaroslavskiy's algorithm [10]. In this paper, our aim is to analyze the running time performance of Dual-pivot Quicksort. The limiting distribution of the normalized number of comparisons required by the Dual-pivot Quicksort algorithm is studied. It is known to be the unique fixed point, with zero mean and finite variance, of a certain distributional transformation T.
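The pivot strategies discussed above can be sketched in a few lines. The following Python sketch is ours, for illustration only (names such as `quicksort` and `pivot_strategy` are not from the text); it sorts by recursive partitioning around a pivot chosen by one of the strategies mentioned.

```python
import random

def quicksort(a, pivot_strategy="random"):
    """Classical single-pivot Quicksort with a selectable pivot strategy."""
    if len(a) <= 1:
        return a
    if pivot_strategy == "first":
        p = a[0]                                   # vulnerable to the O(n^2) worst case
    elif pivot_strategy == "median3":
        p = sorted([a[0], a[len(a) // 2], a[-1]])[1]  # median of three elements
    else:
        p = random.choice(a)                       # random pivot: worst case becomes unlikely
    smaller = [x for x in a if x < p]
    equal   = [x for x in a if x == p]
    larger  = [x for x in a if x > p]
    return (quicksort(smaller, pivot_strategy)
            + equal
            + quicksort(larger, pivot_strategy))
```

With a pivot chosen uniformly at random, each of the n possible split positions occurs with probability 1/n, which is exactly the model analyzed in the average case.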
We show that using two pivot elements (i.e. partitioning into three subarrays) is very efficient, particularly on large arrays. We propose the new Dual-pivot Quicksort scheme, faster than the known implementations, which improves this situation (see [11] and [12]). The implementation of the Dual-pivot Quicksort algorithm has been inspected on different inputs and primitive data types.
The new Quicksort algorithm partitions a source array g[ ] of a primitive type T (such as int, float, byte, char, double, long or short) that we need to sort into three parts defined by two pivot elements p and q; accordingly, there are pointers A, B, C, and left and right indices of the first and last elements respectively. The aim of this paper is to present such a version arising from an algorithm depending on the work in [13] and [14]. The Dual-pivot Quicksort is explained clearly in [15] and it works as follows:
1) For small arrays (length < 17), use the Insertion sort algorithm [10].
2) Choose two pivot elements p and q. For example, we can take the first element g[left] as p and the last element g[right] as q.
3) p must be less than q, otherwise they are swapped. So, we have the following parts:
• Part I with indices from left + 1 to A − 1, with elements which are less than p.
• Part II with indices from A to B − 1, with elements greater than or equal to p and less than or equal to q.
• Part III with indices from C + 1 to right − 1 with elements greater than q.
• Part IV contains the rest of the elements to be examined with indices from B to C.

4) The next element g[B] from part IV is compared with the two pivots p and q, and placed into the corresponding part I, II, or III.
5) The pointers A, B, and C are changed in the corresponding directions.
6) Steps 4)–5) are repeated while B ≤ C.
7) The pivot element p is swapped with the last element from part I, and the pivot element q is swapped with the first element from part III.
8) Steps 1)–7) are repeated recursively for every part I, part II, and part III, as in Figure 1.
Figure 1. Graph explaining the dual-pivot Quicksort algorithm.
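Steps 1)–8) above can be rendered as the following Python sketch. This is an illustration of the three-way partitioning scheme, not Yaroslavskiy's Java source; the function names and the threshold constant are ours.

```python
INSERTION_THRESHOLD = 17  # step 1: small arrays go to Insertion sort

def insertion_sort(g, left, right):
    for i in range(left + 1, right + 1):
        key, j = g[i], i - 1
        while j >= left and g[j] > key:
            g[j + 1] = g[j]
            j -= 1
        g[j + 1] = key

def dual_pivot_quicksort(g, left, right):
    if right - left + 1 < INSERTION_THRESHOLD:
        insertion_sort(g, left, right)
        return
    # Steps 2-3: pivots p = g[left], q = g[right], swapped so that p <= q.
    if g[left] > g[right]:
        g[left], g[right] = g[right], g[left]
    p, q = g[left], g[right]
    A = left + 1    # part I:   g[left+1 .. A-1]  < p
    B = left + 1    # part II:  g[A .. B-1]       in [p, q]
    C = right - 1   # part III: g[C+1 .. right-1] > q
    # Steps 4-6: classify each element of part IV (g[B .. C]).
    while B <= C:
        if g[B] < p:
            g[B], g[A] = g[A], g[B]          # move into part I
            A += 1
            B += 1
        elif g[B] > q:
            while g[C] > q and B < C:        # shrink part IV from the right
                C -= 1
            g[B], g[C] = g[C], g[B]          # move into part III
            C -= 1
            if g[B] < p:                     # re-examine the swapped-in key
                g[B], g[A] = g[A], g[B]
                A += 1
            B += 1
        else:
            B += 1                           # stays in part II
    # Step 7: put the pivots into their final positions.
    A -= 1
    C += 1
    g[left], g[A] = g[A], g[left]
    g[right], g[C] = g[C], g[right]
    # Step 8: recurse on parts I, II and III.
    dual_pivot_quicksort(g, left, A - 1)
    dual_pivot_quicksort(g, A + 1, C - 1)
    dual_pivot_quicksort(g, C + 1, right)
```

Each loop iteration advances B, so the partitioning pass touches every key of part IV exactly once.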

Run-Time Performance
In this section, we discuss the running time of the Dual-pivot Quicksort. An efficient procedure is described by Vasileios Iliopoulos and David B. Penman [13], where they analyze the Dual-pivot Quicksort algorithm. Their approach is followed here; for more details we refer to [13] and [14]. First we introduce the algorithm and compare it with the classical Quicksort as follows [16].
The following graphs show the relation between the size of the array to be sorted and the complexity, represented by the number of comparisons and swaps, as in Figure 2. We found that the Dual-pivot Quicksort is faster than the classical Quicksort.

The Dual-Pivot Quicksort Average Case Analysis
To find the distributional equation, we note that the underlying process has two parts: the first part is the partitioning and the second is the recursive sorting of the resulting subarrays. The total number of comparisons $C_n$ to sort an array of $n \ge 2$ keys, when the pivot positions $(U_1, U_2)$ are uniformly distributed over the pairs from $\{1, 2, 3, \ldots, n\}$, is equal to the number of comparisons $A_n$ made during the first partitioning stage, plus the number of comparisons to sort the subarray of $U_1 - 1$ keys below the first pivot [17], the subarray of $U_2 - U_1 - 1$ keys between the two pivots, and the subarray of $n - U_2$ elements greater than the second pivot. The algorithm is then recursively applied to each of these subarrays, which gives the distributional recurrence

$$C_n \stackrel{d}{=} A_n + C_{U_1 - 1} + C'_{U_2 - U_1 - 1} + C''_{n - U_2}. \qquad (1)$$

During the first stage, each of the $n - 2$ non-pivot keys is compared with the first pivot, and only the keys larger than it are also compared with the second pivot [11]; together with the one comparison that orders the two pivots, the average value of $A_n$ can be calculated as follows:

$$E[A_n] = 1 + (n - 2) + \frac{2}{3}(n - 2) = 1 + \frac{5}{3}(n - 2).$$
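For small n, the expectation can also be obtained by brute force over all n! input orders. The sketch below is ours; it assumes the comparison-counting scheme described above (each non-pivot key is compared with the smaller pivot first, and with the larger pivot only if needed, plus one comparison to order the two pivots), and rational arithmetic keeps the averages exact.

```python
from fractions import Fraction
from itertools import permutations
from math import factorial

def comparisons(a):
    """Count key comparisons of the dual-pivot partitioning process."""
    n = len(a)
    if n < 2:
        return 0
    p, q = min(a[0], a[-1]), max(a[0], a[-1])
    count = 1                      # comparing the two pivots
    less, mid, greater = [], [], []
    for x in a[1:-1]:
        count += 1                 # x vs p
        if x < p:
            less.append(x)
        else:
            count += 1             # x vs q (only keys not below p)
            (greater if x > q else mid).append(x)
    return (count + comparisons(less)
                  + comparisons(mid)
                  + comparisons(greater))

def average_comparisons(n):
    """Exact expectation over all n! orders of n distinct keys."""
    total = sum(comparisons(list(s)) for s in permutations(range(n)))
    return Fraction(total, factorial(n))
```

Because the first and last entries of a uniformly random permutation form a uniformly random pair, this brute-force average matches the model behind the recurrence above.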

Expected Number of Comparisons
Here, by Equation (1) and using [13], it is easy to determine the recurrence for the expected number of comparisons due to the duality:

$$E[C_n] = E[A_n] + \frac{1}{\binom{n}{2}} \sum_{1 \le i < j \le n} \left( E[C_{i-1}] + E[C_{j-i-1}] + E[C_{n-j}] \right).$$

Since the three double sums above are equal, the recurrence becomes

$$E[C_n] = 1 + \frac{5}{3}(n-2) + \frac{3}{\binom{n}{2}} \sum_{k=0}^{n-2} (n - 1 - k)\, E[C_k]. \qquad (3)$$

We introduce a difference operator for the solution of this recurrence, defined by $\Delta a_n := a_{n+1} - a_n$, and for higher orders $\Delta^{k} a_n := \Delta(\Delta^{k-1} a_n)$. Multiplying by $\binom{n}{2}$ and applying the operator, this recurrence is transformed to a telescoping one. Dividing by suitable factors and treating the remaining sums in Equation (3) by using Maple, V. Iliopoulos and D. B. Penman [13] obtain the solution. Finally, the expected number of comparisons, when two pivots are chosen, is

$$E[C_n] = 2(n+1)H_n - 4n, \qquad (4)$$

where $H_n$ is the harmonic number defined by $H_n := \sum_{i=1}^{n} \frac{1}{i}$ (see [18] and [19]). This is the same as the expected number of comparisons when one pivot is chosen in the classical Quicksort [20]. Note that this result for the dual Quicksort is identical with the expected number of comparisons in [13].
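The closed form can be checked against the recurrence numerically. The sketch below is illustrative (not from [13]); it solves the recurrence exactly with rational arithmetic, assuming the toll $E[A_n] = 1 + \frac{5}{3}(n-2)$, i.e. one pivot-pivot comparison plus one or two comparisons for each of the other $n-2$ keys.

```python
from fractions import Fraction

def expected_comparisons(nmax):
    """Solve E[C_n] = E[A_n] + (3 / C(n,2)) * sum_{k=0}^{n-2} (n-1-k) E[C_k]
    exactly, with E[C_0] = E[C_1] = 0."""
    E = [Fraction(0), Fraction(0)]
    for n in range(2, nmax + 1):
        toll = 1 + Fraction(5, 3) * (n - 2)
        recursive = 3 * sum((n - 1 - k) * E[k] for k in range(n - 1))
        E.append(toll + recursive / Fraction(n * (n - 1), 2))
    return E
```

For every n the exact solution agrees with $2(n+1)H_n - 4n$, e.g. $E[C_4] = 29/6$.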

Variance of Comparisons
The main result of this section was obtained by [13] (see the following results for explanation and notation). Now we compute the variance of the number of comparisons of the Dual-pivot Quicksort. Recall that

$$\operatorname{Var}(C_n) = E[C_n^2] - \left(E[C_n]\right)^2.$$

From Equation (1), and noting that the resulting subarrays are sorted independently, we can square the recurrence and take expectations. Let

$$f_n(z) := \sum_{k \ge 0} P(C_n = k)\, z^k$$

be the ordinary probability generating function for the number of comparisons needed to sort $n$ keys. It holds that $E[C_n] = f_n'(1)$, and the second-order derivative of Equation (6) gives $E[C_n(C_n - 1)] = f_n''(1)$, so that

$$\operatorname{Var}(C_n) = f_n''(1) + f_n'(1) - \left(f_n'(1)\right)^2.$$

By a simple manipulation of indices, the sums of the products of expected values are equal, and the double sum of the product of the mean numbers of comparisons can be simplified.

By using the identity in [4], the recurrence can be solved, and the variance of the number of key comparisons of the Dual-pivot Quicksort follows (see [17] [19] and [20]); it is of order $n^2$ and is expressed in terms of $H_n$ and $H_n^{(2)}$, where $H_n^{(2)}$ is the second-order harmonic number defined by (see [18] and [19])

$$H_n^{(2)} := \sum_{i=1}^{n} \frac{1}{i^2}.$$
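The generating-function identity used above, $\operatorname{Var}(X) = f''(1) + f'(1) - (f'(1))^2$, can be verified on any distribution with a finite second moment; the probability mass function below is purely illustrative.

```python
from fractions import Fraction

# A toy pmf standing in for P(C_n = k); any pmf would do.
pmf = {1: Fraction(1, 2), 3: Fraction(1, 4), 8: Fraction(1, 4)}

mean = sum(k * p for k, p in pmf.items())                          # f'(1)  = E[X]
second_factorial = sum(k * (k - 1) * p for k, p in pmf.items())    # f''(1) = E[X(X-1)]
variance_pgf = second_factorial + mean - mean ** 2

# Direct computation of Var(X) = E[(X - E[X])^2] for comparison.
variance_direct = sum((k - mean) ** 2 * p for k, p in pmf.items())
assert variance_pgf == variance_direct
```

Exact rational arithmetic makes the two sides agree identically, not just to floating-point tolerance.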

Asymptotic Distribution
In this section, we show the convergence results which are essential for the main purpose. Define the normalized random variables

$$Y_n := \frac{C_n - E[C_n]}{n}, \qquad n \ge 2.$$

Equation (8) can then be rewritten in the following form: by a simple manipulation, one gets

$$Y_n \stackrel{d}{=} \frac{U_1 - 1}{n}\, Y_{U_1 - 1} + \frac{U_2 - U_1 - 1}{n}\, Y'_{U_2 - U_1 - 1} + \frac{n - U_2}{n}\, Y''_{n - U_2} + C_n(U_1, U_2),$$

where the cost function $C_n(U_1, U_2)$ collects the normalized partitioning cost and the centering terms; it is similar to the cost functions in [6] and [7].

Now, we show that the random vector $\left( \frac{U_1}{n}, \frac{U_2}{n} \right)$ converges to a uniformly distributed random vector $(\tau_1, \tau_2)$.

Using the fact that, asymptotically, the expected complexity of Dual-pivot Quicksort is $2 n \ln n$, as given in Equation (4), it follows that $C_n(U_1, U_2)$ converges to $C(\tau_1, \tau_2)$, where $\tau_1$ and $\tau_2$ are uniformly distributed random variables on $[0, 1]$. Therefore, if we assume for the moment that $Y_n$ converges in distribution to some $Y$, we obtain

$$Y \stackrel{d}{=} S_1 Y + S_2 Y' + S_3 Y^* + C(\tau_1, \tau_2),$$

where $S_1 := \min(\tau_1, \tau_2)$, $S_2 := |\tau_2 - \tau_1|$ and $S_3 := 1 - \max(\tau_1, \tau_2)$ are the spacings of $[0, 1]$ induced by the two pivots, and $C$ is the map defined as

$$C(\tau_1, \tau_2) := \frac{5}{3} + 2 \sum_{i=1}^{3} S_i \ln S_i.$$

We refer to Roesler (see [4] [21] and [22]) for the main idea of the next lemma.
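The contraction method requires the limit cost function to have mean zero. As a consistency check, the following sketch (ours) integrates $C(\tau_1, \tau_2) = \frac{5}{3} + 2 \sum_i S_i \ln S_i$, with $S_1, S_2, S_3$ the spacings of the unit interval induced by $(\tau_1, \tau_2)$, over the unit square; this explicit form is a reconstruction consistent with the $2 n \ln n$ expectation, and the integral should vanish.

```python
import math

def cost(u, v):
    """Limit cost C(u, v) = 5/3 + 2 * sum_i s_i * ln(s_i), with s_i the
    three spacings of [0, 1] cut at u and v (s * ln s -> 0 as s -> 0)."""
    lo, hi = min(u, v), max(u, v)
    spacings = (lo, hi - lo, 1.0 - hi)
    xlogx = lambda s: s * math.log(s) if s > 0 else 0.0
    return 5.0 / 3.0 + 2.0 * sum(xlogx(s) for s in spacings)

def mean_cost(n=400):
    """Midpoint-rule estimate of E[C(tau1, tau2)] over the unit square."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += cost((i + 0.5) * h, (j + 0.5) * h)
    return total * h * h
```

Each spacing has density $2(1 - x)$, so $E[S \ln S] = -5/18$ per spacing and the three spacings contribute $2 \cdot 3 \cdot (-5/18) = -5/3$, cancelling the constant term exactly.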

Lemma 1
The map $T : D \to D$ is a contraction on $(D, d)$ and has a unique fixed point. Moreover, every sequence of iterates $T^{n} F_0$, for $F_0 \in D$, converges to this fixed point.

Figure 2. Comparison between the classical Quicksort and the Dual-pivot Quicksort.

Here $\stackrel{d}{=}$ refers to equality in distribution: the array is partitioned into three subarrays, one with $U_1 - 1$ keys smaller than the first pivot, one with $U_2 - U_1 - 1$ keys between the two pivots, and one with $n - U_2$ keys larger than the second pivot. The random variables $U_1$, $U_2$, $Y$, $Y'$ and $Y^*$ are independent, and $Y'$ and $Y^*$ have the same distribution as $Y$. Finally, we show that $Y_n$ converges in fact to the fixed point $Y$. Let $D$ be the space of distribution functions $F$ with finite second moments. We use the Wasserstein metric $d$ [4] on $D$, induced by the $L_2$ norm. Defining the map $T$ on $D$ by the right-hand side of the fixed-point equation and using the Banach fixed point theorem completes the proof (also see [13]).
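The Banach fixed point argument can be illustrated in miniature with a one-dimensional contraction (this example is ours and unrelated to the map $T$): iterating $f(x) = \cos x$, whose derivative is bounded by $\sin 1 < 1$ on $[0, 1]$, converges from any starting value to its unique fixed point $x^* \approx 0.739$.

```python
import math

def iterate_to_fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    """Banach iteration: apply a contraction f until successive iterates agree."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

x_star = iterate_to_fixed_point(math.cos, 0.5)
```

The same principle, applied to $T$ on the metric space $(D, d)$, yields both the existence of the limit law $Y$ and the convergence of the iterates toward it.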