Data Structures and Algorithms
7.3 Quick Sort (cont)
Quick Sort - The Facts!
Quick Sort is generally the best-known sorting algorithm,
but it has a serious limitation,
which must be understood before using it in certain applications.
What happens if we apply the qsort
routine on the previous
page to an already sorted array?
This is certainly a case where we'd expect the performance to be quite
good!
However, the first attempt to partition the problem into two sub-problems
will produce an empty lower partition - the first element is already the smallest.
Thus the first partition call simply chops off one element and calls
qsort for a partition with n-1 items!
This happens a further n-2 times!
Each partition call still requires time proportional to the size of
its partition - and we have generated O(n) such calls, so the total
work is (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons.
In the worst case, quicksort is an O(n²) algorithm!
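
As a concrete illustration, here is a minimal quicksort sketch in C which,
like the routine on the previous page, always pivots on the first element
of the current partition. (The function and variable names here are
assumed for illustration only - this is not the code from that page.)
Running it on an already sorted array triggers exactly the degenerate
behaviour described above.

#include <stdio.h>

/* Minimal quicksort sketch: the FIRST element of each partition is the
   pivot, so an already sorted array produces the worst case. */
static void swap(int a[], int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

static int partition(int a[], int lo, int hi) {
    int pivot = a[lo];                  /* first element chosen as pivot      */
    int i = lo;
    for (int j = lo + 1; j <= hi; j++)
        if (a[j] < pivot)               /* move smaller items below the pivot */
            swap(a, ++i, j);
    swap(a, lo, i);                     /* put the pivot in its final place   */
    return i;                           /* sorted input: i == lo, lower part empty */
}

static void quicksort(int a[], int lo, int hi) {
    if (lo < hi) {
        int p = partition(a, lo, hi);   /* O(hi - lo + 1) work                */
        quicksort(a, lo, p - 1);        /* empty when the input is sorted ... */
        quicksort(a, p + 1, hi);        /* ... so this recurses on n-1 items  */
    }
}

int main(void) {
    int a[] = {1, 2, 3, 4, 5, 6, 7, 8}; /* already sorted: the worst case     */
    quicksort(a, 0, 7);
    for (int i = 0; i < 8; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}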
Can we do anything about this?
A number of variations on the simple quicksort will generally produce
better results: rather than always choosing the first item as the pivot,
other strategies for selecting it work better.
Median-of-3 Pivot
For example, the median-of-3 pivot approach
selects three candidate pivots and uses the median one.
If the three candidates are taken from the first, middle and last
positions, then it is easy to see that, for an already sorted array,
this produces an optimal result:
each partition will contain exactly half (± one element) of the problem,
and the recursion will be only ceiling(log₂n) levels deep.
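
A minimal sketch of this selection step, using the same array and index
conventions as the quicksort sketch above (the name median_of_three is
assumed for illustration):

/* Median-of-3 pivot selection: sort a[lo], a[mid], a[hi] in place and
   then swap the median into a[lo], so that a partition routine which
   always pivots on the first element can be reused unchanged. */
static void swap(int a[], int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

static void median_of_three(int a[], int lo, int hi) {
    int mid = lo + (hi - lo) / 2;
    if (a[mid] < a[lo]) swap(a, mid, lo);
    if (a[hi]  < a[lo]) swap(a, hi,  lo);
    if (a[hi]  < a[mid]) swap(a, hi, mid);
    /* now a[lo] <= a[mid] <= a[hi], so a[mid] holds the median */
    swap(a, lo, mid);                   /* move the median into the pivot slot */
}

Because the median is swapped into the first position, the existing
partition code needs no change; for a sorted array, a[mid] is the true
median of the three, so each partition splits its sub-array almost
exactly in half.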
Random Pivot
Some qsort implementations simply use a randomly chosen pivot.
This also works well for sorted arrays: on average a random pivot
produces two roughly equal-sized partitions, and the expected
recursion depth is O(log n).
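
A corresponding sketch (again with assumed names) which picks a random
index in the current partition and swaps that element into the pivot
position; it uses only the standard rand() generator:

#include <stdlib.h>                     /* rand, srand                        */
#include <time.h>                       /* time - used once to seed rand      */

/* Random pivot selection: choose a random index in [lo, hi] and swap
   that element into a[lo], then partition exactly as before.
   srand(time(NULL)) should be called once, e.g. at the start of main(). */
static void random_pivot(int a[], int lo, int hi) {
    int r = lo + rand() % (hi - lo + 1);
    int t = a[lo]; a[lo] = a[r]; a[r] = t;
}

With this in place, quicksort would call random_pivot(a, lo, hi)
immediately before partitioning each sub-array.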
However, whatever strategy we use for choosing the pivot,
it is possible to encounter a pathological case
in which the
problem is not divided equally at any partition stage.
Thus quicksort must always be treated as potentially O(n²).
Why bother with quicksort then?
Heap sort is always O(n log n):
why not just use it?