Shell Sort


Definition:

Shell sort is an algorithm that first sorts elements that are far apart from each other
and then successively reduces the interval between the elements to be sorted. It is a
generalized version of insertion sort.

In shell sort, elements at a specific interval are sorted first. The interval between the
elements is then gradually decreased based on the sequence used. The performance
of shell sort depends on the type of sequence used for a given input array.

Some of the optimal sequences used are:

- Shell's original sequence: N/2, N/4, ..., 1
- Knuth's increments: 1, 4, 13, ..., (3^k - 1) / 2
- Sedgewick's increments: 1, 8, 23, 77, 281, 1073, 4193, 16577, ..., 4^(j+1) + 3·2^j + 1
- Hibbard's increments: 1, 3, 7, 15, 31, 63, 127, 255, 511, ...
- Papernov & Stasevich increments: 1, 3, 5, 9, 17, 33, 65, ...
- Pratt's increments: 1, 2, 3, 4, 6, 9, 8, 12, 18, 27, 16, 24, 36, 54, 81, ...

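As a rough illustration of how these sequences are produced, the Python sketch below generates the first few terms of some of them. The function names and the cut-off of "largest gap smaller than N" are our own assumptions for the sketch, not part of any particular textbook definition.

def shell_gaps(n):
    # Shell's original sequence: N/2, N/4, ..., 1 (largest gap first)
    gap = n // 2
    while gap > 0:
        yield gap
        gap //= 2

def hibbard_gaps(n):
    # Hibbard's increments: 2^k - 1 = 1, 3, 7, 15, 31, ... (smallest first)
    k = 1
    while 2 ** k - 1 < n:
        yield 2 ** k - 1
        k += 1

def knuth_gaps(n):
    # Knuth's increments: (3^k - 1) / 2 = 1, 4, 13, 40, ... (smallest first)
    gap = 1
    while gap < n:
        yield gap
        gap = 3 * gap + 1

def sedgewick_gaps(n):
    # Sedgewick's increments: 1, then 4^(j+1) + 3*2^j + 1 = 8, 23, 77, 281, ...
    yield 1
    j = 0
    while 4 ** (j + 1) + 3 * 2 ** j + 1 < n:
        yield 4 ** (j + 1) + 3 * 2 ** j + 1
        j += 1

print(list(shell_gaps(16)))       # [8, 4, 2, 1]
print(list(hibbard_gaps(100)))    # [1, 3, 7, 15, 31, 63]
print(list(knuth_gaps(100)))      # [1, 4, 13, 40]
print(list(sedgewick_gaps(100)))  # [1, 8, 23, 77]

In an actual sort, the gaps are applied from the largest down to 1, so the ascending sequences above would be reversed before use.
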
How Shell Sort Works?

1. Suppose we need to sort an array of N = 8 elements.

2. We are using Shell's original sequence (N/2, N/4, ..., 1) as the intervals in our
algorithm.

In the first loop, since the array size is N = 8, the elements lying at an interval
of N/2 = 4 are compared and swapped if they are not in order.
a. The 0th element is compared with the 4th element.
b. If the 0th element is greater than the 4th element, the 4th element is first stored
in a temp variable, the 0th (i.e., greater) element is stored in the 4th position,
and the element stored in temp is stored in the 0th position.
This process goes on for all the remaining elements.

3. In the second loop, an interval of N/4 = 8/4 = 2 is taken, and again the
elements lying at this interval are sorted.
You might get confused at this point.

The elements at the 4th and 2nd positions are compared. The elements at the 2nd and 0th
positions are also compared. All the elements in the array lying at the current
interval are compared.

4. The same process goes on for the remaining elements.

5. Finally, when the interval is N/8 = 8/8 = 1, the array elements lying at an
interval of 1 are sorted. The array is now completely sorted.
Shell Sort Algorithm:
shellSort(array, size)
    for interval i <- size/2 down to 1, dividing the interval by 2 each time
        for each set of elements lying at interval i in the array
            sort those elements using insertion sort
end shellSort
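The pseudocode can be turned into working code. The following Python sketch uses Shell's original sequence (N/2, N/4, ..., 1) and prints the array after each pass; the 8-element sample array is only illustrative and is not the array from the original figures.

def shell_sort(arr):
    # Shell sort with Shell's original gap sequence: N/2, N/4, ..., 1.
    n = len(arr)
    gap = n // 2
    while gap > 0:
        # Gapped insertion sort: each element is compared with the element
        # gap positions before it and shifted back until it is in order.
        for j in range(gap, n):
            temp = arr[j]
            k = j
            while k >= gap and arr[k - gap] > temp:
                arr[k] = arr[k - gap]
                k -= gap
            arr[k] = temp
        print("after gap", gap, ":", arr)
        gap //= 2

data = [9, 8, 3, 7, 5, 6, 4, 1]   # illustrative array with N = 8
shell_sort(data)
# after gap 4 : [5, 6, 3, 1, 9, 8, 4, 7]
# after gap 2 : [3, 1, 4, 6, 5, 7, 9, 8]
# after gap 1 : [1, 3, 4, 5, 6, 7, 8, 9]
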
Shellsort Analysis:

Stability:

Shellsort (with Hibbard increments or other sequences) is not stable. This can be readily
demonstrated with an array of size 4 (the smallest size for which it is possible).
Instability is to be expected, because the increment-based sorts move elements across
large distances without examining the elements in between.
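For instance (a made-up, minimal example rather than one taken from the sources): with Hibbard increments 3 and 1 on a 4-element array, the gap-3 pass can jump one of two records with equal keys over the other, and the final gap-1 pass, which only fixes out-of-order keys, never swaps them back.

def shell_sort_hibbard(records, key):
    # Shell sort with Hibbard increments (..., 7, 3, 1), comparing by key only.
    n = len(records)
    gaps = []
    k = 1
    while 2 ** k - 1 < n:
        gaps.append(2 ** k - 1)
        k += 1
    for gap in reversed(gaps):
        for j in range(gap, n):
            temp = records[j]
            i = j
            while i >= gap and key(records[i - gap]) > key(temp):
                records[i] = records[i - gap]
                i -= gap
            records[i] = temp
    return records

# Two records share the key 3; 'a' comes before 'b' in the input.
data = [(3, 'a'), (1, 'x'), (3, 'b'), (2, 'y')]
print(shell_sort_hibbard(data, key=lambda r: r[0]))
# [(1, 'x'), (2, 'y'), (3, 'b'), (3, 'a')]  -- the equal keys have changed order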

Shellsort has O(n*log(n)) best case time:


The best case, as with insertion sort, is when the array is already sorted. Then each of
the increment-based insertion sorts makes roughly one comparison per element, so each
pass costs about the length of the array in comparisons. Therefore we can determine:

comparisons =
n, for 1 sort with elements 1-apart (last step)
+ 3 * n/3, for 3 sorts with elements 3-apart (next-to-last step)
+ 7 * n/7, for 7 sorts with elements 7-apart
+ 15 * n/15, for 15 sorts with elements 15-apart
+ ...
Each term is n. The question is how many terms there are. The number of terms
is the largest value k such that
2^k - 1 < n
So k < log2(n + 1), meaning that the sorting time in the best case is less than n *
log2(n + 1) = O(n*log(n)).
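A quick empirical check of this bound (a sketch under the same assumptions: Hibbard increments and an already-sorted input) counts one key comparison per element per pass and confirms that the total stays below n * log2(n + 1):

import math

def best_case_comparisons(n):
    # Count key comparisons made by Shell sort (Hibbard increments)
    # on an already-sorted array of length n.
    arr = list(range(n))
    comparisons = 0
    gaps = []
    k = 1
    while 2 ** k - 1 < n:
        gaps.append(2 ** k - 1)
        k += 1
    for gap in reversed(gaps):
        for j in range(gap, n):
            temp = arr[j]
            i = j
            while i >= gap:
                comparisons += 1            # one key comparison per test
                if arr[i - gap] > temp:
                    arr[i] = arr[i - gap]
                    i -= gap
                else:
                    break
            arr[i] = temp
    return comparisons

for n in (15, 255, 4095):
    print(n, best_case_comparisons(n), "<", round(n * math.log2(n + 1)))
# 15 34 < 60
# 255 1538 < 2040
# 4095 40962 < 49140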

Shellsort worst case time is no worse than quadratic:


The argument is similar to the previous one, but with a different overall computation.
In a hypothetical worst case (which does not actually occur), every one of the
increment-based insertion sorts would take quadratic time.
comparisons ≤
n^2, for 1 sort with elements 1-apart (last step)
+ 3 * (n/3)^2, for 3 sorts with elements 3-apart (next-to-last step)
+ 7 * (n/7)^2, for 7 sorts with elements 7-apart
+ 15 * (n/15)^2, for 15 sorts with elements 15-apart
+ ...
And so, since each term k * (n/k)^2 simplifies to n^2/k, we can see that the number of
comparisons is bounded by:

n^2 * (1 + 1/3 + 1/7 + 1/15 + 1/31 + ...)

< n^2 * (1 + 1/2 + 1/4 + 1/8 + 1/16 + ...)

= n^2 * 2

The last step uses the sum of the geometric series.

Shellsort worst and average times:


The point about Shellsort is that the initial sorts, which act on elements larger
increments apart, involve fewer elements, even in a worst-case scenario. At the larger
increments, "far-away" elements are swapped so that the number of inversions is
dramatically reduced. In the later sorts that use smaller increments, the behavior then
comes closer to optimal.
It can be proved that the worst-case time is sub-quadratic at O(n^(3/2)) = O(n^1.5). As
can be expected, the proof is quite difficult. The textbook remarks that the
average-case time is unknown, although it is conjectured to be O(n^(5/4)) = O(n^1.25).
The textbook also mentions other increment sequences that have been studied and
shown to produce even better performance.

Algorithmic comparisons:
Sorting algorithms give one the ability to impress another computer scientist with
one's algorithmic knowledge. First of all, O(n*log(n)) acts as a lower bound on how
quickly we can sort using a comparison-based sorting algorithm (we can sort faster
than that only in certain special cases).

Of the algorithms that share the same order class, a second consideration is then the
value of the big-O order constant. Even though it can have quadratic worst-case time,
Quicksort is often considered the fastest algorithm on random arrays; the implication
is that it has a smaller order constant than, say, Heapsort.

The Shellsort algorithm discussed in this document uses the so-called Hibbard
increments (listed above). Although Shellsort is not theoretically as fast as some other
algorithms, it is still comparable in speed when arrays are not huge and enhanced
sorting increments are employed.
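One hedged way to explore this claim is to count key comparisons rather than measure wall-clock time, which avoids language-level noise. The Python sketch below compares Shell's original gaps against Hibbard's gaps on the same random array; the array size and random seed are arbitrary choices made here purely for illustration.

import random

def shell_sort_with_gaps(arr, gaps):
    # Shell sort using a caller-supplied gap list (largest gap first);
    # returns the number of key comparisons performed.
    comparisons = 0
    n = len(arr)
    for gap in gaps:
        for j in range(gap, n):
            temp = arr[j]
            i = j
            while i >= gap:
                comparisons += 1
                if arr[i - gap] > temp:
                    arr[i] = arr[i - gap]
                    i -= gap
                else:
                    break
            arr[i] = temp
    return comparisons

def shell_gaps(n):                 # N/2, N/4, ..., 1
    gaps = []
    gap = n // 2
    while gap > 0:
        gaps.append(gap)
        gap //= 2
    return gaps

def hibbard_gaps(n):               # ..., 15, 7, 3, 1
    gaps = []
    k = 1
    while 2 ** k - 1 < n:
        gaps.append(2 ** k - 1)
        k += 1
    return list(reversed(gaps))

random.seed(0)
data = [random.randrange(10 ** 6) for _ in range(20000)]
for name, gaps in (("shell", shell_gaps(len(data))),
                   ("hibbard", hibbard_gaps(len(data)))):
    print(name, shell_sort_with_gaps(data.copy(), gaps), "comparisons")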
