Quick Sort

Overview#

  • Time Complexity: O(n * log(n)) in the best case, O(n^2) in the worst case
  • Space Complexity: O(log(n)) in the best case, O(n) in the worst case
  • In-Place
  • Not Stable

General Idea#

Partition Algorithm#

Quick sort works by partitioning. We take one element as the pivot and compare every other element with the pivot's value. If an element is smaller than the pivot, we swap it towards the front of the section; otherwise we leave it where it is. To do this, we maintain two pointers, i and j.

Example: 9, 0, 8, 2, 1, 7

  • Let's take 7 as the pivot, initialise j to point to index 0 (element 9), and initialise i to -1.
  • Compare 7 with 9. 9 is larger, so we leave it. We increment j.
  • Compare 7 with 0. 0 is smaller, so we increment i. i now points to index 0. We perform a swap of the elements at indexes i and j. We increment j. The array will now look like: 0, 9, 8, 2, 1, 7.
  • Compare 7 with 8. 8 is larger, so we leave it. We increment j.
  • Compare 7 with 2. 2 is smaller, so we increment i. i now points to index 1. We perform a swap of elements at indexes i and j. We increment j. The array will now look like: 0, 2, 8, 9, 1, 7.
  • Compare 7 with 1. 1 is smaller, so we increment i. i now points to index 2. We perform a swap of elements at indexes i and j. We increment j. The array will now look like: 0, 2, 1, 9, 8, 7.
  • We have reached 7, which is the pivot. We swap 7 with the element at index (i + 1). We do this because all elements from index 0 up to i are less than 7, and all elements from index (i + 1) up to (j - 1) are greater than or equal to 7. The array now looks like: 0, 2, 1, 7, 8, 9.

Each call to partition places one element in its final position. Notice that 7, the pivot, ends up in its rightful place: the elements to the left of 7 are all less than 7, while those to the right are all greater than 7. However, those two left and right sections still need to be sorted, which requires partitioning each side in turn (a minimal sketch of the partition step follows the note below).

Note: the chosen pivot can be any element; it does not need to be the first or last element.
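A minimal sketch of the partition step just described (last element as pivot), run on the example array; the helper name partition_once is my own and is separate from the full implementations later in these notes:

# Minimal sketch of the partition step described above (Lomuto scheme,
# last element of the section as the pivot).
def partition_once(arr, p, r):
    pivot = arr[r]
    i = p - 1                                   # boundary of the "less than pivot" region
    for j in range(p, r):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]     # move the smaller element forward
    arr[i + 1], arr[r] = arr[r], arr[i + 1]     # place the pivot after the smaller elements
    return i + 1                                # final index of the pivot

nums = [9, 0, 8, 2, 1, 7]
print(partition_once(nums, 0, len(nums) - 1))   # 3
print(nums)                                     # [0, 2, 1, 7, 8, 9]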

Quick Sort Algorithm#

The partition algorithm returns the index of the element that was placed in its final (sorted) position. We then repeat the partitioning recursively on the sections to the left and right of that element.

This is a divide and conquer algorithm.
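As a rough sketch of this recursive structure (it reuses the partition_once helper from the sketch above; the complete Python and Java implementations appear in the Implementation section below):

# Recursive skeleton only: sort arr[p..r] in place.
def quick_sort_sketch(arr, p, r):
    if p < r:
        q = partition_once(arr, p, r)        # pivot now sits at its final index q
        quick_sort_sketch(arr, p, q - 1)     # recurse on the left section
        quick_sort_sketch(arr, q + 1, r)     # recurse on the right section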

Time Complexity#

As described, the pointer j runs from the start of the section up to the (n - 1)th element, so the partition step itself is O(n).

Best Case#

Each call to the quick sort algorithm makes three function calls:

  • partition
  • quick sort on the left section (~ n / 2 elements)
  • quick sort on the right section (~ n / 2 elements)

This can be represented as the recurrence T(n) = 2 * T(n / 2) + O(n), which solves to O(n * log(n)).
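To see why, expand the recurrence, writing the O(n) partition cost as c * n; each level of the recursion tree does a total of about c * n work, and a balanced split halves the subproblem size:

T(n) = 2 * T(n / 2) + c * n
     = 4 * T(n / 4) + 2 * c * n
     = 8 * T(n / 8) + 3 * c * n
     = ...
     = 2^k * T(n / 2^k) + k * c * n

With k = log(n), the first term becomes n * T(1) and the second becomes c * n * log(n), giving T(n) = O(n * log(n)).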

Worst Case#

Each call to the quick sort algorithm makes three function calls:

  • partition
  • quick sort on the left section (~ 0 elements)
  • quick sort on the right section (~ (n - 1) elements)

Such a scenario occurs (assuming we always take the last element of the section as the pivot) when the array is already sorted, for example: 1, 5, 6, 10. In each call, the pivot stays where it is, and quick sort is then called on 0 elements and on (n - 1) elements. This produces a chain of (n - 1) nested calls, one per element.

Hence, whenever the pivot ends up at the first or last position of the section (i.e. it is the smallest or largest element), the partitioning is maximally unbalanced and we get the worst-case scenario.

This can be represented as T(n) = T(n - 1) + O(n). Writing the O(n) term as c * n and expanding gives c * n + c * (n - 1) + ... + c * 1 = c * n * (n + 1) / 2 = O(n^2).
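To see the quadratic growth concretely, here is a rough sketch that counts the comparisons made by the partition loop; the helper name count_comparisons is my own, and the logic mirrors the Lomuto-style implementation given below:

import random

# Count the comparisons quick sort makes on a given list (sketch only).
def count_comparisons(arr):
    comparisons = 0

    def partition(p, r):
        nonlocal comparisons
        pivot = arr[r]
        i = p - 1
        for j in range(p, r):
            comparisons += 1
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[r] = arr[r], arr[i + 1]
        return i + 1

    def sort(p, r):
        if p < r:
            q = partition(p, r)
            sort(p, q - 1)
            sort(q + 1, r)

    sort(0, len(arr) - 1)
    return comparisons

n = 200
print(count_comparisons(list(range(n))))   # already sorted: 19900 == n * (n - 1) // 2
shuffled = list(range(n))
random.shuffle(shuffled)
print(count_comparisons(shuffled))         # typically around 2,000 here (~ 2 * n * ln(n))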

Space Complexity#

The recursion stack depends on the depth of the recursion tree (i.e. how many levels it has).

In the best case, we get balanced partitioning, so the total height of the tree is log(n).

In the worst case, we get unbalanced partitioning such that one partition always ends up with one element and the rest of the elements land in the other partition. This means the height of the tree can grow to O(n).

In the average case, the space complexity is around O(log(n)).
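A toy model (not the real sorter, just the split sizes) illustrates how the recursion depth differs between the two extremes; the function names are mine:

# Toy model of recursion depth only: a balanced split halves the section,
# a fully unbalanced split peels off one element at a time.
def depth_balanced(n):
    return 0 if n <= 1 else 1 + depth_balanced(n // 2)

def depth_unbalanced(n):
    return 0 if n <= 1 else 1 + depth_unbalanced(n - 1)

print(depth_balanced(512))     # 9,   about log2(512)
print(depth_unbalanced(512))   # 511, about n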

Implementation#

Python:

def quick_sort(arr, p=None, r=None):
    if p is None or r is None:
        quick_sort(arr, 0, len(arr) - 1)
    elif p < r:
        # q is the index of the pivot we just partitioned
        q = partition(arr, p, r)
        quick_sort(arr, p, q - 1)
        quick_sort(arr, q + 1, r)

def partition(arr, p, r):
    # take the last element of the section as the pivot
    pivot = arr[r]
    # i marks the end of the "smaller than pivot" region; the pivot lands at i + 1
    i = p - 1
    for j in range(p, r):
        if arr[j] < pivot:
            i += 1
            temp = arr[j]
            arr[j] = arr[i]
            arr[i] = temp
    # put the pivot in its correct place
    temp = arr[i + 1]
    arr[i + 1] = arr[r]
    arr[r] = temp
    # return the current index of the pivot
    return i + 1

arr = [9, 21, 2, 3, 2, 1, 22, 2, 3, 2, 1]
print(arr)
quick_sort(arr)
print(arr)
Java:

public class QuickSort {
    public static void quickSort(int[] arr) {
        quickSort(arr, 0, arr.length - 1);
    }

    // p and r are the bounds of the section the current quickSort call works on
    private static void quickSort(int[] arr, int p, int r) {
        if (p < r) {
            // q is the index of the pivot we just partitioned (i.e. placed in sorted position)
            int q = partition(arr, p, r);
            // repeat quickSort on the section to the left of the pivot
            quickSort(arr, p, q - 1);
            // repeat quickSort on the section to the right of the pivot
            quickSort(arr, q + 1, r);
        }
    }

    public static void printArray(int[] a) {
        for (int i = 0; i < a.length; i++) {
            System.out.print(a[i] + " ");
        }
        System.out.println();
    }

    // p and r are the boundaries of the section we are partitioning
    private static int partition(int[] arr, int p, int r) {
        // take the last element as the pivot
        int x = arr[r];
        // initialise i to be one index behind the section
        int i = p - 1;
        for (int j = p; j < r; j++) {
            // if the current element is smaller than the pivot, swap it forward
            if (arr[j] < x) {
                i++;
                int temp = arr[j];
                arr[j] = arr[i];
                arr[i] = temp;
            }
        }
        // the loop is done; put the pivot in its correct place
        int temp = arr[i + 1];
        arr[i + 1] = arr[r];
        arr[r] = temp;
        // i + 1 is the current index of the pivot
        return i + 1;
    }

    public static void main(String[] args) {
        int[] a = {7, 12, 3, 5, -6, 3, 8, 2, 10, -3};
        printArray(a);
        quickSort(a);
        printArray(a);
    }
}

Comparison with Merge Sort#

If you have a smaller input (perhaps 100 elements), quick sort tends to run faster than merge sort due to the reduced number of function calls.

Also, quick sort is in-place whereas merge sort is not.

Some Scenarios#

Ascending Array#

Each time, the pivot (the largest element of the section) is left in place, and quick sort is then called on the left section of (n - 1) elements. Time complexity is O(n^2).

Descending Array#

Each time, the pivot (the smallest element of the section) ends up at the leftmost position of the section we are partitioning. The next two partitions are of length 0 and (n - 1) respectively. Time complexity is O(n^2).

Array with all Elements of Same Value#

Each time, no element is strictly smaller than the pivot, so the pivot is swapped to the front of the section and the partitions are again of length 0 and (n - 1). Time complexity is O(n^2).
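As a quick check, feeding these three inputs into the count_comparisons sketch from the Worst Case section above yields the same quadratic comparison count in every case:

# Reuses count_comparisons from the sketch in the Worst Case section.
n = 200
print(count_comparisons(list(range(n))))          # ascending:  19900 == n * (n - 1) // 2
print(count_comparisons(list(range(n, 0, -1))))   # descending: 19900
print(count_comparisons([5] * n))                 # all equal:  19900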

Unbalanced Division#

No matter how we divide the sections, as long as the split ratio is a constant (say 1:3, 1:99, or 1:999), we still end up with a time complexity of O(n * log(n)).

Example: We always divide with a ratio of 1:4.

The time complexity can be represented as T(n) = T(n / 5) + T(4n / 5) + O(n).

Each level of the recursion tree does at most c * n work in total, and the longest path shrinks the section by a factor of 5/4 at every step, so there are at most log base 5/4 of n levels. Multiplying the per-level work by the number of levels bounds T(n) by O(n * log(n)).
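A small numeric check of this recurrence (a sketch: the constant hidden in the O(n) term is taken as 1, and the function name T is only for illustration):

import math
from functools import lru_cache

# Evaluate T(n) = T(n / 5) + T(4n / 5) + n numerically and compare it with n * log2(n).
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 0
    return T(n // 5) + T(4 * n // 5) + n

for n in (10_000, 100_000, 1_000_000):
    print(n, T(n), round(n * math.log2(n)))
# The ratio T(n) / (n * log2(n)) stays bounded by a small constant as n grows.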