# Quick Sort

## Overview

- Time Complexity: `O(n * log(n))` in the best case, `O(n^2)` in the worst case
- Space Complexity: `O(log(n))` in the best case, `O(n)` in the worst case
- In-Place
- Not Stable

## General Idea

### Partition Algorithm

Quick sort works by partitioning. We take an element as a pivot and compare each of the other elements with the pivot's value. If an element is smaller, we swap it towards the left of the array; otherwise, we leave it. To do this, we maintain two pointers, `i` and `j`.

Example: 9, 0, 8, 2, 1, 7

- Let's take 7 as the pivot, initialise `j` to point to index 0 (element 9), and `i` to start with a value of -1.
- Compare 7 with 9. 9 is larger, so we leave it. We increment `j`.
- Compare 7 with 0. 0 is smaller, so we increment `i`. `i` now points to index 0. We swap the elements at indexes `i` and `j`, then increment `j`. The array now looks like: 0, 9, 8, 2, 1, 7.
- Compare 7 with 8. 8 is larger, so we leave it. We increment `j`.
- Compare 7 with 2. 2 is smaller, so we increment `i`. `i` now points to index 1. We swap the elements at indexes `i` and `j`, then increment `j`. The array now looks like: 0, 2, 8, 9, 1, 7.
- Compare 7 with 1. 1 is smaller, so we increment `i`. `i` now points to index 2. We swap the elements at indexes `i` and `j`, then increment `j`. The array now looks like: 0, 2, 1, 9, 8, 7.
- We have reached 7, which is the pivot. We swap 7 with the element at index `(i + 1)`. We do this because all elements from index `i` down to 0 are less than 7, while all elements from `(i + 1)` to `j` are greater than 7. The array now looks like: 0, 2, 1, 7, 9, 8.

In each partition pass, one element ends up sorted. Notice that 7, the pivot, lands in its rightful place: the elements to its left are less than 7, while those to its right are greater than 7. However, the left and right "sections" themselves still need to be sorted, which requires partitioning each side.

*Note: The chosen pivot can be any element; it need not be the first or last element.*

### Quick Sort Algorithm

The partition algorithm should return the index of the element which was sorted. Now, we repeat the partitioning recursively on the sections to the left and right of the sorted element.

This is a divide and conquer algorithm.

## Time Complexity

As described, the pointer `j` runs from the start of the section to its `(n - 1)`th element, which means the partitioning algorithm is `O(n)`.

### Best Case

Each call to the quick sort algorithm makes 3 function calls:

- partition
- quick sort for the left section (~ n/2 elements)
- quick sort for the right section (~ n/2 elements)

This can be represented as `T(n) = 2 * T(n / 2) + O(n) = O(n * log(n))`.

### Worst Case

Each call to the quick sort algorithm makes 3 function calls:

- partition
- quick sort for one section (~ 0 elements)
- quick sort for the other section (~ (n - 1) elements)

Such a scenario occurs (assuming we always take the last element of the array as the pivot) when the array is already sorted. Example: 1, 5, 6, 10. In each iteration, the pivot stays in the same place, and quick sort is called on 0 elements and (n - 1) elements respectively. We end up making `n - 1` nested calls, one for each element.

Hence, whenever the pivot ends up as the first or last element of its section, we get the worst case scenario for the partitioning.

This can be represented as `T(n) = T(n - 1) + O(n) = T(n - 1) + c*n = c*1 + c*2 + ... + c*n = O(n^2)`.
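One way to see the quadratic behaviour concretely is to count comparisons on an already-sorted input. A self-contained sketch (the counting wrapper is illustrative, not part of the algorithm):

```python
def count_comparisons(arr):
    """Run quick sort (last-element pivot, Lomuto partition) on arr in place
    and return the number of element comparisons performed."""
    count = 0

    def qs(low, high):
        nonlocal count
        if low >= high:
            return
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            count += 1  # one comparison per element examined
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        qs(low, i)          # left section
        qs(i + 2, high)     # right section, skipping the sorted pivot

    qs(0, len(arr) - 1)
    return count

print(count_comparisons(list(range(10))))  # → 45, i.e. 9 + 8 + ... + 1 = n(n-1)/2
```

On a sorted input the pivot never moves, so each level removes only one element and the comparison count reaches `n(n-1)/2`.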

## Space Complexity

The recursion stack depends on the depth of the recursion tree (i.e. how many levels it has).

In the best case, we get balanced partitioning, and the total height of the tree is `log(n)`.

In the worst case, we get unbalanced partitioning such that one partition always ends up with one element and the rest of the elements in the other partition. This means the height of the tree can grow to `O(n)`.

In the average case, the space complexity is around `O(log(n))`.

## Implementation
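A minimal in-place sketch in Python, using the Lomuto partition scheme described earlier with the last element of each section as the pivot (function names are illustrative):

```python
def partition(arr, low, high):
    """Lomuto partition around the pivot arr[high]; returns the pivot's final index."""
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort(arr, low=0, high=None):
    """Sort arr[low..high] in place by recursively partitioning each section."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)  # pivot lands at index p, already sorted
        quick_sort(arr, low, p - 1)    # sort the left section
        quick_sort(arr, p + 1, high)   # sort the right section

nums = [9, 0, 8, 2, 1, 7]
quick_sort(nums)
print(nums)  # → [0, 1, 2, 7, 8, 9]
```

Note that the recursion skips index `p`: the pivot is already in its final position, so only the sections on either side of it are sorted further.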

## Comparison with Merge Sort

For smaller inputs (perhaps 100 elements), quick sort runs "faster" than merge sort due to the reduced number of function calls.

Also, quick sort is in-place whereas merge sort is not.

## Some Scenarios

### Ascending Array

Each time, the pivot element is left unchanged and partition is called on the left section with `n - 1` elements. Time complexity is `O(n^2)`.

### Descending Array

Each time, the pivot element ends up on the leftmost side of the "section" we are partitioning. The next 2 partitions are of length 0 and `n - 1` respectively. Time complexity is `O(n^2)`.

### Array with all Elements of Same Value

Each time, the pivot will remain at its same position: no element compares as strictly smaller, so one section is always empty. Time complexity is `O(n^2)`.

### Unbalanced Division

No matter how we divide the sections, as long as the ratio is a fixed constant (perhaps 1:3, 1:99, or 1:999), we end up with a time complexity of `O(n * log(n))`.

Example: We always divide with a ratio of 1:4.

Time complexity can be represented as `T(n) = O(n) + T(n/5) + T(4n/5)`.

This is bounded by `T(n) <= O(n) + 2 * T(4n/5)`, which can be simplified to `O(n * log(n))`.
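The logarithmic depth can be checked numerically: if the larger side always keeps 4/5 of the elements, the number of levels grows like `log(n)` with base 5/4. A small illustrative sketch (the `depth` helper is hypothetical, not part of quick sort):

```python
import math

def depth(n):
    """Number of recursion levels if the larger side always holds 4/5 of the elements."""
    d = 0
    while n > 1:
        n = n * 4 // 5  # shrink to the larger (4/5) section
        d += 1
    return d

for n in (10, 100, 10_000, 1_000_000):
    # depth(n) stays proportional to log(n), base 5/4
    print(n, depth(n), round(math.log(n, 5 / 4), 1))
```

Even for a million elements the depth stays in the dozens, which is why any constant split ratio still yields `O(n * log(n))`.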