Insertion Sort

Overview

  • Time Complexity: Best - O(n), Worst - O(n^2)
  • Space Complexity: O(1)
  • In-Place
  • Stable

General Idea

Think of playing cards. We usually hold the cards in a sorted manner (ascending order from left to right).

Note: At any point in time, the cards on the left of a particular card will all be sorted.

When we pick a new card, we need to check where exactly to place this card among the existing cards. Insertion sort works the same way.

Example: Current array: 6, 3, 4, 1, 8, 0

  • Consider just the first element, 6. This alone is already sorted since there is just one element.
  • Pick up the next card - 3. Check whether 3 is smaller than the element to its left, 6. It is, so we shift 6 one slot right, into index 1. We have now reached the start of the array (index 0), so we place 3 there. Current order: 3, 6, 4, 1, 8, 0
  • Pick up the next card - 4. Check if 4 is smaller than 6. It is, so we shift 6 one slot right. Check if 4 is smaller than 3. It is not, so we insert 4 at index 1. Current order: 3, 4, 6, 1, 8, 0

Repeat the above for the remaining elements, always comparing backwards, to the left of the element in question. The sketch below traces each pass.
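
The following is a minimal sketch mirroring the implementation shown later, with a print added so the walkthrough above can be traced pass by pass (the insertion_sort_trace name is ours):

def insertion_sort_trace(arr):
    for i in range(1, len(arr)):
        key = arr[i]                      # the "card" being placed
        j = i - 1
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]           # shift the larger element right
            j -= 1
        arr[j + 1] = key
        print(f"after pass {i}: {arr}")

insertion_sort_trace([6, 3, 4, 1, 8, 0])
# after pass 1: [3, 6, 4, 1, 8, 0]
# after pass 2: [3, 4, 6, 1, 8, 0]
# after pass 3: [1, 3, 4, 6, 8, 0]
# after pass 4: [1, 3, 4, 6, 8, 0]
# after pass 5: [0, 1, 3, 4, 6, 8]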

Time Complexity

Worst Case

The worst case occurs when the array is sorted in reverse order, as this requires the greatest number of comparisons and shifts.

Consider this:

We skip the element at index 0, since a single element is trivially sorted.

For the first element we consider, at index 1, the worst case requires one comparison and one move-copy of the element, which yields 2 operations.

For the second element, at index 2, the worst case requires two comparisons and two move-copies, which yields 4 operations.

Hence, this can be generalised: the element at index k needs 2k operations in the worst case, so the final element, at index n-1, needs 2(n-1).

Summing these operations up, i.e. 2(1 + 2 + ... + (n-1)), we get n^2 - n, which is of order O(n^2) in the worst case.
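
As a sanity check on the n^2 - n count, here is a small sketch (the count_operations name is illustrative) that tallies comparisons and move-copies separately:

def count_operations(arr):
    comparisons = moves = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1              # one comparison against arr[j]
            if key < arr[j]:
                arr[j + 1] = arr[j]       # one move-copy
                moves += 1
                j -= 1
            else:
                break
        arr[j + 1] = key
    return comparisons, moves

print(count_operations([6, 5, 4, 3, 2, 1]))   # (15, 15): 30 = 6^2 - 6 operations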

Best Case

The best case occurs when the array is already sorted in the order we want. This means we are eliminating the "moving and copying" phase entirely. We will only be doing comparisons.

We skip the element at index 0, since a single element is trivially sorted.

For the first element we consider (index 1), we do 1 comparison and no movement. For the second element (index 2), we again do just 1 comparison: the portion to its left is already sorted, and since the element is not smaller than its left neighbour, the backwards scan stops immediately.

Summing this up, we obtain a total of 1 + 1 + ... + 1 = (n-1) operations. We have done (n-1) comparisons and zero movement of the elements.

Hence, the best-case time complexity simplifies to Ω(n).
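
Reusing the illustrative count_operations sketch from the worst-case section on an already-sorted array confirms this:

print(count_operations([1, 2, 3, 4, 5, 6]))   # (5, 0): n-1 comparisons, zero moves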

Space Complexity

No matter the size of the array n, we only need one temporary variable to hold the element currently being inserted, plus two integer indices for the loops - i.e. 3 variables. Since this is a constant number, the space complexity is of order O(1) - constant space.

Implementation

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]                  # the element being inserted this pass
        j = i - 1
        # shift every element greater than key one slot to the right
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key              # drop key into its sorted position
    return arr
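
A quick usage example; note that the strict comparison key < arr[j] stops shifting at the first element equal to the key, which is what keeps the sort stable:

print(insertion_sort([6, 3, 4, 1, 8, 0]))   # [0, 1, 3, 4, 6, 8]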

Possible Enhancements & Limitations

  • When comparing, we can use the fact that the left portion of the array is always sorted. This opens up the possibility of finding the insertion point with binary search (O(log n)) instead of a linear scan (O(n)). However, the number of element movements remains of order O(n) per insertion, so the overall worst case is still O(n^2); see the sketch after this list.
  • Using a doubly linked list. Here, the insertion itself takes only O(1), since we just create a new node and relink its neighbours. But the search for the correct place of insertion remains O(n), as we cannot apply binary search to a linked list.
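
As a sketch of the first enhancement, binary insertion sort can be written with Python's standard bisect module (the binary_insertion_sort name is our own). The search for the insertion point drops to O(log n) per element, but the slice shift is still O(n), so the overall worst case remains O(n^2):

import bisect

def binary_insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        # bisect_right keeps the sort stable: equal keys stay in their original order
        pos = bisect.bisect_right(arr, key, 0, i)
        arr[pos + 1:i + 1] = arr[pos:i]   # shift arr[pos:i] one slot to the right
        arr[pos] = key
    return arr

print(binary_insertion_sort([6, 3, 4, 1, 8, 0]))   # [0, 1, 3, 4, 6, 8]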