In this Introduction to Sorting Algorithms article, we'll be covering all the important and commonly used sorting algorithms. We'll explain each of them briefly, cover their best and worst case scenarios and Big O notation, and finally compare their performance.

Not all of them directly compete with each other; some are better under certain circumstances or have certain niche uses. Generally speaking though, some sorting algorithms are simply better than others, although the faster ones are usually harder to understand than the simpler ones. Extra complexity in exchange for speed is a fair trade, however, especially in performance-critical tasks.

*This is just an introduction article that aims to briefly touch on many topics. For detailed explanations with worked examples, refer to the individual tutorials for each algorithm via the included links.*

*You can also check out our YouTube series on Sorting Algorithms!*

### Sorting Algorithm Terminology

**Big O Notation:** A special kind of notation used to represent the time complexity and growth patterns of algorithms. Common examples of Big O notation are O(n²) and O(n log n). The "O" is part of the notation itself.

**In-place Sorting:** Sorting that takes place without allocating significant extra memory, typically within the same array/list, usually by swapping values. Examples of such algorithms are Quicksort and Insertion Sort.

**Out-of-place Sorting:** Sorting Algorithms that need extra memory while sorting. They typically create a new array/list into which the output is sorted.

**Recursive:** This is when a function calls itself within its body. Recursion typically takes extra memory due to the stacking of recursive calls in stack memory. However, modern compilers have optimizations that can automatically convert recursive code to iterative code at compile time, which removes the extra memory issue. Recursive solutions are popular due to their short and elegant ways of sorting.
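As a tiny illustration of the difference (a hypothetical helper, not one of the sorting algorithms below), here is the same sum written both ways:

```python
def sum_recursive(values):
    # Base case: an empty list sums to 0.
    if not values:
        return 0
    # Each call handles one element and recurses on the rest,
    # adding a new frame to the call stack.
    return values[0] + sum_recursive(values[1:])

def sum_iterative(values):
    # A plain loop does the same work with constant stack usage.
    total = 0
    for v in values:
        total += v
    return total
```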

**Iterative:** Uses regular loops to do the sorting; the standard, straightforward approach.

*You can learn more about Sorting Algorithms and their various types in this article.*

### Bubble sort

**Methodology:** The simplest kind of sort, often taught to beginners. It involves swapping two adjacent values in a list if the preceding value is greater than the next one. Two nested for loops iterate through the entire list of values, making this comparison between adjacent pairs and swapping them where required. After `n - 1` passes (where `n` is the length of the array), the array/list becomes fully sorted.
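The nested-loop structure described above can be sketched in Python (a minimal, unoptimized version):

```python
def bubble_sort(values):
    n = len(values)
    # After each of the n - 1 passes, the largest remaining
    # value "bubbles" to the end of the unsorted region.
    for i in range(n - 1):
        for j in range(n - 1 - i):
            # Swap adjacent values that are out of order.
            if values[j] > values[j + 1]:
                values[j], values[j + 1] = values[j + 1], values[j]
    return values
```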

**(Average)** **Big O notation:** O(n²)

**Type of Sort:** In-place

*Learn more about the Bubble Sort Algorithm!*

### Quick Sort

**Methodology:** As the name implies, it's a way of quickly sorting a list of values. Typically implemented recursively, Quicksort uses the concept of a pivot around which the values are sorted. The pivot is an arbitrary value picked out from the list (usually the first value). We then "figuratively" divide the list into two parts around the pivot.

After a series of comparisons and swaps that move smaller values to one side of the pivot and larger values to the other, we recursively call the Quicksort function on each half of the list (with the pivot as the middle value).

Quicksort's performance can degrade badly in its worst case scenario, but this can mostly be avoided using Randomized Quicksort, which minimizes the chance of encountering the worst case. Randomized Quicksort involves either shuffling the list or picking a random pivot, in an attempt to find a value near the median. The median of the array is the ideal pivot for the Quicksort algorithm.
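A short recursive sketch of randomized Quicksort follows. For readability it builds new lists during partitioning; real in-place implementations partition by swapping within the original array:

```python
import random

def quick_sort(values):
    # Lists of length 0 or 1 are already sorted.
    if len(values) <= 1:
        return values
    # Randomized pivot choice to minimize the chance
    # of hitting the worst case.
    pivot = random.choice(values)
    left = [v for v in values if v < pivot]
    middle = [v for v in values if v == pivot]
    right = [v for v in values if v > pivot]
    # Recursively sort each side of the pivot.
    return quick_sort(left) + middle + quick_sort(right)
```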

**Type of Sort:** In-place

**(Average)** **Big O notation:** O(n log n)

*Learn more about the Quick Sort Algorithm + Code!*

### Insertion sort

The Insertion Sort algorithm works off the concept of dividing the array into two parts. The first part is sorted, while the second is not. The first step in any Insertion Sort algorithm is to treat the first element in the array as the sorted part. Basically, you can envision a separating line between the first element and the rest of the values.

We then begin inserting values from the unsorted part into the sorted part. We do so by picking the first element of the unsorted part and comparing it with the last element of the sorted part. If the new element is less than that value, we move it back one position. We keep moving it like this until the value behind it is no greater than the new element. We repeat this process until there are no elements left in the unsorted part.
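The insertion process above can be sketched like this (a minimal version that shifts sorted values right instead of swapping repeatedly):

```python
def insertion_sort(values):
    # Elements before index i form the sorted part.
    for i in range(1, len(values)):
        current = values[i]
        j = i - 1
        # Shift larger sorted values one slot to the right
        # until the correct position for `current` is found.
        while j >= 0 and values[j] > current:
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = current
    return values
```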

**Type of Sort:** In-Place

**(Average)** **Big O notation:** O(n²)

*Learn more about the Insertion Sort Algorithm + Code!*

### Selection Sort

The Selection Sort Algorithm is one of the simpler algorithms out there that you can use for sorting purposes. It uses a very simple logic pattern: find the minimum value among the items being sorted, then move it to the start of the list. In the next iteration, we narrow our range to exclude the first (now sorted) minimum value and proceed to locate the next minimum. We then place this minimum at the start of our search range, which means it comes right after the first minimum we found. This process continues till all values have been sorted.
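The select-and-swap pattern described above can be sketched as:

```python
def selection_sort(values):
    n = len(values)
    for i in range(n - 1):
        # Find the minimum of the remaining unsorted range.
        min_index = i
        for j in range(i + 1, n):
            if values[j] < values[min_index]:
                min_index = j
        # Move it to the front of that range.
        values[i], values[min_index] = values[min_index], values[i]
    return values
```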

**Type of Sort:** In-Place

**(Average)** **Big O notation:** O(n²)

*Learn more about the Selection Sort Algorithm + Code!*

### Merge Sort

**Methodology:** A rather unique type of sort which takes two lists (if you have one, just split it around the center) and then merges them in a sorted manner. The list is continuously halved until only single elements remain. These individual elements are then compared with each other and placed into a new list, starting from the smallest element.

Despite being more consistent than sorts like Quicksort (its worst case is no slower than its average case), Merge Sort uses more memory, as it needs to create a new array/list for the final values. (It doesn't sort in place.)
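The halving-and-merging process can be sketched as a short recursive function:

```python
def merge_sort(values):
    # Single elements (or empty lists) are already sorted.
    if len(values) <= 1:
        return values
    # Split around the center and sort each half.
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    # Merge the two sorted halves into a new list,
    # always taking the smaller front element first.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One side may still have leftover (already sorted) values.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```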

**Type of Sort:** Out-of-Place

**(Average)** **Big O notation:** O(n log n)

*Learn more about the Merge Sort Algorithm + Code!*

### Bucket Sort

Bucket Sort has several unique features. For one, it requires the use of another sorting algorithm to completely sort the data. Secondly, it’s also not a comparison based algorithm. Thirdly, it’s very similar to Radix Sort, and the initial steps are actually the same.

The basic premise of Bucket Sort is to create a certain number of buckets and then distribute all the values into them according to a small formula. Another sorting algorithm is then applied to each of these buckets, which in reality are just 1D lists. The overall sorting process is fast, as it's much quicker to sort many smaller lists than one large list. (This is because sorting algorithms with time complexities like O(n²) slow down drastically as `n` grows.)
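A minimal sketch, assuming the input values are floats in the range [0, 1) so the distribution formula is just `value * bucket_count` (other ranges would need scaling first). Python's built-in `sorted` stands in for the secondary sorting algorithm:

```python
def bucket_sort(values, bucket_count=10):
    # Sketch assuming values are floats in the range [0, 1).
    if not values:
        return []
    buckets = [[] for _ in range(bucket_count)]
    # Distribute each value into a bucket with a small formula.
    for v in values:
        index = min(int(v * bucket_count), bucket_count - 1)
        buckets[index].append(v)
    # Sort each small bucket (any sorting algorithm works here),
    # then concatenate the buckets in order.
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))
    return result
```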

**Type of Sort:** Out-of-Place

**(Average)** **Big O notation:** O(n + k)

*Learn more about the BucketSort Algorithm + Code!*

### Radix Sort

**Methodology:** One interesting thing about Radix Sort is that, unlike most sorting algorithms, it is not comparison-based, meaning no direct comparisons are carried out between the values being sorted. It uses the concept of "passes", based on the number of digits in the largest number, together with the creation of buckets. The number of buckets depends on the type of data being sorted: 10 buckets for decimal numbers, and 26 for characters.

In Radix Sort, we sort digit by digit rather than by whole numbers. In the first pass, the units digit (first from the right) of each number is found and the number is placed in the corresponding bucket (0–9). In the second pass, numbers are placed in buckets based on the tens digit (second from the right), and so on.
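The pass-by-pass bucketing can be sketched like this, assuming non-negative integers (a common simplification; negatives need extra handling):

```python
def radix_sort(values):
    # Sketch for non-negative integers, using 10 buckets per pass.
    if not values:
        return []
    max_value = max(values)
    place = 1
    # One pass per digit of the maximum number.
    while max_value // place > 0:
        buckets = [[] for _ in range(10)]
        for v in values:
            # Digit at the current place (units, tens, ...).
            buckets[(v // place) % 10].append(v)
        # Collect the buckets in order before the next pass.
        values = [v for bucket in buckets for v in bucket]
        place *= 10
    return values
```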

**Type of Sort:** Out-of-Place

**(Average)** **Big O notation:** O(n*k)

*Learn more about the Radix Sort Algorithm + Code!*

### Heap Sort

A Max Heap is a type of data structure where we create a binary-tree-like pattern, comprising nodes which each have up to two children. The rule is that each node must be larger than its children. The node at the top is known as the root node, as it's the root (parent) of everything in the heap. Before beginning the actual sorting, we swap the values in the list until we get a proper max heap.

The Heap Sort algorithm uses the Max Heap concept to sort the array. This is possible because we know the first element in the array, also known as the root node, is the largest. By continuously moving the root node to the end of the array and recreating the heap (while ignoring the values already moved), we end up with a fully sorted array.
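The build-heap / move-root / re-heapify cycle can be sketched as follows, with a small "sift down" helper that restores the max-heap rule for one subtree:

```python
def sift_down(values, root, end):
    # Restore the max-heap property for the subtree at `root`,
    # considering only indices below `end`.
    while True:
        child = 2 * root + 1
        if child >= end:
            return
        # Pick the larger of the two children.
        if child + 1 < end and values[child + 1] > values[child]:
            child += 1
        if values[root] >= values[child]:
            return
        values[root], values[child] = values[child], values[root]
        root = child

def heap_sort(values):
    n = len(values)
    # Build a max heap: the largest value ends up at index 0.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(values, i, n)
    # Repeatedly move the root to the end of the array and
    # re-heapify the shrinking unsorted region.
    for end in range(n - 1, 0, -1):
        values[0], values[end] = values[end], values[0]
        sift_down(values, 0, end)
    return values
```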

**Type of Sort:** In-Place

**(Average)** **Big O notation:** O(n log n)

*Learn more about the Heap Sort Algorithm + Code!*

### Counting Sort Algorithm

Another one of the non-comparison-based sorting algorithms.

It makes several passes over the list of numbers throughout the sorting process. The first pass is required to find the maximum number in the list. We then use this number to generate a counts list of size max + 1. Next, we count the number of times each number appears in the array and store these counts in the list we just created. After completing the counts list, we iterate over the whole thing, cumulatively summing up all the count values inside it.

Finally, by using a little formula we are able to correctly determine the index of each number within a newly created array of size n, where n is the number of elements in the original array. This new array is a fully sorted version of the original array of unsorted numbers.
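The passes described above can be sketched like this, assuming non-negative integers (iterating the input in reverse in the final pass is what makes the sort stable):

```python
def counting_sort(values):
    # Sketch for non-negative integers.
    if not values:
        return []
    # Pass 1: find the maximum to size the counts list.
    max_value = max(values)
    counts = [0] * (max_value + 1)
    # Pass 2: count occurrences of each value.
    for v in values:
        counts[v] += 1
    # Pass 3: cumulative sums turn counts into end positions.
    for i in range(1, len(counts)):
        counts[i] += counts[i - 1]
    # Pass 4: place each value at its correct index in a new
    # output list of size n.
    output = [0] * len(values)
    for v in reversed(values):
        counts[v] -= 1
        output[counts[v]] = v
    return output
```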

**Type of Sort:** Out-of-Place

**(Average)** **Big O notation:** O(n + k)

*Learn more about the Counting Sort Algorithm + Code!*

### Sorting Comparison (Table)

A table that shows the time complexities of some of the most commonly used sorting algorithms. Time complexity is the first thing you should check when comparing two sorting algorithms. The lower the time complexity, the better.

Sorting Algorithm | Average Case | Best Case | Worst Case |
---|---|---|---|
Bubble Sort | O(n²) | O(n) | O(n²) |
Insertion Sort | O(n²) | O(n) | O(n²) |
Selection Sort | O(n²) | O(n²) | O(n²) |
Quick Sort | O(n log n) | O(n log n) | O(n²) |
Merge Sort | O(n log n) | O(n log n) | O(n log n) |
Heap Sort | O(n log n) | O(n log n) | O(n log n) |
Counting Sort | O(n + k) | O(n + k) | O(n + k) |
Radix Sort | O(n*k) | O(n*k) | O(n*k) |
Bucket Sort | O(n + k) | O(n + k) | O(n²) |

There are many other factors on the basis of which sorting algorithms are compared, such as space complexity, stability, and in-place vs out-of-place behavior. For a more detailed comparison, check out our separate dedicated article on Comparison of Sorting Algorithms, which also includes actual real-world tests to compare speed.

You can also use this site to visually see the computing time difference and sorting process between all the types of sorting mentioned in this article. Give it a look as it will help improve your concepts.

This marks the end of the Introduction to Sorting Algorithm article. Any suggestions or contributions for CodersLegacy are more than welcome. Questions regarding the tutorial content can be asked in the comments section below.