Welcome to our Comparison of Sorting Algorithms article. Here, we'll be comparing the most commonly used sorting algorithms on the basis of several factors:

- Time Complexity
- Space Complexity
- Stable/Unstable
- Actual Field Tests

We'll top it all off by describing where each algorithm is best suited, along with its strong and weak points. Every algorithm is unique and performs best under certain circumstances.

### Comparison of Time Complexity

Below is a table showing the time complexities of some of the most commonly used sorting algorithms. Time complexity is the first thing you should check when comparing two sorting algorithms. Generally, the lower the time complexity, the better.

Sorting Algorithm | Average Case | Best Case | Worst Case |
---|---|---|---|
Bubble Sort | O(n²) | O(n) | O(n²) |
Insertion Sort | O(n²) | O(n) | O(n²) |
Selection Sort | O(n²) | O(n²) | O(n²) |
Quick Sort | O(n log n) | O(n log n) | O(n²) |
Merge Sort | O(n log n) | O(n log n) | O(n log n) |
Heap Sort | O(n log n) | O(n log n) | O(n log n) |
Counting Sort | O(n + k) | O(n + k) | O(n + k) |
Radix Sort | O(n · k) | O(n · k) | O(n · k) |
Bucket Sort | O(n + k) | O(n + k) | O(n²) |

In the table above, the O(n²) algorithms are the slowest, the O(n log n) algorithms form the middle ground, and the near-linear complexities such as O(n + k) are the fastest. (Note that O(n log n) is the best achievable for comparison-based sorting; the linear-time entries are non-comparison sorts, which make extra assumptions about the data.)

Later, when we run the actual field tests, you can use this table as a reference. You will notice how much of an impact time complexity has on performance.
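To get a feel for what these complexity classes mean at the input sizes used in the field tests later, here is a quick back-of-the-envelope comparison. These are illustrative operation counts derived purely from the formulas, not measured timings:

```python
import math

# Rough operation counts for the input sizes used in the field tests below.
for n in (100, 1_000, 10_000):
    quadratic = n ** 2                 # e.g. Bubble/Insertion/Selection Sort
    linearithmic = n * math.log2(n)    # e.g. Merge/Heap/Quick Sort (average)
    print(f"n={n:>6}: n^2 = {quadratic:>12,}   n*log2(n) = {linearithmic:>10,.0f}")
```

Notice that every 10x increase in n multiplies the n² count by 100x, while the n log n count grows only a little faster than 10x. Keep that ratio in mind when reading the timing tables.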

### Comparison of Space Complexity

While speed is important and usually your top priority, in memory-constrained environments algorithms with low memory costs are preferred.

The table below shows the space complexity of the various sorting algorithms. You might notice that the algorithms with higher space complexities are the "out-of-place" ones, while those with the lowest are in-place. This is because out-of-place algorithms create extra arrays to store the data, while in-place algorithms work within the original array.

It goes without saying that the best space complexity is O(1).

Sorting Algorithm | Space Complexity |
---|---|
Bubble Sort | O(1) |
Insertion Sort | O(1) |
Selection Sort | O(1) |
Quick Sort | O(log n) |
Merge Sort | O(n) |
Heap Sort | O(1) |
Counting Sort | O(k) |
Radix Sort | O(n + k) |
Bucket Sort | O(n) |
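To make the in-place vs. out-of-place distinction concrete, here is a minimal sketch: insertion sort rearranges elements within the original list (O(1) extra space), while this merge sort allocates new lists at every level of recursion (O(n) extra space).

```python
def insertion_sort(arr):
    """In-place: rearranges arr itself, using O(1) extra space."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements right to make room for key.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr

def merge_sort(arr):
    """Out-of-place: builds new lists while merging, using O(n) extra space."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Note how `insertion_sort` returns the very same list object it was given, while `merge_sort` returns a freshly allocated one.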

### Stable and Unstable Algorithms

Stability is a rather niche property and only makes a practical difference for certain kinds of data, for example when sorting records by one key while preserving the existing order of another. However, it remains an important requirement in those scenarios.

Sorting Algorithm | Stable Sort? |
---|---|
Bubble Sort | Yes |
Insertion Sort | Yes |
Selection Sort | No |
Quick Sort | No |
Merge Sort | Yes |
Heap Sort | No |
Counting Sort | Yes |
Radix Sort | Yes |
Bucket Sort | Yes |

It is important to note, however, that stable variants of most of the above algorithms can be created. The table above refers to the "classic" versions of each algorithm.
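Here is a quick way to see stability in action, using Python's built-in `sorted()`, which is guaranteed stable (it uses Timsort, a merge sort variant):

```python
# Records: (name, score). Two pairs of records share a score.
records = [("alice", 90), ("bob", 85), ("carol", 90), ("dave", 85)]

# A stable sort keeps records with equal scores in their original
# relative order: bob before dave, alice before carol.
by_score = sorted(records, key=lambda r: r[1])
print(by_score)
# -> [('bob', 85), ('dave', 85), ('alice', 90), ('carol', 90)]
```

An unstable sort would have been free to swap `("alice", 90)` and `("carol", 90)`, which matters when the original order is itself meaningful.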

*You can also check out our YouTube series on Sorting Algorithms!*

### Sorting Algorithms – Field Tests

Finally, we are going to measure the main component: performance. We have tested the 9 algorithms featured here under a variety of circumstances, from 100 numbers up to 10,000, as well as tests using already-sorted data. These tests will reveal quite a bit.

#### Testing Method

I've used Google Colab to run these tests, to ensure a consistent and fair testing environment. To be clear, the code for these sorting algorithms was written in Python in a fairly standard manner. No heavy optimizations were applied, and only the standard (classic) versions of the algorithms were used.

The Python timeit and random libraries were used to generate the random data and to perform repeated tests for each algorithm. This is, again, to ensure fair results. The random library generates numbers from 1 to 10,000, and the timeit library runs each test 5 times in total, returning a list of all 5 timings. We've displayed both the max and min values of the 5 runs, so you can see the spread of times.

Each of the 5 tests actually runs the code 10 times (the number parameter, whose default value is 1,000,000). This improves accuracy by performing many runs and summing the values to average out noise. If you want the time for one single sort, divide the min/max value by 10. The number of repetitions is controlled by the repeat parameter (default value 5).

You can see the code for the testing function below. If you read the documentation for the timeit and random libraries, you can learn more about what's going on here.

```python
import timeit

def test():
    SETUP_CODE = '''
from __main__ import sort
from random import randint'''

    TEST_CODE = '''
array = []
for x in range(1000):
    array.append(randint(1, 10000))
sort(array)
'''
    # number=10 runs the statement 10 times per repeat;
    # repeat=5 returns a list of 5 cumulative timings.
    times = timeit.repeat(setup=SETUP_CODE,
                          stmt=TEST_CODE,
                          number=10,
                          repeat=5)

    print('Min Time: {}'.format(min(times)))
    print('Max Time: {}'.format(max(times)))
```
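As a sketch of how this harness plugs into an actual algorithm: the snippet below uses timeit's `globals` parameter (available since Python 3.5) in place of the `from __main__ import sort` setup string, with the built-in `list.sort()` standing in for a hand-written algorithm.

```python
import timeit
from random import randint

def sort(array):
    # Stand-in for any of the nine algorithms; swap in a
    # hand-written implementation to benchmark it instead.
    array.sort()

# globals=globals() lets the timed statement see sort() and randint()
# directly, so no setup import string is needed.
times = timeit.repeat(
    stmt="sort([randint(1, 10000) for _ in range(1000)])",
    globals=globals(),
    number=10,
    repeat=5,
)
print("Min Time: {:.5f}".format(min(times)))
print("Max Time: {:.5f}".format(max(times)))
```

Note that the random array is built inside the timed statement, matching the original harness, so array construction time is included in both cases.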

#### Sorting Algorithms – Performance Comparison

In this section, we conduct three sets of tests: the first with 100 random numbers, the second with 1,000, and the third with 10,000. Take a good look at the table, compare the time complexities, and make your own observations. I'll share mine right after.

Sorting Algorithm | Test 1 (100) | Test 2 (1,000) | Test 3 (10,000) |
---|---|---|---|
Bubble Sort | Min: 0.01008 s, Max: 0.0206 s | Min: 1.0242 s, Max: 1.0558 s | Min: 100.922 s, Max: 102.475 s |
Insertion Sort | Min: 0.00306 s, Max: 0.00650 s | Min: 0.0369 s, Max: 0.0562 s | Min: 100.422 s, Max: 102.344 s |
Selection Sort | Min: 0.00556 s, Max: 0.00946 s | Min: 0.4740 s, Max: 0.4842 s | Min: 40.831 s, Max: 41.218 s |
Quick Sort | Min: 0.00482 s, Max: 0.01141 s | Min: 0.0370 s, Max: 0.0383 s | Min: 0.401 s, Max: 0.420 s |
Merge Sort | Min: 0.00444 s, Max: 0.00460 s | Min: 0.0561 s, Max: 0.0578 s | Min: 0.707 s, Max: 0.726 s |
Heap Sort | Min: 0.00489 s, Max: 0.00510 s | Min: 0.0704 s, Max: 0.0747 s | Min: 0.928 s, Max: 0.949 s |
Counting Sort | Min: 0.01929 s, Max: 0.02052 s | Min: 0.0354 s, Max: 0.0400 s | Min: 0.195 s, Max: 0.203 s |
Radix Sort | Min: 0.00315 s, Max: 0.00394 s | Min: 0.0294 s, Max: 0.0309 s | Min: 0.313 s, Max: 0.338 s |
Bucket Sort | Min: 0.00225 s, Max: 0.00241 s | Min: 0.0335 s, Max: 0.0369 s | Min: 1.854 s, Max: 1.892 s |

I also wanted to include tests for 100,000 and 1,000,000 numbers, but the O(n²) algorithms were taking far too long to complete, so those sizes were dropped.

#### Observations

- The O(n²) algorithms (Bubble, Insertion, and Selection Sort) reacted very poorly as the input size grew to 10,000. At 10,000 numbers, the other algorithms were, on average, over 100x faster.
- On the test cases with just 100 numbers, the O(n²) algorithms were faster than the O(n log n) algorithms.
- With every 10x increase in the number of elements, the completion time of the O(n²) algorithms increased by roughly 100x.
- Radix Sort and Counting Sort were, on average, the fastest algorithms.
- Heap Sort is the fastest algorithm with a space complexity of O(1).
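Since that last point often surprises people, here is a minimal in-place heapsort sketch showing that it needs no auxiliary array; all the work happens inside the original list:

```python
def heap_sort(arr):
    """In-place heapsort: O(n log n) time, O(1) extra space."""
    n = len(arr)

    def sift_down(root, end):
        # Restore the max-heap property for the subtree rooted at `root`,
        # considering only indices up to `end`.
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and arr[child] < arr[child + 1]:
                child += 1  # pick the larger child
            if arr[root] < arr[child]:
                arr[root], arr[child] = arr[child], arr[root]
                root = child
            else:
                return

    # Build a max-heap, then repeatedly move the max to the end.
    for start in range(n // 2 - 1, -1, -1):
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end - 1)
    return arr
```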

#### Sorted Data Comparison

Another very interesting case is when sorted data is used instead of random data. This test is mainly to show which sorting algorithms perform well on sorted/partially sorted data, and which perform worse.

Sorting Algorithm | Sorted Data (1,000) |
---|---|
Bubble Sort | Min: 0.542 s, Max: 0.556 s |
Insertion Sort | Min: 0.790 s, Max: 0.821 s |
Selection Sort | Min: 0.434 s, Max: 0.464 s |
Quick Sort | Min: 0.812 s, Max: 0.872 s |
Merge Sort | Min: 0.0289 s, Max: 0.0364 s |
Heap Sort | Min: 0.0604 s, Max: 0.0661 s |
Counting Sort | Min: 0.0055 s, Max: 0.0124 s |
Radix Sort | Min: 0.0119 s, Max: 0.0145 s |
Bucket Sort | Min: 0.0183 s, Max: 0.0247 s |

#### Observations

- Surprise, surprise: Quick Sort doesn't live up to its name, and is the slowest of all the above algorithms on 1,000 sorted numbers. This is because the classic Quick Sort degrades on degenerate cases like already-sorted input, and requires optimizations such as randomized pivot selection to handle them.
- With the exceptions of Quick Sort and Insertion Sort, the time required dropped for all algorithms compared to the random-data tests.
- Counting Sort performs the best, followed by Radix Sort and Bucket Sort.
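The randomized-pivot fix mentioned above can be sketched as follows. Note this is an out-of-place version, kept short for clarity rather than the classic in-place partition scheme:

```python
import random

def quick_sort(arr):
    """Quicksort with a randomized pivot. Choosing the pivot at random
    avoids the O(n^2) degradation on already-sorted input that a fixed
    first- or last-element pivot suffers from."""
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```

With a random pivot, sorted input is no longer a worst case; the expected running time is O(n log n) regardless of the initial ordering.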

This marks the end of the Sorting Algorithms Comparison article. Any suggestions or contributions for CodersLegacy are more than welcome. Questions regarding the tutorial content can be asked in the comments section below.
