Time Complexity Overview

Time complexity is a critical concept when evaluating the performance of sorting algorithms. It measures how an algorithm's running time grows as a function of the length of the input list. These bounds are language-agnostic, so they apply to Python implementations just as to any other. The table below summarizes the best-, average-, and worst-case time complexity of common sorting algorithms, along with their memory use, stability, and underlying method:
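
As a concrete illustration of how best- and worst-case bounds arise, here is a minimal Python sketch of bubble sort with the early-exit optimization that gives it its O(n) best case (one pass over an already-sorted list) while the worst case remains O(n^2):

```python
def bubble_sort(items):
    """Sort a list in place. O(n) best case, O(n^2) average/worst, O(1) memory."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # Each pass bubbles the largest remaining element to position n - 1 - i.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            # No swaps in a full pass means the list is already sorted.
            break
    return items
```

Because equal elements are never swapped past each other, this sort is also stable, matching the table entry below.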

| Algorithm            | Best               | Average            | Worst              | Memory        | Stable | Method                        |
|----------------------|--------------------|--------------------|--------------------|---------------|--------|-------------------------------|
| Bubble Sort          | O(n)               | O(n^2)             | O(n^2)             | O(1)          | Yes    | Comparison-based, Exchange    |
| Selection Sort       | O(n^2)             | O(n^2)             | O(n^2)             | O(1)          | No     | Comparison-based, Selection   |
| Insertion Sort       | O(n)               | O(n^2)             | O(n^2)             | O(1)          | Yes    | Comparison-based, Insertion   |
| Merge Sort           | O(n log n)         | O(n log n)         | O(n log n)         | O(n)          | Yes    | Divide and Conquer, Merging   |
| Quick Sort           | O(n log n)         | O(n log n)         | O(n^2)             | O(log n)      | No     | Divide and Conquer, Partition |
| Heap Sort            | O(n log n)         | O(n log n)         | O(n log n)         | O(1)          | No     | Comparison-based, Heap        |
| Shell Sort           | O(n log n)         | O(n (log n)^2)     | O(n (log n)^2)     | O(1)          | No     | Comparison-based, Insertion   |
| Radix Sort           | O(nk)              | O(nk)              | O(nk)              | O(n + k)      | Yes    | Non-comparison, Distribution  |
| Counting Sort        | O(n + k)           | O(n + k)           | O(n + k)           | O(k)          | Yes    | Non-comparison, Counting      |
| Bucket Sort          | O(n + k)           | O(n + k)           | O(n^2)             | O(n)          | Yes    | Non-comparison, Distribution  |
| Tim Sort             | O(n)               | O(n log n)         | O(n log n)         | O(n)          | Yes    | Hybrid (Merge + Insertion)    |
| Comb Sort            | O(n log n)         | O(n^2 / 2^p)       | O(n^2)             | O(1)          | No     | Comparison-based, Exchange    |
| Pigeonhole Sort      | O(n + k)           | O(n + k)           | O(n + k)           | O(n + k)      | Yes    | Non-comparison, Distribution  |
| Cycle Sort           | O(n^2)             | O(n^2)             | O(n^2)             | O(1)          | No     | Comparison-based, In-place    |
| Cocktail Shaker Sort | O(n)               | O(n^2)             | O(n^2)             | O(1)          | Yes    | Comparison-based, Exchange    |
| Gnome Sort           | O(n)               | O(n^2)             | O(n^2)             | O(1)          | Yes    | Comparison-based, Insertion   |
| Odd-Even Sort        | O(n)               | O(n^2)             | O(n^2)             | O(1)          | Yes    | Comparison-based, Parallel    |
| Bitonic Sort         | O(n log^2 n)       | O(n log^2 n)       | O(n log^2 n)       | O(n log^2 n)  | No     | Comparison-based, Parallel    |
| Bogo Sort            | O(n)               | O((n+1)!)          | Unbounded          | O(1)          | No     | Random-based, Brute-force     |
| Stooge Sort          | O(n^(log 3 / log 1.5)) | O(n^(log 3 / log 1.5)) | O(n^(log 3 / log 1.5)) | O(n)  | No     | Comparison-based, Recursive   |

Notes: k denotes the range of key values (or digit count for Radix Sort), and p is the number of increments in Comb Sort. The Bitonic Sort entries are the sequential costs; as a sorting network it runs in O(log^2 n) parallel time. Bogo Sort's randomized worst case is unbounded, since there is no guarantee a random shuffle ever produces sorted order.
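
To contrast with the quadratic exchange sorts, a minimal merge sort sketch shows where the table's O(n log n) time and O(n) memory come from: the list is halved about log n times, and each level of recursion does O(n) merging work into an auxiliary list:

```python
def merge_sort(items):
    """Return a new sorted list. O(n log n) in all cases, O(n) extra memory."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])   # sort each half recursively
    right = merge_sort(items[mid:])
    # Merge the two sorted halves; taking from `left` on ties keeps the
    # original relative order of equal elements, so the sort is stable.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

In practice, Python's built-in `sorted()` and `list.sort()` use Tim Sort, the hybrid listed above, which builds on the same stable-merge idea while exploiting already-sorted runs for its O(n) best case.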