Sorting Algorithms: A Refresher
- mosesg1123
- Mar 28
Sorting is a fundamental topic in computer science and a common subject in technical coding interviews. Whether you're a new engineer brushing up on core concepts or an experienced one looking for a quick review, this guide covers the essential sorting algorithms, their time complexities, and when to use them.
1. Bubble Sort
Time Complexity: O(n²) worst and average case, O(n) best case (already sorted, with the early-exit optimization)
Space Complexity: O(1) (in-place)
Stable: Yes
When to Use: Rarely in practice due to its inefficiency, but useful for learning purposes.
How It Works:
Repeatedly swap adjacent elements if they are in the wrong order. The largest elements "bubble up" to the end of the list.
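A minimal sketch in Python (the function name and the early-exit flag are my own choices, not a reference implementation):

```python
def bubble_sort(arr):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest unsorted element has settled at index n-1-i.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            # No swaps means the list is sorted: this gives the O(n) best case.
            break
    return arr
```

The `swapped` flag is what turns an already-sorted input into a single O(n) pass.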
2. Selection Sort
Time Complexity: O(n²) for all cases
Space Complexity: O(1) (in-place)
Stable: No
When to Use: When memory is very limited and swapping is costly.
How It Works:
Find the smallest element in the unsorted portion and swap it with the first unsorted element.
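In code, that idea might look like the sketch below (again, an illustrative implementation, not from any particular library):

```python
def selection_sort(arr):
    """Sort a list in place, doing at most n-1 swaps."""
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted suffix.
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # One swap per pass; the long-distance swap is why this sort is not stable.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```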
3. Insertion Sort
Time Complexity: O(n²) worst and average case, O(n) best case (already sorted)
Space Complexity: O(1) (in-place)
Stable: Yes
When to Use: When the array is small or nearly sorted.
How It Works:
Build the sorted array one element at a time by picking the next element and inserting it in the correct position.
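A possible Python version of that insert-into-place loop:

```python
def insertion_sort(arr):
    """Sort a list in place by growing a sorted prefix one element at a time."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements one slot right to make room for key.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```

On a nearly sorted input the inner `while` loop barely runs, which is where the O(n) best case comes from.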
4. Merge Sort
Time Complexity: O(n log n) for all cases
Space Complexity: O(n) (not in-place)
Stable: Yes
When to Use: When stability matters and extra memory usage is not a concern.
How It Works:
Divide the array into halves, recursively sort each half, and merge the sorted halves back together.
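One way to write that divide-and-merge recursion (this version returns a new list rather than sorting in place, which is where the O(n) extra space goes):

```python
def merge_sort(arr):
    """Return a new sorted list via recursive split-and-merge."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves; <= keeps equal elements from the left
    # half first, which is what makes merge sort stable.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```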
5. Quick Sort
Time Complexity: O(n²) worst case (rare), O(n log n) average and best case
Space Complexity: O(log n) on average for the recursion stack (O(n) in the worst case)
Stable: No
When to Use: When fast average-case sorting is required and space constraints are tight.
How It Works:
Pick a pivot element, partition the array into elements less than and greater than the pivot, and recursively sort both partitions.
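A sketch using Lomuto partitioning with a random pivot (one of several common partition schemes; the randomization is what makes the O(n²) worst case unlikely in practice):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """Sort arr[lo..hi] in place by recursive partitioning."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort(arr, lo, p - 1)
        quick_sort(arr, p + 1, hi)
    return arr

def partition(arr, lo, hi):
    # Pick a random pivot and move it to the end.
    pivot_idx = random.randint(lo, hi)
    arr[pivot_idx], arr[hi] = arr[hi], arr[pivot_idx]
    pivot = arr[hi]
    i = lo
    # Move everything smaller than the pivot to the left side.
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    # Place the pivot in its final position.
    arr[i], arr[hi] = arr[hi], arr[i]
    return i
```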
6. Heap Sort
Time Complexity: O(n log n) for all cases
Space Complexity: O(1) (in-place)
Stable: No
When to Use: When you need an efficient, in-place sorting algorithm without recursion.
How It Works:
Convert the array into a max heap, repeatedly extract the maximum element, and rebuild the heap.
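The heapify-then-extract loop can be sketched like this, with the heap stored directly in the array (children of index `i` live at `2i+1` and `2i+2`):

```python
def heap_sort(arr):
    """Sort a list in place using an array-backed max heap."""
    n = len(arr)

    def sift_down(root, end):
        # Restore the max-heap property for the subtree rooted at `root`,
        # considering only indices below `end`.
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1  # Pick the larger child.
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    # Build a max heap bottom-up.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly swap the max to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```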
7. Counting Sort
Time Complexity: O(n + k) (where k is the range of input values)
Space Complexity: O(k)
Stable: Yes
When to Use: When sorting integers within a known range.
How It Works:
Count occurrences of each element, compute prefix sums, and place elements in the correct position.
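Here is one way those three steps look in Python, assuming non-negative integers below a known bound `k`:

```python
def counting_sort(arr, k):
    """Stable sort for integers in the range [0, k)."""
    # Step 1: count occurrences of each value.
    counts = [0] * k
    for x in arr:
        counts[x] += 1
    # Step 2: prefix sums -- counts[v] becomes the output index
    # where the next occurrence of v belongs.
    total = 0
    for v in range(k):
        counts[v], total = total, total + counts[v]
    # Step 3: place elements; a left-to-right pass preserves stability.
    out = [0] * len(arr)
    for x in arr:
        out[counts[x]] = x
        counts[x] += 1
    return out
```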
8. Radix Sort
Time Complexity: O(nk) (where k is the number of digits in the largest number)
Space Complexity: O(n + b) (where b is the base used for bucketing, e.g. 10 for decimal digits)
Stable: Yes
When to Use: When sorting many integers or fixed-length strings whose digit/character count is small relative to n.
How It Works:
Sort elements by each digit from least significant to most significant using a stable sort like counting sort.
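A least-significant-digit sketch for non-negative integers in base 10, using bucket lists as the stable per-digit pass (a counting-sort pass would work equally well):

```python
def radix_sort(arr):
    """LSD radix sort for non-negative integers, base 10."""
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:
        # Stable pass on the current digit: values keep their relative
        # order within each bucket, which is what makes radix sort work.
        buckets = [[] for _ in range(10)]
        for x in arr:
            buckets[(x // exp) % 10].append(x)
        arr = [x for bucket in buckets for x in bucket]
        exp *= 10
    return arr
```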
Choosing the Right Sorting Algorithm
| Scenario | Best Sorting Algorithm |
| --- | --- |
| Small or nearly sorted array | Insertion Sort |
| Large array, stability required | Merge Sort |
| Large array, space efficiency | Quick Sort, Heap Sort |
| Sorting integers with a known range | Counting Sort, Radix Sort |
Final Thoughts
Sorting algorithms form the backbone of many computational problems. Understanding their trade-offs and when to use each one is crucial for interview success. Practice implementing these algorithms and analyzing their time and space complexities to sharpen your problem-solving skills!
Need more practice? Try implementing Quick Sort and Merge Sort from scratch!