Quick Sort
Visualize the divide-and-conquer sorting algorithm with in-place partitioning
What is Quick Sort?
Quick Sort is a divide-and-conquer sorting algorithm that works by selecting a 'pivot' element and partitioning the array around it. Developed by Tony Hoare in 1959, it's one of the most efficient sorting algorithms in practice.
How It Works
The algorithm partitions the array and recursively sorts the sub-arrays:
- Select a pivot element (commonly the last element)
- Partition: move all elements smaller than pivot to the left, larger to the right
- Place the pivot in its final sorted position
- Recursively sort the left and right partitions
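For example, partitioning [3, 8, 2, 7, 5] with the last element 5 as the pivot leaves 3 and 2 on the left, moves 8 and 7 to the right, and swaps the pivot into index 2, giving [3, 2, 5, 7, 8]. The 5 is now in its final sorted position, and the algorithm recurses on [3, 2] and [7, 8].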
Implementation
Here's a Python implementation:
from typing import List

def quicksort(arr: List[int], left: int, right: int) -> None:
    """
    Sort an array in-place using the Quick Sort algorithm.

    Uses a divide-and-conquer strategy: select a pivot, partition the array
    around it, then recursively sort the partitions. This implementation uses
    the last element as the pivot (Lomuto partition scheme) and sorts in place,
    needing no auxiliary array beyond the recursion stack.

    Args:
        arr: List of integers to sort (modified in-place).
        left: Starting index of the subarray to sort (inclusive).
        right: Ending index of the subarray to sort (inclusive).

    Returns:
        None (modifies the array in-place).

    Time Complexity: O(n log n) average case, O(n²) worst case.
    Space Complexity: O(log n) on average for the recursion stack.
    """
    if left >= right:
        return

    # Choose the pivot (last element of the subarray)
    pivot_value = arr[right]

    # Partition: elements < pivot end up on the left, >= pivot on the right
    i = left - 1  # Index of the last element known to be smaller than the pivot
    for j in range(left, right):
        if arr[j] < pivot_value:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]

    # Place the pivot in its final sorted position
    arr[i + 1], arr[right] = arr[right], arr[i + 1]
    pivot_final = i + 1

    # Recursively sort the left and right partitions
    quicksort(arr, left, pivot_final - 1)
    quicksort(arr, pivot_final + 1, right)
# Example
arr = [64, 34, 25, 12, 22, 11, 90, 5, 77, 50]
quicksort(arr, 0, len(arr) - 1)
print(arr)  # [5, 11, 12, 22, 25, 34, 50, 64, 77, 90]

Time & Space Complexity
Time Complexity:
- Best/Average: O(n log n)
- Worst: O(n²)
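Both bounds follow from the partitioning recurrence: a balanced split gives T(n) = 2·T(n/2) + O(n), which solves to O(n log n), while a maximally unbalanced split gives T(n) = T(n-1) + O(n), which sums to O(n²).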
Worst-Case Trigger Conditions
The worst-case O(n²) occurs when the pivot selection consistently creates highly unbalanced partitions:
- Already sorted array with last-element pivot: When the array is already sorted (ascending or descending) and the pivot is always the last element, every partition peels off just one element, giving n levels of recursion with O(n) partitioning work at each, i.e. O(n²) total.
- All elements equal: With a simple two-way partition like the one above, every split is maximally unbalanced no matter which pivot is chosen, so the running time degrades to O(n²); three-way ("Dutch national flag") partitioning handles this case.
- Pivot always minimum/maximum: Any pivot selection strategy that consistently picks the smallest or largest element will cause worst-case behavior.
Mitigation strategies: Use randomized pivot selection, median-of-three pivot, or hybrid approaches (switch to Insertion Sort for small subarrays) to avoid worst-case scenarios in practice.
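As an illustration of the first strategy, here is a minimal sketch of a randomized-pivot variant of the quicksort above (the name quicksort_randomized is purely illustrative). It swaps a randomly chosen element into the last position before running the same last-element partition, so the expected running time is O(n log n) regardless of the initial ordering (all-equal inputs still require three-way partitioning):

import random
from typing import List

def quicksort_randomized(arr: List[int], left: int, right: int) -> None:
    """Quick Sort with a randomly chosen pivot (sketch)."""
    if left >= right:
        return

    # Move a random element to the end so the last-element partition
    # below effectively partitions around a random pivot.
    pivot_index = random.randint(left, right)
    arr[pivot_index], arr[right] = arr[right], arr[pivot_index]

    # Same Lomuto partition as the basic implementation above
    pivot_value = arr[right]
    i = left - 1
    for j in range(left, right):
        if arr[j] < pivot_value:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[right] = arr[right], arr[i + 1]
    pivot_final = i + 1

    quicksort_randomized(arr, left, pivot_final - 1)
    quicksort_randomized(arr, pivot_final + 1, right)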
Space Complexity: O(log n) on average for the recursion stack (O(n) in the worst case, e.g. a naive implementation on an already-sorted array). The stack depth can be kept at O(log n) even in the worst case by recursing only into the smaller partition and iterating (tail-call style) over the larger one, as sketched below.
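The following sketch (again with an illustrative name, quicksort_bounded_stack, and the same Lomuto partition) shows that trick; note it bounds only the stack depth, not the O(n²) worst-case time:

from typing import List

def quicksort_bounded_stack(arr: List[int], left: int, right: int) -> None:
    """Quick Sort whose recursion stack is O(log n) even in the worst case."""
    while left < right:
        # Lomuto partition around the last element
        pivot_value = arr[right]
        i = left - 1
        for j in range(left, right):
            if arr[j] < pivot_value:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[right] = arr[right], arr[i + 1]
        p = i + 1

        # Recurse on the smaller side, loop on the larger side, so each
        # recursive call handles at most half of the current range.
        if p - left < right - p:
            quicksort_bounded_stack(arr, left, p - 1)
            left = p + 1
        else:
            quicksort_bounded_stack(arr, p + 1, right)
            right = p - 1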
Complexity Comparison with Other Sorting Algorithms
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
Key Insights: Quick Sort excels in average-case performance and is often faster than Merge Sort in practice due to better cache locality. However, Merge Sort guarantees O(n log n) in all cases and is stable. Heap Sort provides guaranteed O(n log n) with O(1) space but is typically slower in practice. For small arrays, Insertion Sort is often faster thanks to its low overhead (and its O(n) best case on nearly sorted input), which is why hybrid implementations switch to it below a size threshold, as sketched below.
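As a rough sketch of that hybrid approach (the names and the cutoff of 16 are illustrative, not taken from any particular library), the recursion simply falls back to Insertion Sort once a subarray is small enough:

from typing import List

INSERTION_SORT_CUTOFF = 16  # illustrative threshold; real libraries tune this empirically

def insertion_sort_range(arr: List[int], left: int, right: int) -> None:
    """Insertion Sort on arr[left..right], both indices inclusive."""
    for i in range(left + 1, right + 1):
        key = arr[i]
        j = i - 1
        while j >= left and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key

def quicksort_hybrid(arr: List[int], left: int, right: int) -> None:
    """Quick Sort that falls back to Insertion Sort for small subarrays."""
    if right - left + 1 <= INSERTION_SORT_CUTOFF:
        insertion_sort_range(arr, left, right)
        return

    # Lomuto partition around the last element
    pivot_value = arr[right]
    i = left - 1
    for j in range(left, right):
        if arr[j] < pivot_value:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[right] = arr[right], arr[i + 1]
    p = i + 1

    quicksort_hybrid(arr, left, p - 1)
    quicksort_hybrid(arr, p + 1, right)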
Key Advantage: Quick Sort is in-place - apart from the recursion stack, it needs only O(1) extra memory for swapping elements, making it far more memory-efficient than algorithms like Merge Sort, which requires an O(n) auxiliary array.
Real-World Applications
- Standard library implementations: many languages use Quick Sort variants (e.g., C++ std::sort is commonly implemented as introsort, a Quick Sort hybrid that falls back to Heap Sort)
- Database systems: efficient sorting of query results
- Operating systems: process scheduling and file system operations
- Data analysis: sorting large datasets efficiently