Mike Bostock created some great visualizations of sorting algorithms with D3.js and I wondered how hard it would be to implement something similar from scratch.

I generated images for five different algorithms, each sorting an array of 15 numbers in sorted, reversed and random order.
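All the classes below inherit from a small `Sort` base class that isn't shown in this post. A minimal sketch of what it might look like (the constructor and `attr_reader` are my assumptions; only `@array` and the `swap` helper are actually used by the algorithms):

```ruby
# Minimal sketch of the assumed Sort base class: it stores the array
# and provides the swap helper that every algorithm below relies on.
class Sort
  attr_reader :array

  def initialize(array)
    @array = array
  end

  # Exchange the elements at indices i and j in place.
  def swap(i, j)
    @array[i], @array[j] = @array[j], @array[i]
  end
end
```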

## Bubble Sort

```ruby
class BubbleSort < Sort
  def sort
    swapped = true

    while swapped
      swapped = false
      (1...@array.size).each do |i|
        if @array[i - 1] > @array[i]
          swap(i - 1, i)
          swapped = true
        end
      end
    end
  end
end
```

- Best case $\mathcal{O}(n)$
- Average case $\mathcal{O}(n^2)$
- Worst case $\mathcal{O}(n^2)$

### Sorted

If the input array is already sorted, the algorithm is done after one iteration and needs no swaps.

### Reversed

The extreme opposite happens when the array is in reversed order: the first element has to “bubble up” to the last index of the array, the next one to the next-to-last, …

## Selection Sort

```ruby
class SelectionSort < Sort
  def sort
    (0...(@array.size - 1)).each do |j|
      i_min = j
      ((j + 1)...@array.size).each do |i|
        i_min = i if @array[i] < @array[i_min]
      end

      swap(j, i_min) if i_min != j
    end
  end
end
```

- Best case $\mathcal{O}(n^2)$
- Average case $\mathcal{O}(n^2)$
- Worst case $\mathcal{O}(n^2)$

Selection sort iterates over the array, looks for the smallest element in the rest of the array and swaps it to the current position.

### Sorted

Just like bubble sort, no swaps are used for a sorted array.

### Reversed

If the array is in reversed order, the last element is swapped to the first position, the next-to-last to the second position, …, leading to a nice triangle shape.

### Random

Just from looking at these images, it might seem like selection sort is way better than bubble sort because the images are smaller. In the real world, the performance of a sorting algorithm also depends on the number of comparisons made, while the size of these images depends only on the number of swaps.
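To make that concrete, here is a rough standalone sketch (the function and counter names are mine, not part of the visualization code) that counts both operations for selection sort:

```ruby
# Hypothetical instrumentation: count comparisons and swaps separately,
# showing that selection sort trades many comparisons for few swaps.
def selection_sort_counts(array)
  a = array.dup
  comparisons = swaps = 0

  (0...(a.size - 1)).each do |j|
    i_min = j
    ((j + 1)...a.size).each do |i|
      comparisons += 1
      i_min = i if a[i] < a[i_min]
    end

    if i_min != j
      a[j], a[i_min] = a[i_min], a[j]
      swaps += 1
    end
  end

  [comparisons, swaps]
end

# Even for an already sorted array, all n(n-1)/2 comparisons happen,
# but no swaps are performed.
selection_sort_counts([1, 2, 3, 4])  # → [6, 0]
```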

## Insertion Sort

```ruby
class InsertionSort < Sort
  def sort
    i = 1
    while i < @array.size
      j = i
      while j > 0 && @array[j - 1] > @array[j]
        swap(j, j - 1)
        j -= 1
      end
      i += 1
    end
  end
end
```

- Best case $\mathcal{O}(n)$
- Average case $\mathcal{O}(n^2)$
- Worst case $\mathcal{O}(n^2)$

## Quick Sort

```ruby
class QuickSort < Sort
  def median_of_three(left, right)
    center = (left + right) / 2
    [left, right, center].sort_by { |index| @array[index] }[1]
  end

  def sort(left = 0, right = @array.size - 1)
    return unless left < right

    pivot = median_of_three(left, right)
    center = partition(left, right, pivot)

    sort(left, center - 1)
    sort(center + 1, right)
  end

  # Reorder the elements in the array
  # so that the elements less than the pivot element are to its left
  # and the other ones are to its right,
  # then return the new index of the pivot element.
  def partition(left, right, pivot)
    pivot_value = @array[pivot]
    swap(pivot, right)
    i = left - 1

    (left...right).each do |j|
      if @array[j] <= pivot_value
        i += 1
        swap(i, j)
      end
    end

    swap(i + 1, right)
    i + 1
  end
end
```

- Best case $\mathcal{O}(n \log n)$
- Average case $\mathcal{O}(n \log n)$
- Worst case $\mathcal{O}(n^2)$

This is not your garden-variety quicksort: it swaps the elements in place instead of splitting the input array into two smaller ones, recursively sorting both, and merging them back into one big sorted array.

The pivot element is selected by calculating the median of the first, last and center element.

### Sorted

Here you can see how the pivot elements are selected and swapped to the end of the array. Because the elements are already sorted, each one is only swapped with itself, leading to a long section of straight lines. Then the pivot element is swapped to its correct position and stays there for the rest of the steps.

If you look closely, you can see how after that the right half of the array is sorted (recursively), then the left.

### Reversed

If the array is reversed, a lot of swaps are needed to partition the array each time.

### Random

In the average case, quick sort needs fewer steps than the previous algorithms. The worst case happens when, at every step, the pivot element is the biggest or smallest element of the subarray, so that one of the partitions is empty.

## Heap Sort

This is the most complicated of the algorithms presented here, but the one with the best worst-case complexity.

Heap sort works by reordering the elements of the array so that they form a max-heap (a way to store a binary tree in an array) and then repeatedly extracting the largest remaining element in logarithmic time.
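The array-to-tree mapping can be sketched like this (the index helpers mirror the ones in the class below; the sample array is my own illustration):

```ruby
# In an array-backed binary max-heap, element i's children live at
# indices 2*i + 1 and 2*i + 2, and its parent at (i - 1) / 2
# (integer division).
def index_parent(i)
  (i - 1) / 2
end

def index_left_child(i)
  2 * i + 1
end

# Example: [9, 5, 8, 1, 3] encodes the tree
#       9
#      / \
#     5   8
#    / \
#   1   3
heap = [9, 5, 8, 1, 3]

# Heap property: every parent is at least as large as its children.
(1...heap.size).each do |i|
  raise "not a heap" unless heap[index_parent(i)] >= heap[i]
end
```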

```ruby
class HeapSort < Sort
  def sort
    heapify

    to = @array.size - 1
    while to > 0
      # Move the current maximum (the root) to the end,
      # then restore the heap property for the remaining elements.
      swap(to, 0)
      to -= 1
      sift_down(0, to)
    end
  end

  # Reorder the array into a max-heap, sifting down
  # from the last parent node up to the root.
  def heapify
    from = index_parent(@array.size - 1)
    while from >= 0
      sift_down(from, @array.size - 1)
      from -= 1
    end
  end

  def sift_down(from, to)
    root = from

    while index_left_child(root) <= to
      child = index_left_child(root)
      largest = @array[root] < @array[child] ? child : root
      largest = child + 1 if child + 1 <= to && @array[largest] < @array[child + 1]

      return if largest == root

      swap(root, largest)
      root = largest
    end
  end

  def index_parent(i)
    (i - 1) / 2
  end

  def index_left_child(i)
    2 * i + 1
  end
end
```

- Best case $\mathcal{O}(n \log n)$
- Average case $\mathcal{O}(n \log n)$
- Worst case $\mathcal{O}(n \log n)$