Sorting Algorithm - Comparison of Algorithms

In this table, n is the number of records to be sorted. The columns "Average" and "Worst" give the time complexity in each case, under the assumption that the length of each key is constant, and that therefore all comparisons, swaps, and other needed operations can proceed in constant time. "Memory" denotes the amount of auxiliary storage needed beyond that used by the list itself, under the same assumption. These are all comparison sorts. Run time and memory can be expressed in several asymptotic notations (big O, big Θ, big Ω, little o, and so on); unless noted otherwise, the bounds below are tight, so they can be read in any of these notations.
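
As a concrete illustration of these assumptions, here is a minimal Python sketch of merge sort, one of the comparison sorts tabulated below (the function name is illustrative, not from any library): it performs O(n log n) constant-time key comparisons, uses O(n) auxiliary memory for its merge buffers, and is stable because ties are taken from the left half first.

    def merge_sort(items):
        # A minimal top-down merge sort: O(n log n) constant-time key comparisons,
        # O(n) auxiliary memory for the merge buffers, and stable because ties
        # are taken from the left half first.
        if len(items) <= 1:
            return list(items)
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:       # '<=' preserves the order of equal keys
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]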

Comparison sorts
Name | Best | Average | Worst | Memory | Stable | Method | Other notes
Quicksort | n log n | n log n | n² | log n | Depends | Partitioning | Quicksort is usually done in place with O(log n) stack space. Most implementations are unstable, as stable in-place partitioning is more complex. Naïve variants use an O(n) space array to store the partition.
Merge sort | n log n | n log n | n log n | Depends; worst case is n | Yes | Merging | Highly parallelizable (up to O(log n)) for processing large amounts of data.
In-place merge sort | — | — | n log² n | 1 | Yes | Merging | Implemented in the Standard Template Library (STL); can be implemented as a stable sort based on stable in-place merging.
Heapsort | n log n | n log n | n log n | 1 | No | Selection |
Insertion sort | n | n² | n² | 1 | Yes | Insertion | O(n + d), where d is the number of inversions (see the sketch after this table).
Introsort | n log n | n log n | n log n | log n | No | Partitioning & Selection | Used in several STL implementations.
Selection sort | n² | n² | n² | 1 | No | Selection | Stable with O(n) extra space, for example using lists. Used to sort this table in Safari and other WebKit web browsers.
Timsort | n | n log n | n log n | n | Yes | Insertion & Merging | Makes O(n) comparisons when the data is already sorted or reverse sorted.
Shell sort | n | n log² n or n^(3/2) | Depends on gap sequence; best known is n log² n | 1 | No | Insertion |
Bubble sort | n | n² | n² | 1 | Yes | Exchanging | Tiny code size.
Binary tree sort | n | n log n | n log n | n | Yes | Insertion | When using a self-balancing binary search tree.
Cycle sort | — | n² | n² | 1 | No | Insertion | In-place with a theoretically optimal number of writes.
Library sort | — | n log n | n² | n | Yes | Insertion |
Patience sorting | — | — | n log n | n | No | Insertion & Selection | Finds all the longest increasing subsequences within O(n log n).
Smoothsort | n | n log n | n log n | 1 | No | Selection | An adaptive sort: O(n) comparisons when the data is already sorted, and 0 swaps.
Strand sort | n | n² | n² | n | Yes | Selection |
Tournament sort | — | n log n | n log n | | | Selection |
Cocktail sort | n | n² | n² | 1 | Yes | Exchanging |
Comb sort | n | n²/2^p (p is the number of increments) | n² | 1 | No | Exchanging | Small code size.
Gnome sort | n | n² | n² | 1 | Yes | Exchanging | Tiny code size.
Bogosort | n | n·n! | n·n! | 1 | No | Luck | Randomly permute the array and check if it is sorted.
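
The insertion sort entry above gives a running time of O(n + d), where d is the number of inversions. The minimal Python sketch below (the function name is illustrative) shows why: each shift in the inner loop removes exactly one inversion, so a nearly sorted input costs close to n operations.

    def insertion_sort(items):
        # In-place insertion sort: stable, O(1) extra memory, and adaptive.
        # The inner loop runs once per inversion, giving O(n + d) time,
        # where d is the number of inversions in the input.
        a = list(items)
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:   # strict '>' keeps equal keys in order (stability)
                a[j + 1] = a[j]            # each shift removes exactly one inversion
                j -= 1
            a[j + 1] = key
        return a

    print(insertion_sort([2, 1, 3, 5, 4]))   # [1, 2, 3, 4, 5], using only 2 shifts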

The following table describes integer sorting algorithms and other sorting algorithms that are not comparison sorts. As such, they are not limited by the Ω(n log n) comparison lower bound. Complexities below are in terms of n, the number of items to be sorted, k, the size of each key, and d, the digit size used by the implementation. Many of them are based on the assumption that the key size is large enough that all entries have unique key values, and hence that n ≪ 2^k, where ≪ means "much less than."
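
As an example of how indexing on integer keys replaces comparisons, here is a minimal counting sort sketch in Python (the function name and the parameter r, the key range, are illustrative assumptions): it runs in O(n + r) time with O(n + r) extra memory, without ever comparing two keys.

    def counting_sort(keys, r):
        # Counting sort over integer keys in the range [0, r): no comparisons,
        # O(n + r) time and O(n + r) extra memory. Stable because elements are
        # written back in their original left-to-right order within each bucket.
        counts = [0] * r
        for k in keys:
            counts[k] += 1
        # prefix sums turn counts into starting output positions
        total = 0
        for value in range(r):
            counts[value], total = total, total + counts[value]
        out = [None] * len(keys)
        for k in keys:
            out[counts[k]] = k
            counts[k] += 1
        return out

    print(counting_sort([3, 1, 0, 3, 2], r=4))   # [0, 1, 2, 3, 3]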

Non-comparison sorts
Name | Best | Average | Worst | Memory | Stable | n ≪ 2^k | Notes
Pigeonhole sort | — | n + 2^k | n + 2^k | 2^k | Yes | Yes |
Bucket sort (uniform keys) | — | n + k | n²·k | n·k | Yes | No | Assumes uniform distribution of elements from the domain in the array.
Bucket sort (integer keys) | — | n + r | n + r | n + r | Yes | Yes | r is the range of numbers to be sorted. If r = O(n), then the average running time is O(n).
Counting sort | — | n + r | n + r | n + r | Yes | Yes | r is the range of numbers to be sorted. If r = O(n), then the average running time is O(n).
LSD Radix Sort | — | n·k/d | n·k/d | n + 2^d | Yes | No | See the sketch after this table.
MSD Radix Sort | — | n·k/d | n·k/d | n + 2^d | Yes | No | Stable version uses an external array of size n to hold all of the bins.
MSD Radix Sort (in-place) | — | n·k/d | n·k/d | 2^d | No | No | In-place. k/d recursion levels, 2^d for the count array.
Spreadsort | — | n·k/d | n·(k/s + d) | (k/d)·2^d | No | No | Asymptotics are based on the assumption that n ≪ 2^k, but the algorithm does not require this.
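
The LSD radix sort entry above processes keys d bits at a time, making k/d stable counting-sort passes. The Python sketch below is a hedged illustration of that idea; the key width and digit size are assumptions chosen for the example, not values from the table.

    def lsd_radix_sort(keys, key_bits=16, digit_bits=4):
        # LSD radix sort over non-negative integers: k/d passes of a stable
        # counting sort on d-bit digits, O(n * k/d) time and O(n + 2^d) memory.
        radix = 1 << digit_bits
        a = list(keys)
        for shift in range(0, key_bits, digit_bits):
            counts = [0] * radix
            for k in a:
                counts[(k >> shift) & (radix - 1)] += 1
            total = 0
            for digit in range(radix):
                counts[digit], total = total, total + counts[digit]
            out = [None] * len(a)
            for k in a:                        # stable: preserves order within a digit
                digit = (k >> shift) & (radix - 1)
                out[counts[digit]] = k
                counts[digit] += 1
            a = out
        return a

    print(lsd_radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
    # [2, 24, 45, 66, 75, 90, 170, 802]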

The following table describes some sorting algorithms that are impractical for real-life use due to extremely poor performance or a requirement for specialized hardware.

Name | Best | Average | Worst | Memory | Stable | Comparison | Other notes
Bead sort | — | N/A | N/A | N/A | N/A | No | Requires specialized hardware.
Simple pancake sort | — | n | n | log n | No | Yes | Count is number of flips.
Spaghetti (Poll) sort | n | n | n | n² | Yes | Polling | A linear-time, analog algorithm for sorting a sequence of items; requires O(n) stack space, is stable, and requires parallel processors.
Sorting networks | — | log n | log n | n log n | Yes | No | Requires a custom circuit of size O(n log n); see the sketch after this table.
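
To make the sorting-network entry concrete: a sorting network applies a fixed, data-independent sequence of compare-exchange operations, which is what permits a hardware or parallel realization. The Python sketch below simulates the standard 5-comparator network for 4 inputs (in hardware, each compare_exchange call would be a circuit element, and independent comparators could fire in parallel).

    def compare_exchange(a, i, j):
        # One comparator: order positions i < j in place.
        if a[i] > a[j]:
            a[i], a[j] = a[j], a[i]

    def sorting_network_4(values):
        # Optimal 5-comparator network for 4 inputs; the comparison pattern is
        # fixed in advance, independent of the data being sorted.
        a = list(values)
        for i, j in [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]:
            compare_exchange(a, i, j)
        return a

    print(sorting_network_4([3, 1, 4, 2]))   # [1, 2, 3, 4]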

Additionally, theoretical computer scientists have detailed other sorting algorithms that provide better than O(n log n) time complexity under additional constraints, including:

  • Han's algorithm, a deterministic algorithm for sorting keys from a domain of finite size, taking O(n log log n) time and O(n) space.
  • Thorup's algorithm, a randomized algorithm for sorting keys from a domain of finite size, taking O(n log log n) time and O(n) space.
  • A randomized integer sorting algorithm taking O(n √(log log n)) expected time and O(n) space.

Algorithms not yet compared above include:

  • Odd-even sort
  • Flashsort
  • Burstsort
  • Postman sort
  • Stooge sort
  • Samplesort
  • Bitonic sorter
