Linearithmic/Quasilinear Time
A linearithmic function (portmanteau of linear and logarithmic) is a function of the form n · log n (i.e., a product of a linear and a logarithmic term). An algorithm is said to run in linearithmic time if T(n) = O(n log n). Compared to other functions, a linearithmic function is ω(n), o(n^(1+ε)) for every ε > 0, and Θ(n · log n). Thus, a linearithmic term grows faster than a linear term but slower than any polynomial in n with exponent strictly greater than 1.
An algorithm is said to run in quasilinear time if T(n) = O(n log^k n) for some positive constant k. Quasilinear time algorithms are also o(n^(1+ε)) for every ε > 0, and thus run faster than any polynomial in n with exponent strictly greater than 1.
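For intuition, both o(n^(1+ε)) claims follow from a standard limit argument; here is a sketch for any fixed constant k ≥ 1 and any ε > 0, with the linearithmic case being k = 1:

```latex
\[
\lim_{n \to \infty} \frac{n \log^{k} n}{n^{1+\varepsilon}}
  \;=\; \lim_{n \to \infty} \frac{\log^{k} n}{n^{\varepsilon}}
  \;=\; 0,
\qquad\text{hence}\quad n \log^{k} n = o\!\left(n^{1+\varepsilon}\right).
\]
```

The second limit is zero because every fixed power of log n grows more slowly than any positive power of n.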
In many cases, the n · log n running time is simply the result of performing a Θ(log n) operation n times. For example, binary tree sort creates a binary search tree by inserting each element of an n-element array one by one. Since an insertion into a self-balancing binary search tree takes O(log n) time, the entire algorithm takes linearithmic time.
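As an illustration of that pattern, here is a minimal Python sketch of binary tree sort: n insertions followed by an in-order traversal. For brevity it uses a plain, unbalanced binary search tree, so the O(log n) per-insertion bound only holds while the tree stays balanced; a real implementation would use a self-balancing tree (e.g. an AVL or red-black tree) to guarantee it.

```python
class Node:
    """A node of a plain binary search tree."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None


def bst_insert(root, key):
    """Insert key into the tree rooted at root and return the (possibly new) root.

    On a balanced tree this walk costs O(log n); a plain BST gives no such
    guarantee, which is why binary tree sort uses a self-balancing tree.
    """
    node = Node(key)
    if root is None:
        return node
    cur = root
    while True:
        if key < cur.key:
            if cur.left is None:
                cur.left = node
                return root
            cur = cur.left
        else:
            if cur.right is None:
                cur.right = node
                return root
            cur = cur.right


def binary_tree_sort(items):
    """n insertions (O(log n) each on a balanced tree), then an in-order read-out."""
    root = None
    for x in items:
        root = bst_insert(root, x)
    # Iterative in-order traversal (avoids deep recursion on skewed trees).
    out, stack, cur = [], [], root
    while stack or cur is not None:
        while cur is not None:
            stack.append(cur)
            cur = cur.left
        cur = stack.pop()
        out.append(cur.key)
        cur = cur.right
    return out


print(binary_tree_sort([5, 3, 8, 1, 4]))  # [1, 3, 4, 5, 8]
```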
Comparison sorts require at least a linearithmic number of comparisons in the worst case because log(n!) = Θ(n log n), by Stirling's approximation. Linearithmic running times also frequently arise from the divide-and-conquer recurrence T(n) = 2·T(n/2) + O(n).
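Merge sort is the textbook instance of that recurrence: two recursive calls on halves of the input plus a linear-time merge. A minimal Python sketch:

```python
def merge_sort(a):
    """Merge sort: T(n) = 2*T(n/2) + O(n), which solves to O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # T(n/2)
    right = merge_sort(a[mid:])   # T(n/2)
    # O(n) merge of the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(merge_sort([5, 3, 8, 1, 4]))  # [1, 3, 4, 5, 8]
```

Another way to read off the bound: each level of the recursion does O(n) total merging work, and there are O(log n) levels.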
Some famous algorithms that run in linearithmic time include:
- Comb sort, in the average and worst case
- Quicksort, in the average case
- Heapsort, merge sort, introsort, binary tree sort, smoothsort, patience sorting, etc. in the worst case
- Fast Fourier transforms
- Monge array calculation