Analysis of Algorithms
The iterated logarithm is useful in the analysis of algorithms and computational complexity, appearing in the time and space complexity bounds of some algorithms, such as:
- Finding the Delaunay triangulation of a set of points knowing the Euclidean minimum spanning tree: randomized O(n log* n) time, due to Olivier Devillers
- Fürer's algorithm for integer multiplication: O(n log n 2^(O(lg* n)))
- Finding an approximate maximum (element at least as large as the median): lg* n − 4 to lg* n + 2 parallel operations
- Richard Cole and Uzi Vishkin's distributed algorithm for 3-coloring an n-cycle: O(log* n) synchronous communication rounds; a sketch of its core color-reduction step follows this list.
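To make the last item concrete, here is a minimal sequential simulation of the Cole–Vishkin color-reduction round, a sketch rather than the actual distributed implementation (the function name `color_reduction_round` and the 50-node ring are our choices for illustration):

```python
def color_reduction_round(colors):
    """One Cole-Vishkin round on a directed n-cycle, simulated
    sequentially.  colors[v] is node v's current color; v's successor
    is node (v + 1) % n.  Each node finds the lowest bit position i
    where its color differs from its successor's and takes 2*i plus
    its own bit at position i as its new color, which provably keeps
    neighboring colors distinct."""
    n = len(colors)
    new_colors = []
    for v in range(n):
        mine, succ = colors[v], colors[(v + 1) % n]
        diff = mine ^ succ                     # nonzero: neighbors differ
        i = (diff & -diff).bit_length() - 1    # index of lowest differing bit
        new_colors.append(2 * i + ((mine >> i) & 1))
    return new_colors

# Start from unique IDs (a trivially valid n-coloring) and iterate;
# O(log* n) rounds suffice to reach the 6-color fixed point.
colors = list(range(50))
rounds = 0
while max(colors) > 5:
    colors = color_reduction_round(colors)
    rounds += 1
```

In the distributed setting each round is one synchronous exchange with the successor, which is where the O(log* n) round count comes from; the remaining constant number of rounds that shrink 6 colors down to 3 is omitted here.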
The iterated logarithm grows at an extremely slow rate, much slower than the logarithm itself. For all values of n relevant to counting the running times of algorithms implemented in practice (i.e., n ≤ 2^65536, which is far more than the number of atoms in the known universe), the iterated logarithm with base 2 has a value no more than 5.
| x | lg* x |
|---|---|
| (−∞, 1] | 0 |
| (1, 2] | 1 |
| (2, 4] | 2 |
| (4, 16] | 3 |
| (16, 65536] | 4 |
| (65536, 2^65536] | 5 |
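The interval endpoints in this table can be checked with exact integer arithmetic, which sidesteps floating-point rounding at the boundaries. A minimal sketch (the name `lg_star` is ours), using the equivalent power-tower characterization of lg*:

```python
def lg_star(n):
    """Base-2 iterated logarithm, computed exactly: the smallest k
    such that the power tower 2^2^...^2 (with k twos) reaches n.
    This is equivalent to counting how many times lg must be
    applied before the result drops to 1 or below."""
    k, tower = 0, 1
    while tower < n:
        tower = 2 ** tower
        k += 1
    return k

# Right endpoints of the table above:
assert [lg_star(x) for x in (1, 2, 4, 16, 65536, 2**65536)] == [0, 1, 2, 3, 4, 5]
assert lg_star(2**65535) == 5   # a point inside the last interval
```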
Higher bases give smaller iterated logarithms. Indeed, the only function commonly used in complexity theory that grows more slowly is the inverse Ackermann function.
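To illustrate the base dependence, here is a floating-point sketch with an arbitrary base (accurate away from the interval boundaries, where rounding error could tip the count by one):

```python
import math

def log_star(x, base):
    """Iterated logarithm to an arbitrary base: count how many
    times log_base must be applied before the value is <= 1.
    Floating point, so results exactly at interval boundaries
    may be off by one."""
    count = 0
    while x > 1:
        x = math.log(x, base)
        count += 1
    return count

print(log_star(10**100, 2))    # 5
print(log_star(10**100, 10))   # 3: the higher base gives a smaller value
```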