Analysis of Algorithms

The iterated logarithm is useful in the analysis of algorithms and computational complexity, appearing in the time and space complexity bounds of algorithms such as:

  • Finding the Delaunay triangulation of a set of points when the Euclidean minimum spanning tree is already known: randomized O(n log* n) time (Olivier Devillers)
  • Fürer's algorithm for integer multiplication: O(n log n · 2^(O(lg* n)))
  • Finding an approximate maximum (element at least as large as the median): lg* n − 4 to lg* n + 2 parallel operations
  • Richard Cole and Uzi Vishkin's distributed algorithm for 3-coloring an n-cycle: O(log* n) synchronous communication rounds (a sketch of the underlying color-reduction step follows this list).
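
The Cole-Vishkin bound in the last item rests on a "deterministic coin tossing" step that shrinks the number of colors drastically in each synchronous round. The Python sketch below (a sequential simulation under our own naming, not the authors' original PRAM formulation) illustrates the idea: each vertex recolors itself using the lowest bit position in which its color differs from its successor's, which keeps the coloring proper while collapsing the n initial colors down to 6 within O(log* n) rounds; three further rounds then reduce 6 colors to 3.

    def reduce_colors_once(colors):
        """One synchronous round of color reduction on a directed cycle:
        vertex v compares its color with that of its successor (v + 1) % n
        and recolors itself from the lowest differing bit position."""
        n = len(colors)
        new_colors = []
        for v in range(n):
            c, c_succ = colors[v], colors[(v + 1) % n]
            diff = c ^ c_succ                      # nonzero, since the coloring is proper
            k = (diff & -diff).bit_length() - 1    # index of the lowest differing bit
            new_colors.append(2 * k + ((c >> k) & 1))
        return new_colors

    def three_color_cycle(n):
        colors = list(range(n))                    # unique IDs serve as the initial colors
        while max(colors) > 5:                     # O(log* n) rounds shrink this to 6 colors
            colors = reduce_colors_once(colors)
        for c in (5, 4, 3):                        # three extra rounds: 6 colors -> 3 colors
            for v in range(n):
                if colors[v] == c:                 # color-c vertices form an independent set
                    used = {colors[(v - 1) % n], colors[(v + 1) % n]}
                    colors[v] = min({0, 1, 2} - used)
        return colors

    coloring = three_color_cycle(1000)
    assert all(coloring[v] != coloring[(v + 1) % 1000] for v in range(1000))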

The iterated logarithm grows at an extremely slow rate, much more slowly than the logarithm itself. For all values of n relevant to counting the running times of algorithms implemented in practice (i.e., n ≤ 2^65536, which is far more than the number of atoms in the known universe), the iterated logarithm with base 2 has a value no more than 5.

x                  lg* x
(−∞, 1]            0
(1, 2]             1
(2, 4]             2
(4, 16]            3
(16, 65536]        4
(65536, 2^65536]   5

Higher bases give smaller iterated logarithms. Indeed, the only function commonly used in complexity theory that grows more slowly is the inverse Ackermann function.
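
The table and the remark about larger bases are easy to check mechanically. The Python sketch below (the helper name lg_star is ours) evaluates the iterated logarithm exactly for integer inputs: instead of applying a floating-point logarithm repeatedly, which risks rounding error at the interval boundaries, it uses the equivalent characterization that lg* x is the smallest k for which a tower of k twos reaches x; those towers, 1, 2, 4, 16, 65536, 2^65536, are precisely the right endpoints of the intervals in the table.

    def lg_star(x, base=2):
        """Smallest k such that a tower of k copies of `base` is at least x.
        For base 2 this equals the number of times log2 must be applied
        before the value drops to 1 or below."""
        k, tower = 0, 1
        while x > tower:
            k += 1
            tower = base ** tower      # for base 2: 1, 2, 4, 16, 65536, 2**65536, ...
        return k

    # Reproduces the base-2 table row by row: prints 0, 1, 2, 3, 4, 5.
    for x in (1, 2, 4, 16, 65536, 2 ** 65536):
        print(lg_star(x))

    # Higher bases give smaller values, e.g. base 16 instead of base 2:
    print(lg_star(65536), lg_star(65536, base=16))   # 4 2

    # (Inputs beyond 2**65536 would require materializing an infeasibly large
    # tower; as noted above, such n never arise for algorithms run in practice.)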

