Log-normal Distribution - Maximum Likelihood Estimation of Parameters

To determine the maximum likelihood estimators of the log-normal distribution parameters μ and σ, we can use the same procedure as for the normal distribution. To avoid repetition, observe that

f_L (x; \mu, \sigma) = \frac 1 x \, f_N (\ln x; \mu, \sigma),

where f_L denotes the probability density function of the log-normal distribution and f_N that of the normal distribution. Therefore, using the same subscripts to denote the two distributions, we can write the log-likelihood function thus:


\begin{align}
\ell_L (\mu,\sigma | x_1, x_2, \dots, x_n) & {} = - \sum _k \ln x_k + \ell_N (\mu, \sigma | \ln x_1, \ln x_2, \dots, \ln x_n) \\
& {} = \operatorname {constant} + \ell_N (\mu, \sigma | \ln x_1, \ln x_2, \dots, \ln x_n).
\end{align}
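The identity above can be checked numerically. The following is a minimal sketch (assuming NumPy and SciPy are available; SciPy parameterizes the log-normal by shape s = σ and scale = exp(μ)):

```python
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.75
x = rng.lognormal(mean=mu, sigma=sigma, size=1000)

# Log-likelihood of the sample under the log-normal distribution:
# ell_L(mu, sigma | x_1, ..., x_n)
ll_L = lognorm.logpdf(x, s=sigma, scale=np.exp(mu)).sum()

# Log-likelihood of the log-transformed sample under the normal distribution:
# ell_N(mu, sigma | ln x_1, ..., ln x_n)
ll_N = norm.logpdf(np.log(x), loc=mu, scale=sigma).sum()

# The identity: ell_L = -sum_k ln x_k + ell_N
assert np.isclose(ll_L, -np.log(x).sum() + ll_N)
```

The `-np.log(x).sum()` term is the "constant" in the derivation: it depends only on the data, not on μ or σ.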

Since the first term is constant with respect to μ and σ, both log-likelihood functions, ℓ_L and ℓ_N, reach their maximum at the same μ and σ. Hence, using the formulas for the maximum likelihood estimators of the normal distribution's parameters together with the equality above, we deduce that for the log-normal distribution it holds that

\widehat \mu = \frac {\sum_k \ln x_k} n, \qquad \widehat \sigma^2 = \frac {\sum_k \left( \ln x_k - \widehat \mu \right)^2} {n}.
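These closed-form estimators are straightforward to compute. A minimal sketch (assuming NumPy and SciPy are available) computes them from simulated data and sanity-checks that nearby parameter values do not achieve a higher log-likelihood:

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(42)
mu, sigma = 0.4, 0.6
x = rng.lognormal(mean=mu, sigma=sigma, size=50_000)

log_x = np.log(x)
mu_hat = log_x.mean()                        # (sum_k ln x_k) / n
sigma2_hat = ((log_x - mu_hat) ** 2).mean()  # sum_k (ln x_k - mu_hat)^2 / n

# Sanity check: the closed-form estimates should attain a log-likelihood
# at least as high as nearby parameter values.
def ll(m, s):
    return lognorm.logpdf(x, s=s, scale=np.exp(m)).sum()

best = ll(mu_hat, np.sqrt(sigma2_hat))
for dm, ds in [(0.01, 0), (-0.01, 0), (0, 0.01), (0, -0.01)]:
    assert best >= ll(mu_hat + dm, np.sqrt(sigma2_hat) + ds)
```

Note that, as with the normal distribution, this MLE of σ² divides by n rather than n − 1, so it is biased; the bias vanishes as n grows.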
