Fisher Information - Matrix Form

Matrix Form

When there are N parameters, so that θ is an N × 1 vector \theta = \begin{bmatrix} \theta_{1} & \theta_{2} & \dots & \theta_{N} \end{bmatrix}^{\mathrm T}, the Fisher information takes the form of an N × N matrix, the Fisher Information Matrix (FIM), with typical element


{\left(\mathcal{I}(\theta)\right)}_{i,j} = \operatorname{E}\left[\left. \left(\frac{\partial}{\partial\theta_i} \log f(X;\theta)\right) \left(\frac{\partial}{\partial\theta_j} \log f(X;\theta)\right) \right| \theta \right].
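
A standard two-parameter example illustrates the definition: if X follows a normal distribution with unknown mean and standard deviation, θ = (μ, σ), the scores are \frac{\partial}{\partial\mu}\log f(X;\theta) = \frac{X-\mu}{\sigma^2} and \frac{\partial}{\partial\sigma}\log f(X;\theta) = \frac{(X-\mu)^2}{\sigma^3} - \frac{1}{\sigma}. Using \operatorname{E}[(X-\mu)^2] = \sigma^2, \operatorname{E}[(X-\mu)^3] = 0, and \operatorname{E}[(X-\mu)^4] = 3\sigma^4, the expected products of the scores give

\mathcal{I}(\mu, \sigma) = \begin{bmatrix} \frac{1}{\sigma^{2}} & 0 \\ 0 & \frac{2}{\sigma^{2}} \end{bmatrix}.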

The FIM is an N × N positive semidefinite symmetric matrix. When it is positive definite, it defines a Riemannian metric on the N-dimensional parameter space, thus connecting Fisher information to differential geometry. In that context, this metric is known as the Fisher information metric, and the topic is called information geometry.

Under certain regularity conditions, the Fisher Information Matrix may also be written as


{\left(\mathcal{I}(\theta)\right)}_{i,j} = - \operatorname{E}\left[\left. \frac{\partial^2}{\partial\theta_i \, \partial\theta_j} \log f(X;\theta) \right| \theta \right]\,.
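
For the normal example above, the second derivatives are \frac{\partial^2}{\partial\mu^2}\log f = -\frac{1}{\sigma^2}, \frac{\partial^2}{\partial\mu\,\partial\sigma}\log f = -\frac{2(X-\mu)}{\sigma^3}, and \frac{\partial^2}{\partial\sigma^2}\log f = \frac{1}{\sigma^2} - \frac{3(X-\mu)^2}{\sigma^4}, and taking negative expectations recovers the same matrix:

- \operatorname{E}\left[ \begin{matrix} -\frac{1}{\sigma^2} & -\frac{2(X-\mu)}{\sigma^3} \\ -\frac{2(X-\mu)}{\sigma^3} & \frac{1}{\sigma^2} - \frac{3(X-\mu)^2}{\sigma^4} \end{matrix} \right] = \begin{bmatrix} \frac{1}{\sigma^{2}} & 0 \\ 0 & \frac{2}{\sigma^{2}} \end{bmatrix},

confirming that the two expressions agree for this model.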

The metric is interesting in several ways: it can be derived as the Hessian of the relative entropy; it can be understood as the metric induced from the Euclidean metric after a suitable change of variables; and, in its complex-valued form, it is the Fubini–Study metric.
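
Concretely, the first of these statements means that the Kullback–Leibler divergence between nearby members of the family behaves, to second order, as a quadratic form in the FIM:

D_{\mathrm{KL}}\bigl(f(\cdot;\theta)\,\|\,f(\cdot;\theta')\bigr) = \tfrac{1}{2}\,(\theta' - \theta)^{\mathrm T}\, \mathcal{I}(\theta)\, (\theta' - \theta) + o\!\left(\|\theta' - \theta\|^{2}\right),

so the FIM is the Hessian of the relative entropy with respect to θ′, evaluated at θ′ = θ.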
