Likelihood Lower Bound
Given a set of hidden variables $H$ and observed variables $V$, the goal of approximate inference is to lower-bound the probability that the graphical model is in the configuration $V$. Over some probability distribution $Q(H)$ (to be defined later),
- $\ln P(V) = \sum_H Q(H) \ln \dfrac{P(H,V)}{Q(H)} + \sum_H Q(H) \ln \dfrac{Q(H)}{P(H \mid V)}$.
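This decomposition can be checked numerically on a toy model. The sketch below uses made-up probabilities for a single binary hidden variable $H$ and a fixed observation $V$, and verifies that the two sums add up to $\ln P(V)$ for an arbitrary choice of $Q(H)$:

```python
import math

# Tiny discrete model: one binary hidden variable H, observation V fixed.
# Joint P(H = h, V = v_obs), with hypothetical numbers for illustration.
joint = {0: 0.12, 1: 0.28}
p_v = sum(joint.values())                           # evidence P(V)
posterior = {h: p / p_v for h, p in joint.items()}  # P(H | V)

# An arbitrary distribution Q(H) over the hidden variable.
q = {0: 0.5, 1: 0.5}

# Lower bound: sum_H Q(H) ln[ P(H, V) / Q(H) ]
bound = sum(q[h] * math.log(joint[h] / q[h]) for h in q)

# Relative entropy: sum_H Q(H) ln[ Q(H) / P(H | V) ]
kl = sum(q[h] * math.log(q[h] / posterior[h]) for h in q)

# The two terms sum to the log evidence ln P(V).
print(bound + kl, math.log(p_v))
```

The identity holds for any valid $Q(H)$, since the two logarithms combine into $\ln P(V)$, which is constant with respect to $H$.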
So, if we define our lower bound to be
- $\mathcal{L}(Q) = \sum_H Q(H) \ln \dfrac{P(H,V)}{Q(H)}$,
then the log-likelihood $\ln P(V)$ is simply this bound plus the relative entropy between $Q(H)$ and $P(H \mid V)$. Because the relative entropy is non-negative, $\mathcal{L}(Q)$ is indeed a lower bound on the log-likelihood of our observation $V$. The distribution $Q(H)$ will have a simpler character than that of $P(H \mid V)$, because marginalizing over $P(H,V)$ is intractable for all but the simplest of graphical models. In particular, VMP uses a factorized distribution:
- $Q(H) = \prod_i Q_i(H_i)$,

where $H_i$ is a disjoint part of the graphical model.
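As a minimal sketch of this mean-field idea (hypothetical numbers, two binary hidden variables), a factorized $Q(H) = Q_1(H_1)\,Q_2(H_2)$ plugged into the bound always evaluates to at most $\ln P(V)$:

```python
import math
from itertools import product

# Two binary hidden variables H1, H2; observation V fixed.
# Joint P(H1, H2, V) at the observed V (hypothetical numbers).
joint = {(0, 0): 0.10, (0, 1): 0.05, (1, 0): 0.20, (1, 1): 0.25}
p_v = sum(joint.values())  # evidence P(V)

# Mean-field factorization: Q(H) = Q1(H1) * Q2(H2).
q1 = {0: 0.3, 1: 0.7}
q2 = {0: 0.4, 1: 0.6}

def lower_bound(q1, q2):
    """L(Q) = sum_H Q(H) ln[ P(H, V) / Q(H) ] with Q = Q1 * Q2."""
    total = 0.0
    for h1, h2 in product(q1, q2):
        q = q1[h1] * q2[h2]
        total += q * math.log(joint[(h1, h2)] / q)
    return total

# The bound never exceeds the log evidence, whatever Q1 and Q2 are.
print(lower_bound(q1, q2), "<=", math.log(p_v))
```

VMP's updates then amount to adjusting each factor $Q_i$ in turn to tighten this bound; the factorization is what makes those updates local to each part of the graph.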
Read more about this topic: Variational Message Passing