Likelihood Lower Bound
Given some set of hidden variables $H$ and observed variables $V$, the goal of approximate inference is to lower-bound the probability $P(V)$ that a graphical model is in the configuration $V$. Over some probability distribution $Q(H)$ (to be defined later),
- $\ln P(V) = \sum_{H} Q(H) \ln \frac{P(H,V)}{Q(H)} + \sum_{H} Q(H) \ln \frac{Q(H)}{P(H \mid V)}$.
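This identity is a standard consequence of Bayes' rule, $P(H \mid V) = P(H,V)/P(V)$, together with the fact that $Q(H)$ sums to one:
- $\ln P(V) = \sum_{H} Q(H) \ln P(V) = \sum_{H} Q(H) \ln \frac{P(H,V)}{P(H \mid V)} = \sum_{H} Q(H) \ln \frac{P(H,V)}{Q(H)} + \sum_{H} Q(H) \ln \frac{Q(H)}{P(H \mid V)}$.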
So, if we define our lower bound to be
- $\mathcal{L}(Q) = \sum_{H} Q(H) \ln \frac{P(H,V)}{Q(H)}$,
then the likelihood is simply this bound plus the relative entropy between $Q(H)$ and $P(H \mid V)$. Because the relative entropy is non-negative, the function $\mathcal{L}(Q)$ defined above is indeed a lower bound of the log likelihood of our observation $V$. The distribution $Q$ will have a simpler character than that of $P$, because marginalizing over $P(H,V)$ is intractable for all but the simplest of graphical models. In particular, VMP uses a factorized distribution $Q$:
- $Q(H) = \prod_i Q_i(H_i),$
where $H_i$ is a disjoint part of the graphical model.
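The decomposition can be checked numerically on a toy discrete model. The sketch below is only illustrative: the joint table and the factors q1 and q2 are arbitrary made-up values, not anything prescribed by VMP. It verifies that the bound plus the relative entropy recovers $\ln P(V)$ exactly, and that the bound never exceeds it.

```python
import numpy as np

# Arbitrary joint P(H1, H2, V) over two binary hidden variables and one binary observation.
joint = np.random.default_rng(0).random((2, 2, 2))
joint /= joint.sum()

v = 1                                  # the observed configuration V = v
p_hv = joint[:, :, v]                  # P(H1, H2, V=v), unnormalized over H
log_evidence = np.log(p_hv.sum())      # ln P(V=v)
posterior = p_hv / p_hv.sum()          # P(H1, H2 | V=v)

# A factorized variational distribution Q(H) = Q1(H1) Q2(H2), with made-up factors.
q1 = np.array([0.3, 0.7])
q2 = np.array([0.6, 0.4])
q = np.outer(q1, q2)

# Lower bound L(Q) = sum_H Q(H) ln [ P(H, V) / Q(H) ]
lower_bound = np.sum(q * (np.log(p_hv) - np.log(q)))

# Relative entropy KL(Q || P(H | V)) = sum_H Q(H) ln [ Q(H) / P(H | V) ]
kl = np.sum(q * (np.log(q) - np.log(posterior)))

print(log_evidence, lower_bound + kl)  # the two numbers agree
print(lower_bound <= log_evidence)     # True: L(Q) is a lower bound
```

Because the relative entropy term vanishes exactly when $Q(H) = P(H \mid V)$, making the bound tight is equivalent to making the factorized $Q$ as close as possible to the true posterior.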