Statistical Introduction
Given data $x$ and parameter $\theta$, a simple Bayesian analysis starts with a prior probability (prior) $p(\theta)$ and likelihood $p(x \mid \theta)$ to compute a posterior probability $p(\theta \mid x) \propto p(x \mid \theta)\, p(\theta)$.
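As a concrete illustration (not from the original text), here is a minimal sketch of that computation for an assumed Bernoulli likelihood with a Beta(2, 2) prior, approximating the posterior on a grid; the data and prior parameters are invented for the example.

```python
import numpy as np

# Hypothetical example: x are Bernoulli(theta) observations, theta has a Beta(2, 2) prior.
x = np.array([1, 0, 1, 1, 0, 1, 1, 1])          # invented data
theta = np.linspace(0.001, 0.999, 999)           # grid over the parameter

prior = theta**(2 - 1) * (1 - theta)**(2 - 1)    # Beta(2, 2) density, up to a constant
likelihood = theta**x.sum() * (1 - theta)**(len(x) - x.sum())

posterior = likelihood * prior                   # p(theta | x) is proportional to p(x | theta) p(theta)
posterior /= np.trapz(posterior, theta)          # normalise so it integrates to 1

print(theta[np.argmax(posterior)])               # posterior mode
```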
Often the prior on $\theta$ depends in turn on other parameters $\varphi$ that are not mentioned in the likelihood. So, the prior $p(\theta)$ must be replaced by a likelihood $p(\theta \mid \varphi)$, and a prior $p(\varphi)$ on the newly introduced parameters $\varphi$ is required, resulting in a posterior probability

$$p(\theta, \varphi \mid x) \propto p(x \mid \theta)\, p(\theta \mid \varphi)\, p(\varphi).$$
This is the simplest example of a hierarchical Bayes model.
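As a concrete (and entirely hypothetical) illustration of this two-level structure, the sketch below approximates the joint posterior $p(\theta, \varphi \mid x)$ on a grid for an assumed Gaussian hierarchy; the model choices and data are invented only to make the factorisation $p(x \mid \theta)\, p(\theta \mid \varphi)\, p(\varphi)$ concrete.

```python
import numpy as np
from scipy.stats import norm

# Assumed hierarchy (hypothetical, for illustration):
#   x_i | theta ~ Normal(theta, 1)   -- likelihood
#   theta | phi ~ Normal(phi, 1)     -- prior on theta, governed by phi
#   phi         ~ Normal(0, 10)      -- prior on the hyperparameter phi
x = np.array([1.2, 0.7, 1.9, 1.4])                 # invented data

theta = np.linspace(-5, 5, 401)
phi = np.linspace(-5, 5, 401)
T, P = np.meshgrid(theta, phi, indexing="ij")      # grid over (theta, phi)

log_lik = norm.logpdf(x[:, None, None], loc=T, scale=1).sum(axis=0)  # log p(x | theta)
log_prior_theta = norm.logpdf(T, loc=P, scale=1)                     # log p(theta | phi)
log_prior_phi = norm.logpdf(P, loc=0, scale=10)                      # log p(phi)

# Joint posterior p(theta, phi | x) proportional to p(x | theta) p(theta | phi) p(phi)
log_post = log_lik + log_prior_theta + log_prior_phi
post = np.exp(log_post - log_post.max())
post /= post.sum() * (theta[1] - theta[0]) * (phi[1] - phi[0])       # normalise on the grid

i, j = np.unravel_index(post.argmax(), post.shape)
print(theta[i], phi[j])                            # joint posterior mode
```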
The process may be repeated; for example, the parameters $\varphi$ may depend in turn on additional parameters $\psi$, which will require their own prior $p(\psi)$. Eventually the process must terminate, with priors that do not depend on any other unmentioned parameters.