Constructing A Classifier From The Probability Model
The discussion so far has derived the independent feature model, that is, the naive Bayes probability model. The naive Bayes classifier combines this model with a decision rule. One common rule is to pick the hypothesis that is most probable; this is known as the maximum a posteriori or MAP decision rule. The corresponding classifier is the function that assigns a feature vector $x = (x_1, \dots, x_n)$ the class label $\hat{y} = C_k$ for some $k$, as follows:

$$\hat{y} = \underset{k \in \{1, \dots, K\}}{\operatorname{argmax}} \; p(C_k) \prod_{i=1}^{n} p(x_i \mid C_k)$$
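As a concrete illustration, the sketch below applies this MAP decision rule to a hypothetical two-class model with binary features. The class names, prior probabilities, and likelihood tables are made-up placeholders for the example, not values from the text; in practice they would be estimated from training data.

```python
import math

# Hypothetical model: two classes with binary features.
priors = {"spam": 0.4, "ham": 0.6}  # p(C_k)

# likelihoods[c][i] = p(x_i = 1 | C_k) for feature i
likelihoods = {
    "spam": [0.8, 0.1, 0.6],
    "ham":  [0.2, 0.4, 0.3],
}

def classify(x):
    """Return the MAP class: argmax_k p(C_k) * prod_i p(x_i | C_k).

    Log-probabilities are summed instead of multiplying raw
    probabilities, which avoids numerical underflow when the
    number of features is large.
    """
    best_class, best_score = None, -math.inf
    for c, prior in priors.items():
        score = math.log(prior)
        for p_one, x_i in zip(likelihoods[c], x):
            # Bernoulli likelihood: p_one if x_i == 1, (1 - p_one) otherwise.
            score += math.log(p_one if x_i == 1 else 1.0 - p_one)
        if score > best_score:
            best_class, best_score = c, score
    return best_class

print(classify([1, 0, 1]))  # -> "spam" under these illustrative parameters
```

Working in log space changes nothing about which class wins, since the logarithm is monotonically increasing, so the argmax of the log-posterior equals the argmax of the posterior itself.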