Naive Bayes Classifier - Discussion

Discussion

Even though the far-reaching independence assumptions are often inaccurate, the naive Bayes classifier has several properties that make it surprisingly useful in practice. In particular, the decoupling of the class-conditional feature distributions means that each distribution can be independently estimated as a one-dimensional distribution. This helps alleviate problems stemming from the curse of dimensionality, such as the need for data sets that scale exponentially with the number of features. While naive Bayes often fails to produce a good estimate of the correct class probabilities, this may not be a requirement for many applications. For example, the naive Bayes classifier will make the correct MAP decision-rule classification as long as the correct class is more probable than any other class. This holds regardless of whether the probability estimate is slightly, or even grossly, inaccurate. In this manner, the overall classifier can be robust enough to ignore serious deficiencies in its underlying naive probability model. Other reasons for the observed success of the naive Bayes classifier are discussed in the literature cited below.
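
To make the decoupling and the MAP decision rule concrete, here is a minimal Gaussian naive Bayes sketch in Python. The data, class means, and helper names are illustrative assumptions, not taken from the original text. It shows each feature's class-conditional distribution being fit independently as a one-dimensional Gaussian, and a prediction that depends only on which class's unnormalized posterior is largest, so poorly calibrated probability estimates leave the predicted class unchanged as long as the correct class still scores highest.

```python
import numpy as np

# Hypothetical 2-class, 3-feature training data for the sketch.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(50, 3))   # class 0
X1 = rng.normal(loc=[2.0, 1.0, -1.0], scale=1.0, size=(50, 3))  # class 1
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

classes = np.unique(y)
priors, means, variances = {}, {}, {}
for c in classes:
    Xc = X[y == c]
    priors[c] = len(Xc) / len(X)
    # Decoupling: each feature's class-conditional distribution is estimated
    # independently as a one-dimensional Gaussian (a mean and a variance).
    means[c] = Xc.mean(axis=0)
    variances[c] = Xc.var(axis=0) + 1e-9  # small constant avoids division by zero

def log_posterior(x, c):
    """Unnormalized log posterior: log prior + sum of per-feature 1-D log likelihoods."""
    ll = -0.5 * np.sum(np.log(2 * np.pi * variances[c])
                       + (x - means[c]) ** 2 / variances[c])
    return np.log(priors[c]) + ll

def predict(x):
    # MAP decision rule: pick the class with the highest unnormalized posterior.
    # Only the argmax matters, so the prediction is robust to probability
    # estimates that are slightly, or even grossly, miscalibrated.
    return max(classes, key=lambda c: log_posterior(x, c))

print(predict(np.array([1.8, 0.9, -0.8])))  # expected: 1
print(predict(np.array([0.1, -0.2, 0.3])))  # expected: 0
```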
