Support Vector Machine - Soft Margin

Soft Margin

In 1995, Corinna Cortes and Vladimir N. Vapnik suggested a modified maximum margin idea that allows for mislabeled examples. If there exists no hyperplane that can split the "yes" and "no" examples, the Soft Margin method will choose a hyperplane that splits the examples as cleanly as possible, while still maximizing the distance to the nearest cleanly split examples. The method introduces non-negative slack variables \xi_i, which measure the degree of misclassification of the data point \mathbf{x_i}.

The objective function is then increased by a function which penalizes non-zero \xi_i, and the optimization becomes a trade-off between a large margin and a small error penalty. If the penalty function is linear, the optimization problem becomes:

\min_{\mathbf{w},\mathbf{\xi}, b } \left \{ \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^n \xi_i \right \}

subject to (for any i = 1, \dots, n)

y_i(\mathbf{w}\cdot\mathbf{x_i} - b) \ge 1 - \xi_i, \quad \xi_i \ge 0
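This constrained problem is equivalent to unconstrained minimization of the hinge loss, since at the optimum each slack takes the value \xi_i = \max(0, 1 - y_i(\mathbf{w}\cdot\mathbf{x_i} - b)). A minimal NumPy sketch of that equivalent formulation, using subgradient descent on a hypothetical toy dataset (the blob parameters, C = 1, and the step size are illustrative choices, not values from the text):

```python
import numpy as np

# Toy 2-D dataset (assumed for illustration): two overlapping Gaussian blobs,
# labels y in {-1, +1}, so no hyperplane separates them perfectly.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.8, size=(20, 2)),
               rng.normal(+1.0, 0.8, size=(20, 2))])
y = np.array([-1] * 20 + [+1] * 20)

C = 1.0        # error-penalty weight (hypothetical choice)
lr = 0.01      # step size
w = np.zeros(2)
b = 0.0

# Subgradient descent on  (1/2)||w||^2 + C * sum_i max(0, 1 - y_i (w.x_i - b)),
# where the hinge term equals the optimal slack xi_i for the given (w, b).
for _ in range(2000):
    margins = y * (X @ w - b)
    viol = margins < 1                                  # points with xi_i > 0
    grad_w = w - C * (y[viol][:, None] * X[viol]).sum(axis=0)
    grad_b = C * y[viol].sum()
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean(np.sign(X @ w - b) == y)
print(f"training accuracy: {accuracy:.2f}")
```

Because the penalty is linear, points with \xi_i > 1 (on the wrong side of the hyperplane) are tolerated rather than forbidden, with C controlling how heavily they are penalized.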

This constraint, along with the objective of minimizing \|\mathbf{w}\|, can be solved using Lagrange multipliers as done above. One has then to solve the following problem:

\min_{\mathbf{w},\mathbf{\xi}, b } \max_{\boldsymbol{\alpha},\boldsymbol{\beta} }
\left \{ \frac{1}{2}\|\mathbf{w}\|^2
+ C \sum_{i=1}^n \xi_i
- \sum_{i=1}^{n} \alpha_i \left[ y_i(\mathbf{w}\cdot\mathbf{x_i} - b) - 1 + \xi_i \right]
- \sum_{i=1}^{n} \beta_i \xi_i \right \}

with \alpha_i, \beta_i \ge 0.
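Setting the partial derivatives of this Lagrangian to zero gives the standard stationarity conditions (a routine step, written here with the same symbols as above):

\frac{\partial}{\partial \mathbf{w}} = 0 \;\Rightarrow\; \mathbf{w} = \sum_{i=1}^n \alpha_i y_i \mathbf{x_i},
\qquad
\frac{\partial}{\partial b} = 0 \;\Rightarrow\; \sum_{i=1}^n \alpha_i y_i = 0,
\qquad
\frac{\partial}{\partial \xi_i} = 0 \;\Rightarrow\; \alpha_i + \beta_i = C.

Since \beta_i \ge 0, the last condition amounts to the box constraint 0 \le \alpha_i \le C, which is the only change the soft margin makes to the hard-margin dual.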
