Support Vector Machine - Soft Margin

Soft Margin

In 1995, Corinna Cortes and Vladimir N. Vapnik suggested a modified maximum margin idea that allows for mislabeled examples. If there exists no hyperplane that can split the "yes" and "no" examples, the Soft Margin method will choose a hyperplane that splits the examples as cleanly as possible, while still maximizing the distance to the nearest cleanly split examples. The method introduces non-negative slack variables \xi_i, which measure the degree of misclassification of the datum \mathbf{x}_i.

The objective function is then increased by a function which penalizes non-zero \xi_i, and the optimization becomes a trade-off between a large margin and a small error penalty. If the penalty function is linear, the optimization problem becomes:

\min_{\mathbf{w},\mathbf{\xi}, b } \left \{ \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^n \xi_i \right \}

subject to (for any i = 1, \dots, n):

y_i(\mathbf{x}_i \cdot \mathbf{w} - b) \ge 1 - \xi_i, \quad \xi_i \ge 0
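For a fixed hyperplane, the smallest slack satisfying the constraints is \xi_i = \max(0, 1 - y_i(\mathbf{x}_i \cdot \mathbf{w} - b)), so the penalized objective can be evaluated directly. A brief NumPy sketch (function and variable names are illustrative, not from the original text):

```python
import numpy as np

def soft_margin_objective(w, b, X, y, C):
    """Evaluate the linear-penalty soft-margin objective
    (1/2)||w||^2 + C * sum_i xi_i, where each slack variable
    xi_i = max(0, 1 - y_i (w . x_i - b)) is the smallest value
    satisfying y_i (w . x_i - b) >= 1 - xi_i, xi_i >= 0."""
    margins = y * (X @ w - b)
    xi = np.maximum(0.0, 1.0 - margins)   # slack variables
    return 0.5 * np.dot(w, w) + C * np.sum(xi)
```

Points outside the margin contribute zero slack; points inside the margin or misclassified contribute slack proportional to how far they violate the margin constraint.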

These constraints, together with the objective of minimizing \|\mathbf{w}\|, can be handled using Lagrange multipliers as done above. One then has to solve the following problem:

\min_{\mathbf{w},\mathbf{\xi}, b } \max_{\boldsymbol{\alpha},\boldsymbol{\beta} }
\left \{ \frac{1}{2}\|\mathbf{w}\|^2
+ C \sum_{i=1}^n \xi_i
- \sum_{i=1}^{n} \alpha_i \left[ y_i(\mathbf{x}_i \cdot \mathbf{w} - b) - 1 + \xi_i \right]
- \sum_{i=1}^{n} \beta_i \xi_i \right \}

with \alpha_i, \beta_i \ge 0.
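Rather than solving this saddle-point problem, the primal can also be minimized directly: substituting the optimal slack \xi_i = \max(0, 1 - y_i(\mathbf{x}_i \cdot \mathbf{w} - b)) turns it into an unconstrained hinge-loss objective amenable to subgradient descent. A minimal NumPy sketch under that reformulation (the function name, learning rate, and epoch count are illustrative assumptions):

```python
import numpy as np

def train_soft_margin(X, y, C=1.0, lr=0.01, epochs=300):
    """Full-batch subgradient descent on the primal objective
    (1/2)||w||^2 + C * sum_i max(0, 1 - y_i (w . x_i - b))."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w - b)
        viol = margins < 1.0   # points with nonzero slack
        # subgradient of the hinge term: -y_i x_i w.r.t. w, +y_i w.r.t. b
        grad_w = w - C * (y[viol][:, None] * X[viol]).sum(axis=0)
        grad_b = C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Only points that violate the margin (nonzero slack) contribute to the subgradient, mirroring the trade-off between the margin term \|\mathbf{w}\|^2 and the error penalty C \sum_i \xi_i.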
