Soft Margin
In 1995, Corinna Cortes and Vladimir N. Vapnik suggested a modified maximum margin idea that allows for mislabeled examples. If there exists no hyperplane that can split the "yes" and "no" examples, the Soft Margin method will choose a hyperplane that splits the examples as cleanly as possible, while still maximizing the distance to the nearest cleanly split examples. The method introduces slack variables, \xi_i, which measure the degree of misclassification of the datum x_i.
The objective function is then increased by a function which penalizes non-zero \xi_i, and the optimization becomes a trade-off between a large margin and a small error penalty. If the penalty function is linear, the optimization problem becomes:

\min_{\mathbf{w},\,\boldsymbol{\xi},\,b} \left\{ \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^n \xi_i \right\}
subject to (for any i = 1, \dots, n)

y_i(\mathbf{w}\cdot\mathbf{x}_i - b) \ge 1 - \xi_i, \quad \xi_i \ge 0.
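At the optimum the slack variables satisfy \xi_i = \max(0, 1 - y_i(\mathbf{w}\cdot\mathbf{x}_i - b)), so the primal can be rewritten without explicit constraints as a hinge-loss objective. The following is a minimal sketch of minimizing that form by subgradient descent (NumPy; the function name, learning rate, and toy data are illustrative assumptions, not from the original text):

```python
import numpy as np

def train_soft_margin_svm(X, y, C=1.0, lr=0.01, epochs=500):
    """Minimize (1/2)||w||^2 + C * sum(max(0, 1 - y_i*(w.x_i - b)))
    by batch subgradient descent on the hinge-loss form of the primal."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w - b)
        viol = margins < 1          # examples with non-zero slack
        # subgradients of the objective w.r.t. w and b
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# toy data: two Gaussian blobs with one deliberately flipped label
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
y[0] = 1  # mislabeled point: the soft margin tolerates it
w, b = train_soft_margin_svm(X, y)
acc = np.mean(np.sign(X @ w - b) == y)
```

Because the penalty is linear in the slack, a single mislabeled point costs only C times its slack instead of making the problem infeasible, which is exactly the behavior the Soft Margin method was introduced to provide.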
These constraints, along with the objective of minimizing \|\mathbf{w}\|, can be solved using Lagrange multipliers as done above. One has then to solve the following problem:

\max_{\boldsymbol{\alpha}} \left\{ \sum_{i=1}^n \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \mathbf{x}_i^T \mathbf{x}_j \right\}
with 0 \le \alpha_i \le C \text{ for all } i, \quad \sum_{i=1}^n \alpha_i y_i = 0.
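As a sketch of the dual side, here is projected gradient ascent on the box-constrained quadratic. The bias is fixed at b = 0 for simplicity, so the equality constraint \sum_i \alpha_i y_i = 0 drops out and only the box 0 \le \alpha_i \le C remains; the function name, step size, and toy data are illustrative assumptions:

```python
import numpy as np

def dual_soft_margin(X, y, C=1.0, lr=0.001, steps=2000):
    """Maximize sum(a) - 0.5 * a^T Q a subject to 0 <= a_i <= C
    (bias fixed at 0, so the equality constraint is omitted)."""
    Z = y[:, None] * X               # rows are y_i * x_i
    Q = Z @ Z.T                      # Q_ij = y_i y_j x_i . x_j
    a = np.zeros(len(y))
    for _ in range(steps):
        grad = 1.0 - Q @ a           # gradient of the dual objective
        a = np.clip(a + lr * grad, 0.0, C)  # project onto the box [0, C]^n
    w = Z.T @ a                      # recover w = sum_i a_i y_i x_i
    return a, w

# same toy data as before: two blobs with one flipped label
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
y[0] = 1                             # one mislabeled example
a, w = dual_soft_margin(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

Note how C enters the dual only through the upper bound on the multipliers: cleanly separated points end with \alpha_i = 0, free support vectors sit strictly inside the box, and misclassified points push their multipliers toward the upper bound C.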