Support Vector Machine

The optimal margin classifier has the constraint $y^{(i)}(w^Tx^{(i)}+b) \geq 1$.

The $\ell_1$-regularized (soft-margin) form relaxes this with slack variables: $y^{(i)}(w^Tx^{(i)}+b) \geq 1 - \zeta_i$, with $\zeta_i \geq 0$.

These constraints are the conditions for incurring no training error: the left-hand side is the functional margin, so when a constraint is satisfied (with $\zeta_i = 0$ in the soft-margin case), the corresponding training example is classified correctly with a functional margin of at least 1.
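As a minimal sketch (with toy weights and data assumed here, not taken from the post), the functional margin check can be written as:

```python
import numpy as np

# Assumed toy parameters and data, for illustration only
w = np.array([1.0, 1.0])
b = -1.0
X = np.array([[2.0, 1.0], [0.0, 0.0], [3.0, 2.0]])
y = np.array([1, -1, 1])

# Functional margin y_i (w^T x_i + b) for each training point
functional_margins = y * (X @ w + b)
print(functional_margins)                      # -> [2. 1. 4.]

# Hard-margin constraint: every functional margin must be >= 1
print(bool(np.all(functional_margins >= 1)))   # -> True
```

Here every margin is at least 1, so this toy data incurs no training error under the hard-margin constraint.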

If, in addition, the objective $\min \frac{1}{2}\left\|w\right\|^2$ (or $\min \frac{1}{2}\left\|w\right\|^2 + C\sum_{i=1}^m\zeta_i$ in the soft-margin case) is achieved, then the confidence is maximized, since minimizing $\|w\|$ maximizes the geometric margin.

In conclusion, the constraints ensure no training error, while under those conditions the objective maximizes confidence (the geometric margin).
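The soft-margin objective can be sketched numerically by taking each slack as the smallest value that satisfies its constraint, $\zeta_i = \max(0,\, 1 - y^{(i)}(w^Tx^{(i)}+b))$. The weights, data, and $C$ below are assumed for illustration:

```python
import numpy as np

# Assumed toy parameters, data, and regularization constant C
w = np.array([1.0, 1.0])
b = -1.0
C = 1.0
X = np.array([[2.0, 1.0], [0.5, 0.0], [3.0, 2.0]])
y = np.array([1, -1, 1])

margins = y * (X @ w + b)                # functional margins
zeta = np.maximum(0.0, 1.0 - margins)    # smallest feasible slack per point

# Soft-margin objective: 1/2 ||w||^2 + C * sum(zeta_i)
objective = 0.5 * np.dot(w, w) + C * zeta.sum()
print(objective)                          # -> 1.5
```

The second point has functional margin 0.5, so it needs slack $\zeta = 0.5$; the objective trades that penalty against keeping $\|w\|$ small.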

posted @ 2012-10-02 23:17  sidereal