An introduction to three kinds of support vector machines: the linearly separable support vector machine, the linear support vector machine, and the nonlinear support vector machine, together with kernel functions and a fast learning algorithm, the sequential minimal optimization (SMO) algorithm.
Linearly Separable SVM and Hard-Margin Maximization
Linearly Separable Support Vector Machine
Assume a training dataset that is linearly separable in the feature space, \(T = \{(x_1,y_1),(x_2,y_2),\cdots,(x_N,y_N)\}\), where \(x_i \in \mathcal{X} = R^n,y_i \in \mathcal{Y} = \{+1,-1\},i = 1,2,\cdots,N\)
The linearly separable SVM finds the optimal separating hyperplane by maximizing the margin; the solution is unique
Separating hyperplane:
\[\omega^* \cdot x + b^* = 0
\]
The corresponding classification decision function:
\[f(x) = sign(\omega^* \cdot x + b^*)
\]
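A minimal sketch of applying this decision function; the values of \(\omega^*\) and \(b^*\) below are made-up assumptions for illustration, not from the text:

```python
import numpy as np

# Hypothetical learned parameters (illustrative values only).
w_star = np.array([1.0, -1.0])
b_star = -1.0

def f(x):
    """Classification decision function f(x) = sign(w* . x + b*)."""
    return np.sign(w_star @ x + b_star)

print(f(np.array([3.0, 1.0])))  #  1.0 -> predicted class +1
print(f(np.array([0.0, 2.0])))  # -1.0 -> predicted class -1
```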
Functional Margin and Geometric Margin
- The functional margin of a hyperplane \((\omega,b)\) with respect to a sample point \((x_i,y_i)\) in the training set \(T\):
\[\hat{\gamma}_i = y_i(\omega \cdot x_i + b)
\]
- The functional margin of the hyperplane with respect to the training set: the minimum of the functional margins over all sample points:
\[\hat{\gamma} = \mathop{min}\limits_{i = 1,\cdots,N}\hat{\gamma}_i
\]
- Normalize the normal vector \(\omega\) of the hyperplane so that \(||\omega|| = 1\), where \(||\omega||\) is the \(L_2\) norm of \(\omega\); the functional margin then becomes the geometric margin. The geometric margin with respect to a sample point is:
\[\gamma_i = y_i\left(\frac{\omega}{||\omega||} \cdot x_i + \frac{b}{||\omega||}\right)
\]
and with respect to the training set:
\[\gamma = \mathop{min}\limits_{i = 1,\cdots,N}\gamma_i
\]
- If \(||\omega|| = 1\), the functional margin and the geometric margin are equal; in general \(\gamma = \frac{\hat{\gamma}}{||\omega||}\)
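A short sketch computing both margins for a toy dataset; the three points and the choice of \((\omega, b)\) below are assumptions for illustration:

```python
import numpy as np

# Toy linearly separable data: two positive points, one negative.
X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0])

# A hyperplane (w, b) that happens to separate them (not necessarily optimal).
w, b = np.array([1.0, 1.0]), -4.0

functional = y * (X @ w + b)                 # functional margin of each point
geometric = functional / np.linalg.norm(w)   # geometric margin of each point

# The margins with respect to the whole training set are the minima.
print(functional.min(), geometric.min())
```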
Margin Maximization
To find the separating hyperplane with the maximum geometric margin, pose the following constrained optimization problem; the constraints require that the geometric margin of each sample point be at least \(\gamma\):
\[\begin{aligned}
\mathop{max}\limits_{\omega,b}\ &\gamma \\
s.t.\ &y_i\left(\frac{\omega}{||\omega||} \cdot x_i + \frac{b}{||\omega||}\right) \geq \gamma,i = 1,2,\cdots,N
\end{aligned}
\]
Rewriting the geometric margin in terms of the functional margin, using \(\gamma = \frac{\hat{\gamma}}{||\omega||}\):
\[\begin{aligned}
\mathop{max}\limits_{\omega,b}\ &\frac{\hat{\gamma}}{||\omega||} \\
s.t.\ &y_i(\omega \cdot x_i + b) \geq \hat{\gamma},i = 1,2,\cdots,N
\end{aligned}
\]
Since the value of \(\hat{\gamma}\) does not affect the solution of the optimization problem, we may set it to 1; maximizing \(\frac{1}{||\omega||}\) is then equivalent to minimizing \(\frac{1}{2}||\omega||^2\), which gives the optimization problem of linearly separable SVM learning:
\[\begin{aligned}
\mathop{min}\limits_{\omega,b}\ &\frac{1}{2}||\omega||^2 \\
s.t.\ &y_i(\omega \cdot x_i + b) - 1 \geq 0,i = 1,2,\cdots,N
\end{aligned}
\]
A convex optimization problem is a constrained optimization problem of the form:
\[\begin{aligned}
\mathop{min}\limits_\omega\ &f(\omega) \\
s.t.\ &g_i(\omega) \leq 0,i = 1,2,\cdots,k \\
&h_i(\omega) = 0,i = 1,2,\cdots,l
\end{aligned}
\]
where the objective function \(f(\omega)\) and the constraint functions \(g_i(\omega)\) are continuously differentiable convex functions on \(R^n\), and the constraint functions \(h_i(\omega)\) are affine functions on \(R^n\) (a function \(f(x)\) is called affine if \(f(x) = a \cdot x + b\) with \(a \in R^n, b \in R, x \in R^n\))
When the objective function \(f(\omega)\) is a quadratic function and the constraint functions \(g_i(\omega)\) are affine, the convex optimization problem above becomes a convex quadratic programming problem
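In particular, the maximum margin problem derived above is a convex quadratic programming problem: taking \((\omega,b)\) as the optimization variable, it matches this template with
\[f(\omega) = \frac{1}{2}||\omega||^2,\quad g_i(\omega,b) = -(y_i(\omega \cdot x_i + b) - 1) \leq 0,\ i = 1,2,\cdots,N
\]
and no equality constraints \(h_i\): the objective is quadratic and every \(g_i\) is affine in \((\omega,b)\).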
Linearly Separable SVM Learning Algorithm: Maximum Margin Method
Input: a linearly separable training dataset \(T = \{(x_1,y_1),(x_2,y_2),\cdots,(x_N,y_N)\}\), where \(x_i \in \mathcal{X} = R^n,y_i \in \mathcal{Y} = \{+1,-1\},i = 1,2,\cdots,N\)
Output: the maximum margin separating hyperplane and the classification decision function
- Construct and solve the constrained optimization problem below to obtain the optimal solution \(\omega^*,b^*\):
\[\begin{aligned}
\mathop{min}\limits_{\omega,b}\ &\frac{1}{2}||\omega||^2 \\
s.t.\ &y_i(\omega \cdot x_i + b) - 1 \geq 0,i = 1,2,\cdots,N
\end{aligned}
\]
- From this, the separating hyperplane is:
\[\omega^* \cdot x + b^* = 0
\]
and the classification decision function:
\[f(x) = sign(\omega^* \cdot x + b^*)
\]
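As an illustration of this algorithm, a minimal sketch that solves the primal problem numerically on a toy dataset; the data and the use of scipy's SLSQP solver are assumptions of this note, not part of the algorithm itself:

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable training set.
X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0])

# Pack the decision variables as theta = (w_1, w_2, b).
def objective(theta):
    w = theta[:2]
    return 0.5 * w @ w                    # (1/2) ||w||^2

def constraints(theta):
    w, b = theta[:2], theta[2]
    return y * (X @ w + b) - 1.0          # y_i (w . x_i + b) - 1 >= 0

res = minimize(objective, x0=np.zeros(3), method="SLSQP",
               constraints=[{"type": "ineq", "fun": constraints}])
w_star, b_star = res.x[:2], res.x[2]
print(w_star, b_star)  # approximately w* = (0.5, 0.5), b* = -2 for this data
```

For this toy data the constraints are active at \((3,3)\) and \((1,1)\), which are therefore the support vectors in the sense defined next.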
If the training dataset \(T\) is linearly separable, the maximum margin separating hyperplane that separates all sample points completely and correctly exists and is unique (proof on p. 117; revisit it if the argument is hard to recall)
Support vectors: the sample points for which the constraint inequality holds with equality, \(y_i(\omega \cdot x_i + b) - 1 = 0\)
Margin: depends only on the normal vector \(\omega\) of the separating hyperplane and equals \(\frac{2}{||\omega||}\), the distance between the two margin boundaries \(\omega \cdot x + b = +1\) and \(\omega \cdot x + b = -1\)
The Dual Algorithm for Learning
First introduce Lagrange multipliers \(\alpha_i \geq 0,i = 1,2,\cdots,N\) and define the Lagrangian, where \(\alpha = (\alpha_1,\alpha_2,\cdots,\alpha_N)^T\) is the vector of Lagrange multipliers:
\[L(\omega,b,\alpha) = \frac{1}{2}||\omega||^2 - \sum_{i = 1}^N\alpha_iy_i(\omega \cdot x_i + b) + \sum_{i = 1}^N\alpha_i
\]
The dual problem is a max-min problem: first minimize over \(\omega,b\), then maximize over \(\alpha\):
\[\mathop{max}\limits_\alpha \mathop{min}\limits_{\omega,b}L(\omega,b,\alpha)
\]
- Find \(\mathop{min}\limits_{\omega,b}L(\omega,b,\alpha)\) by setting the partial derivatives with respect to \(\omega\) and \(b\) to zero:
\[\nabla_\omega L(\omega,b,\alpha) = \omega - \sum_{i = 1}^N\alpha_iy_ix_i = 0 \\
\nabla_b L(\omega,b,\alpha) = -\sum_{i = 1}^N\alpha_iy_i = 0
\]
which gives:
\[\omega = \sum_{i = 1}^N\alpha_iy_ix_i \\
\sum_{i = 1}^N\alpha_iy_i = 0
\]
Substituting these into the Lagrangian (the \(b\) term drops out because \(\sum_{i = 1}^N\alpha_iy_i = 0\)):
\[\begin{aligned}
L(\omega,b,\alpha)
&= \frac{1}{2}\sum_{i = 1}^N\sum_{j = 1}^N\alpha_i\alpha_jy_iy_j(x_i \cdot x_j) - \sum_{i = 1}^N\alpha_iy_i\left(\left(\sum_{j = 1}^N\alpha_jy_jx_j\right) \cdot x_i + b\right) + \sum_{i = 1}^N\alpha_i \\
&= -\frac{1}{2}\sum_{i = 1}^N\sum_{j = 1}^N\alpha_i\alpha_jy_iy_j(x_i \cdot x_j) + \sum_{i = 1}^N\alpha_i
\end{aligned}
\]
Therefore:
\[\mathop{min}\limits_{\omega,b}L(\omega,b,\alpha) = -\frac{1}{2}\sum_{i = 1}^N\sum_{j = 1}^N\alpha_i\alpha_jy_iy_j(x_i \cdot x_j) + \sum_{i = 1}^N\alpha_i
\]
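A quick numerical sanity check of this substitution; the data and multipliers below are arbitrary assumptions satisfying \(\alpha_i \geq 0\) and \(\sum_i\alpha_iy_i = 0\). At the stationary \(\omega\), the Lagrangian no longer depends on \(b\) and equals the dual objective:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha = np.array([0.3, 0.7, 0.5, 0.5])   # alpha_i >= 0, sum_i alpha_i y_i = 0

w = (alpha * y) @ X                      # stationarity: w = sum_i alpha_i y_i x_i

def lagrangian(b):
    """L(w, b, alpha) evaluated at the stationary w."""
    return 0.5 * w @ w - alpha @ (y * (X @ w + b) - 1)

# Dual objective: -(1/2) sum_ij alpha_i alpha_j y_i y_j (x_i . x_j) + sum_i alpha_i
Q = (y[:, None] * y[None, :]) * (X @ X.T)
dual = -0.5 * alpha @ Q @ alpha + alpha.sum()

print(lagrangian(0.0), lagrangian(3.7), dual)  # all three values coincide
```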
- Maximize \(\mathop{min}\limits_{\omega,b}L(\omega,b,\alpha)\) with respect to \(\alpha\):
\[\begin{aligned}
\mathop{max}\limits_{\alpha}\ &-\frac{1}{2}\sum_{i = 1}^N\sum_{j = 1}^N\alpha_i\alpha_jy_iy_j(x_i \cdot x_j) + \sum_{i = 1}^N\alpha_i \\
s.t.\ &\sum_{i = 1}^N\alpha_iy_i = 0 \\
& \alpha_i \geq 0,i = 1,2,\cdots,N
\end{aligned}
\]
Negating the objective to turn the maximization into a minimization yields the following equivalent dual optimization problem:
\[\begin{aligned}
\mathop{min}\limits_{\alpha}\ &\frac{1}{2}\sum_{i = 1}^N\sum_{j = 1}^N\alpha_i\alpha_jy_iy_j(x_i \cdot x_j) - \sum_{i = 1}^N\alpha_i \\
s.t.\ &\sum_{i = 1}^N\alpha_iy_i = 0 \\
& \alpha_i \geq 0,i = 1,2,\cdots,N
\end{aligned}
\]
Because the primal problem satisfies the conditions under which the primal and dual optimal values coincide, there exist \(\omega^*,b^*,\alpha^*\) such that \(\omega^*,b^*\) solve the primal problem and \(\alpha^*\) solves the dual problem, so solving the primal problem can be reduced to solving the dual problem
- Suppose the solution of the dual problem is \(\alpha^* = (\alpha^*_1,\alpha^*_2,\cdots,\alpha_N^*)^T\) and there exists an index \(j\) with \(\alpha_j^* \gt 0\); then the solution \(\omega^*,b^*\) of the primal problem can be recovered from \(\alpha^*\):
\[\omega^* = \sum_{i = 1}^N\alpha_i^*y_ix_i \\
b^* = y_j - \sum_{i = 1}^N\alpha_i^*y_i(x_i \cdot x_j)
\]
Proof:
By the KKT stationarity condition:
\[\nabla_\omega L(\omega^*,b^*,\alpha^*) = \omega^* - \sum_{i = 1}^N\alpha^*_iy_ix_i = 0
\]
we get:
\[\omega^* = \sum_{i = 1}^N\alpha_i^*y_ix_i
\]
At least one \(\alpha^*_j \gt 0\) exists: if all the multipliers were 0, then \(\omega^* = 0\), which is not a solution of the primal optimization problem (it yields no separating hyperplane). For this \(j\), the KKT complementary slackness condition \(\alpha^*_i(y_i(\omega^* \cdot x_i + b^*) - 1) = 0\) gives \(y_j(\omega^* \cdot x_j + b^*) - 1 = 0\), and since \(y^2_j = 1\):
\[y_j = \omega^* \cdot x_j + b^* \\
b^* = y_j - \omega^* \cdot x_j \\
b^* = y_j - \sum_{i = 1}^N\alpha_i^*y_i(x_i \cdot x_j)
\]
This determines \(\omega^*,b^*\)
The separating hyperplane is:
\[\sum_{i = 1}^N\alpha_i^*y_i(x \cdot x_i) + b^* = 0
\]
The classification decision function is:
\[f(x) = sign\left(\sum_{i = 1}^N\alpha_i^*y_i(x \cdot x_i) + b^*\right)
\]
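Putting the pieces together, a sketch that solves the dual problem numerically and recovers \(\omega^*\) and \(b^*\); the toy data and scipy's SLSQP solver are again assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0])
N = len(y)

# Gram matrix of inner products x_i . x_j, scaled by the labels.
Q = (y[:, None] * y[None, :]) * (X @ X.T)

def dual_objective(a):
    # (1/2) sum_ij a_i a_j y_i y_j (x_i . x_j) - sum_i a_i
    return 0.5 * a @ Q @ a - a.sum()

res = minimize(dual_objective, x0=np.zeros(N), method="SLSQP",
               bounds=[(0.0, None)] * N,                        # alpha_i >= 0
               constraints=[{"type": "eq", "fun": lambda a: a @ y}])
alpha = res.x

# Recover the primal solution from any component alpha_j > 0.
w_star = (alpha * y) @ X
j = int(np.argmax(alpha))
b_star = y[j] - (alpha * y) @ (X @ X[j])
print(alpha, w_star, b_star)  # roughly alpha = (0.25, 0, 0.25), w* = (0.5, 0.5), b* = -2
```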
Linearly Separable SVM Learning Algorithm: Dual Form
Input: a linearly separable training dataset \(T = \{(x_1,y_1),(x_2,y_2),\cdots,(x_N,y_N)\}\), where \(x_i \in \mathcal{X} = R^n,y_i \in \mathcal{Y} = \{+1,-1\},i = 1,2,\cdots,N\)
Output: the maximum margin separating hyperplane and the classification decision function
- Construct and solve the constrained optimization problem below to obtain the optimal solution \(\alpha^* = (\alpha^*_1,\alpha^*_2,\cdots,\alpha_N^*)^T\):
\[\begin{aligned}
\mathop{min}\limits_{\alpha}\ &\frac{1}{2}\sum_{i = 1}^N\sum_{j = 1}^N\alpha_i\alpha_jy_iy_j(x_i \cdot x_j) - \sum_{i = 1}^N\alpha_i \\
s.t.\ &\sum_{i = 1}^N\alpha_iy_i = 0 \\
& \alpha_i \geq 0,i = 1,2,\cdots,N
\end{aligned}
\]
- Compute:
\[\omega^* = \sum_{i = 1}^N\alpha_i^*y_ix_i
\]
- Choose a component \(\alpha_j^* \gt 0\) of \(\alpha^*\) and compute:
\[b^* = y_j - \sum_{i = 1}^N\alpha_i^*y_i(x_i \cdot x_j)
\]
- Obtain the separating hyperplane and the classification decision function:
\[\sum_{i = 1}^N\alpha_i^*y_i(x \cdot x_i) + b^* = 0 \\
f(x) = sign\left(\sum_{i = 1}^N\alpha_i^*y_i(x \cdot x_i) + b^*\right)
\]
The inputs \(x_i\) of the training samples \((x_i,y_i)\) with \(\alpha^*_i \gt 0\) are called support vectors. By the KKT complementary slackness condition, \(y_i(\omega^* \cdot x_i + b^*) - 1 = 0\) for these points, so every support vector lies exactly on a margin boundary
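In practice, a library QP solver exposes exactly these dual quantities. A sketch with scikit-learn (an assumption of this note, not something the text prescribes), approximating the hard margin by a very large penalty \(C\):

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[3.0, 3.0], [4.0, 3.0], [1.0, 1.0]])
y = np.array([1, 1, -1])

# Hard margin approximated by a very large C.
clf = SVC(kernel="linear", C=1e10).fit(X, y)

print(clf.support_)                # indices i with alpha_i > 0
print(clf.support_vectors_)        # the support vectors themselves
print(clf.coef_, clf.intercept_)   # w* and b*

# Support vectors sit on the margin boundaries: y_i (w* . x_i + b*) = 1.
w, b = clf.coef_[0], clf.intercept_[0]
print(y[clf.support_] * (X[clf.support_] @ w + b))  # all approximately 1
```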