Regularization: Linear Regression

After regularization, the cost function of linear regression becomes

\[J(\theta) = \frac{1}{2m}\left[ \sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right)^2 + \lambda \sum_{j=1}^{n} \theta_j^2 \right]\]
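A minimal NumPy sketch of this cost, assuming `X` is an (m, n+1) design matrix whose first column is all ones, `y` is the target vector, and `theta` holds the parameters; the function name and signature are illustrative, not from the original post:

```python
import numpy as np

def regularized_cost(theta, X, y, lam):
    """Regularized linear-regression cost J(theta)."""
    m = len(y)
    residuals = X @ theta - y
    # theta[0] (the bias term) is excluded from the penalty, matching the formula.
    penalty = lam * np.sum(theta[1:] ** 2)
    return (residuals @ residuals + penalty) / (2 * m)
```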

The gradient descent algorithm then becomes

Repeat {

\[\theta_0 := \theta_0 - \alpha \left[ \frac{1}{m}\sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right) x_0^{(i)} \right]\]

\[\theta_j := \theta_j - \alpha \left[ \frac{1}{m}\sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m}\theta_j \right] \quad (j = 1, 2, \ldots, n)\]

}
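One possible implementation of these update rules, under the same assumptions about `X`, `y`, and `theta` as above; `alpha`, `lam`, and `num_iters` are hypothetical parameter names:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, lam, num_iters):
    """Batch gradient descent with the regularized update rule."""
    m = len(y)
    for _ in range(num_iters):
        error = X @ theta - y              # shape (m,)
        grad = (X.T @ error) / m           # unregularized gradient
        grad[1:] += (lam / m) * theta[1:]  # add the penalty term, skipping theta_0
        theta = theta - alpha * grad
    return theta
```

Note that theta_0 is updated without the penalty term, exactly as in the first equation above.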

The normal equation then becomes

\[\theta = \left( X^T X + \lambda \begin{bmatrix} 0 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix} \right)^{-1} X^T y\]

It can be shown that, after regularization (with \(\lambda > 0\)), the matrix inside the parentheses is invertible, even when \(X^T X\) itself is singular.
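A sketch of the closed-form solution, again assuming `X` includes the column of ones; `normal_equation` is an illustrative name:

```python
import numpy as np

def normal_equation(X, y, lam):
    """Regularized normal equation for linear regression."""
    n_plus_1 = X.shape[1]
    # Identity matrix with the top-left entry zeroed, so theta_0 is not regularized.
    L = np.eye(n_plus_1)
    L[0, 0] = 0
    # Solve (X^T X + lam * L) theta = X^T y rather than forming the inverse explicitly.
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)
```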
