
Machine Learning No.4: Regularization

1. Underfit = High bias

  Overfit = High variance
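A minimal numerical sketch of the two failure modes (assuming NumPy; the data and polynomial degrees are hypothetical): a low-degree fit has high error on both training and test data (high bias), while a very high-degree fit drives training error to near zero but generalizes worse (high variance).

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a quadratic: y = x^2 + noise
x_train = np.linspace(-1, 1, 10)
y_train = x_train**2 + rng.normal(scale=0.05, size=x_train.size)
x_test = np.linspace(-1, 1, 100)
y_test = x_test**2

def fit_mse(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

# Degree 1 underfits: large error on both sets (high bias).
# Degree 9 interpolates the 10 training points: near-zero training error,
# but the fit is driven by noise (high variance).
for d in (1, 2, 9):
    tr, te = fit_mse(d)
    print(f"degree {d}: train MSE {tr:.4f}, test MSE {te:.4f}")
```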

2. Addressing overfitting:

  (1) Reduce the number of features.

    Manually select which features to keep.

    Use a model selection algorithm.

      Disadvantage: throws out some potentially useful information.

  (2) Regularization

    Keep all the features, but reduce the magnitude/values of the parameters θj.

    Works well when we have a lot of features, each of which contributes a bit to predicting y.

3. Regularization

The regularized cost function adds a penalty on the parameters:

  J(θ) = (1/(2m)) [ Σ (hθ(x(i)) − y(i))² + λ Σ θj² ]    (the penalty sum runs over j = 1, 2 ... n; θ0 is not penalized)

If λ is extremely large, every θj (j ≥ 1) is pushed toward 0, leaving hθ(x) ≈ θ0, so the hypothesis will be underfitting.
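The cost above can be sketched directly in NumPy (the function name and tiny data set are hypothetical; X carries a leading column of ones, and θ0 is excluded from the penalty):

```python
import numpy as np

def regularized_cost(theta, X, y, lam):
    """J(theta) = (1/2m) [ sum of squared errors + lam * sum(theta_j^2, j >= 1) ].

    X includes a leading column of ones; theta[0] is not penalized.
    """
    m = y.size
    errors = X @ theta - y
    penalty = lam * np.sum(theta[1:] ** 2)
    return (errors @ errors + penalty) / (2 * m)

# Hypothetical data where theta = [0, 1] fits perfectly:
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 2.0])
theta = np.array([0.0, 1.0])
print(regularized_cost(theta, X, y, lam=0.0))   # zero error, zero penalty
print(regularized_cost(theta, X, y, lam=10.0))  # same fit, but penalty raises the cost
```

With λ = 0 this reduces to the ordinary least-squares cost; a larger λ charges the same parameter vector a higher price.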

4. Gradient descent

Repeat {

  θ0 := θ0 − α (1/m) Σ (hθ(x(i)) − y(i)) x0(i)

  θj := θj (1 − α λ/m) − α (1/m) Σ (hθ(x(i)) − y(i)) xj(i)    (j = 1, 2 ... n)

}

Note that θ0 is updated without the regularization term, and (1 − α λ/m) < 1 shrinks each θj slightly on every iteration.
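The update loop above can be sketched in vectorized NumPy (the function name, step size, and data are hypothetical):

```python
import numpy as np

def gradient_descent(X, y, lam, alpha=0.1, iters=500):
    """Regularized gradient descent for linear regression.

    theta_0's update has no regularization term; for j >= 1:
    theta_j := theta_j * (1 - alpha*lam/m) - alpha*(1/m) * sum(errors * x_j)
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        errors = X @ theta - y              # h_theta(x(i)) - y(i) for all examples
        grad = X.T @ errors / m             # unregularized gradient
        grad[1:] += (lam / m) * theta[1:]   # penalty gradient, skipping theta_0
        theta -= alpha * grad
    return theta

# Hypothetical noiseless data from y = 1 + 2x; with lam = 0 the
# iterates approach [1, 2].
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 1.0 + 2.0 * np.arange(5.0)
print(gradient_descent(X, y, lam=0.0))
```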

5. Normal equation

  θ = (XᵀX + λM)⁻¹ Xᵀy

  where M is the (n+1)×(n+1) identity matrix with its top-left entry set to 0 (θ0 is not regularized).

If λ > 0, the matrix (XᵀX + λM) is always invertible.

If m <= n, XᵀX is non-invertible/singular, but using regularization avoids this problem.
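A sketch of the regularized normal equation (the function name and the deliberately underdetermined data are hypothetical), showing that λ > 0 makes the system solvable even when m <= n:

```python
import numpy as np

def normal_equation(X, y, lam):
    """theta = (X^T X + lam * M)^(-1) X^T y,
    where M is the identity with the (0,0) entry zeroed (theta_0 unpenalized).
    """
    n = X.shape[1]
    M = np.eye(n)
    M[0, 0] = 0.0
    return np.linalg.solve(X.T @ X + lam * M, X.T @ y)

# m = 1 example, n = 3 columns: X^T X is rank 1 and thus singular,
# but adding lam * M makes the system invertible.
X = np.array([[1.0, 2.0, 3.0]])   # leading 1 is the intercept column
y = np.array([6.0])
theta = normal_equation(X, y, lam=1.0)
print(theta)
```

Here the penalized coordinates are shrunk toward zero while the unpenalized intercept θ0 absorbs the fit.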

 

posted @ 2013-06-27 07:45 by ying_vincent