II. Deep Learning Basics, Lecture 3: Regularization and Optimization
https://cs231n.github.io/optimization-1/
Regularization
Stochastic Gradient Descent
Momentum, AdaGrad, Adam
Learning rate schedules
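For the "Regularization" topic above, a minimal NumPy sketch of an L2 weight penalty added to a data loss. The function name, the example weights, and the regularization strength `reg` are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

def l2_regularized_loss(W, data_loss, reg=0.1):
    # Total loss = data loss + reg * sum of squared weights;
    # the penalty discourages large weights, preferring smaller, more
    # diffuse ones, which typically improves generalization.
    return data_loss + reg * np.sum(W * W)

W = np.array([[1.0, -2.0],
              [0.5,  0.0]])
total = l2_regularized_loss(W, data_loss=0.7, reg=0.1)  # 0.7 + 0.1 * 5.25 = 1.225
```

The gradient of the penalty term is simply `2 * reg * W`, so it adds a constant pull toward zero on every weight during the update.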
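Vanilla stochastic gradient descent repeatedly steps against the gradient. A toy sketch on the quadratic f(w) = (w - 3)^2, where the learning rate and step count are assumptions chosen for illustration:

```python
def sgd(grad_fn, w, lr=0.1, steps=100):
    # The basic gradient descent update: w <- w - lr * grad(w).
    # With minibatches, grad_fn would estimate the gradient from a sample.
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

# Gradient of f(w) = (w - 3)^2 is 2 * (w - 3); the iterate converges to 3.
w_final = sgd(lambda w: 2.0 * (w - 3.0), w=0.0)
```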
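The three update rules listed above differ in how they use gradient history: Momentum accumulates a velocity, AdaGrad scales each parameter's step by its accumulated squared gradients, and Adam combines both ideas with bias correction. A condensed sketch of one step of each; the hyperparameter defaults are common values assumed here, not quoted from the lecture:

```python
import numpy as np

def momentum_step(w, g, v, lr=0.01, mu=0.9):
    v = mu * v - lr * g                   # velocity accumulates past gradients
    return w + v, v

def adagrad_step(w, g, cache, lr=0.01, eps=1e-8):
    cache = cache + g * g                 # per-parameter sum of squared gradients
    return w - lr * g / (np.sqrt(cache) + eps), cache

def adam_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g             # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g         # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)             # bias correction for early steps (t starts at 1)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Each function returns the updated parameter together with its updated state, so a training loop would thread the state (`v`, `cache`, or `m, v, t`) through successive calls.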
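For the "Learning rate schedules" topic, two common schedules sketched as pure functions of the epoch. The decay factor and period in `step_decay` are illustrative assumptions:

```python
import math

def step_decay(lr0, epoch, drop=0.5, every=10):
    # Multiply the initial rate by `drop` once every `every` epochs.
    return lr0 * (drop ** (epoch // every))

def cosine_decay(lr0, epoch, total_epochs):
    # Smoothly anneal the rate from lr0 down to 0 over the whole run.
    return 0.5 * lr0 * (1.0 + math.cos(math.pi * epoch / total_epochs))
```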
This article is from cnblogs (博客园), author: JaxonYe. When reposting, please cite the original link: https://www.cnblogs.com/yechangxin/articles/16532141.html
Infringement will be pursued.
