Loss and its Gradient
Typical Loss
- Mean Squared Error
- Cross Entropy Loss
  - binary
  - multi-class
  - usually combined with softmax (see the sketch after this list)
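
As a minimal sketch of the losses listed above, assuming PyTorch is the framework in use (the notes do not name one), the standard functional calls are:

```python
import torch
import torch.nn.functional as F

# Mean Squared Error: mean of (pred - target)^2 over all elements
pred = torch.randn(4, 1)
target = torch.randn(4, 1)
mse = F.mse_loss(pred, target)

# Binary cross entropy on raw logits
logits_bin = torch.randn(4)
labels_bin = torch.empty(4).random_(2)      # 0/1 targets
bce = F.binary_cross_entropy_with_logits(logits_bin, labels_bin)

# Multi-class cross entropy: F.cross_entropy applies log-softmax internally,
# i.e. the "combined with softmax" case noted above
logits_mc = torch.randn(4, 10)              # 4 samples, 10 classes
labels_mc = torch.randint(0, 10, (4,))
ce = F.cross_entropy(logits_mc, labels_mc)

print(mse.item(), bce.item(), ce.item())
```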
MSE
- \(loss=\sum[y-(xw+b)]^2\)
- \(L_2\text{-}norm=\|y-(xw+b)\|_2\)
- \(loss=\|y-(xw+b)\|_2^2\)
\(loss=\sum[y-f_\theta(x)]^2\)
\(\frac{\partial loss}{\partial\theta}=2\sum[y-f_\theta(x)]\cdot\frac{\partial[y-f_\theta(x)]}{\partial\theta}=-2\sum[y-f_\theta(x)]\cdot\frac{\partial f_\theta(x)}{\partial\theta}\)
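
A minimal sketch checking this gradient formula, assuming a linear model \(f_\theta(x)=xw+b\) with scalar \(w,b\) and PyTorch autograd (both are illustrative choices, not stated in the notes):

```python
import torch

# Toy data and a linear model f_theta(x) = x*w + b
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.1, 5.9])
w = torch.tensor(1.0, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

pred = x * w + b
loss = ((y - pred) ** 2).sum()              # loss = sum [y - f_theta(x)]^2

# Gradient via autograd
grad_w, grad_b = torch.autograd.grad(loss, [w, b])

# Analytic gradient from the formula above:
#   d loss / d w = -2 * sum [y - f(x)] * x
#   d loss / d b = -2 * sum [y - f(x)]
with torch.no_grad():
    manual_w = -2 * ((y - pred) * x).sum()
    manual_b = -2 * (y - pred).sum()

print(grad_w.item(), manual_w.item())       # the two values should match
print(grad_b.item(), manual_b.item())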
