Batch Gradient Descent vs. Stochastic Gradient Descent

Gradient Descent is a method for minimizing a cost function $J(\theta)$ by repeatedly updating the parameters in the direction of the negative gradient:

$$\theta := \theta - \alpha \nabla_\theta J(\theta).$$

Batch gradient descent computes the gradient over the entire training set before each update, while stochastic gradient descent (SGD) updates the parameters using one training example at a time, trading noisier steps for much cheaper iterations.
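The difference between the two update schemes can be sketched on a linear least-squares problem. This is an illustrative toy example, not code from the original post; the function names, learning rate, and synthetic data are all made up for the demo (NumPy is assumed to be available).

```python
import numpy as np

def batch_gd(X, y, lr=0.1, epochs=200):
    """One update per epoch; gradient averaged over ALL examples."""
    theta = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ theta - y) / len(y)  # full-batch gradient
        theta -= lr * grad
    return theta

def sgd(X, y, lr=0.1, epochs=200, seed=0):
    """One update per example; gradient from a single random sample."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            grad = X[i] * (X[i] @ theta - y[i])  # single-example gradient
            theta -= lr * grad
    return theta

# Toy data: y = 2*x + 1 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
X = np.column_stack([x, np.ones_like(x)])
y = 2 * x + 1 + 0.01 * rng.normal(size=100)

print(batch_gd(X, y))  # both converge near [2, 1]
print(sgd(X, y))
```

Note that `batch_gd` performs 200 parameter updates in total, while `sgd` performs 200 × 100 = 20,000 much cheaper single-example updates; with a fixed learning rate, SGD oscillates around the minimum rather than settling exactly on it.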

posted @ 2014-08-21 13:06 普兒