gradient descent & ascent

Stochastic gradient descent is used to minimize a cost function $J(\theta)$:

$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j}J(\theta)$
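A minimal sketch of this update rule, assuming a least-squares cost $J(\theta) = \frac{1}{2m}\|X\theta - y\|^2$ on made-up data (the data, learning rate, and iteration count are illustrative choices, not from the post):

```python
import numpy as np

# Illustrative data: design matrix with a bias column, and targets y = x.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
theta = np.zeros(2)
alpha = 0.1  # learning rate

for _ in range(1000):
    grad = X.T @ (X @ theta - y) / len(y)  # dJ/dtheta for the least-squares cost
    theta = theta - alpha * grad           # theta_j := theta_j - alpha * dJ/dtheta_j

print(theta)  # approaches the least-squares solution [0, 1]
```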

Gradient ascent, by contrast, is used to maximize a likelihood function $l(\theta)$:

$\theta_j := \theta_j + \alpha \frac{\partial}{\partial \theta_j}l(\theta)$
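A minimal sketch of gradient ascent, assuming $l(\theta)$ is the log-likelihood of logistic regression, $l(\theta) = \sum_i \big[ y_i \log h_\theta(x_i) + (1 - y_i)\log(1 - h_\theta(x_i)) \big]$, whose gradient is $X^\top(y - h)$; the data and learning rate are again illustrative, not from the post:

```python
import numpy as np

# Illustrative binary-classification data: label is 1 when the feature is positive.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = np.zeros(2)
alpha = 0.1  # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(1000):
    h = sigmoid(X @ theta)    # predicted probabilities h_theta(x_i)
    grad = X.T @ (y - h)      # dl/dtheta of the log-likelihood
    theta = theta + alpha * grad  # theta_j := theta_j + alpha * dl/dtheta_j

print(theta)  # theta[1] > 0: larger feature values give higher probability of y = 1
```

Note the sign: the likelihood is climbed (plus sign), whereas the cost above is descended (minus sign); the two rules are otherwise identical.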

posted @ 2012-09-29 10:55  sidereal