Abstract:
# Batch gradient descent
for i in range(nb_epochs):
    params_grad = evaluate_gradient(loss_function, data, params)
    params = params - learning_rate * params_grad
# Stochastic gradien... (Read the full article)
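The pseudocode above computes the gradient over the entire dataset before each parameter update. A minimal runnable sketch, assuming a toy one-parameter least-squares problem (the `loss_grad` helper and synthetic `data` below are illustrative, not from the original article):

```python
import numpy as np

def evaluate_gradient(loss_grad, data, params):
    # Average the per-example gradients over the *entire* dataset;
    # this full pass per update is what makes it "batch" gradient descent.
    return np.mean([loss_grad(x, y, params) for x, y in data], axis=0)

# Toy problem: fit y = w * x with squared loss; the true w is 3.
def loss_grad(x, y, params):
    # d/dw of (w*x - y)^2
    return np.array([2.0 * (params[0] * x - y) * x])

data = [(x, 3.0 * x) for x in np.linspace(-1.0, 1.0, 21)]
params = np.array([0.0])
learning_rate = 0.5
nb_epochs = 100

for i in range(nb_epochs):
    params_grad = evaluate_gradient(loss_grad, data, params)
    params = params - learning_rate * params_grad

print(params)  # converges toward w = 3
```

Because every update uses the whole dataset, each step follows the exact gradient, at the cost of one full pass per update; stochastic gradient descent trades that exactness for cheaper per-example updates.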
posted @ 2016-04-05 04:02 罗兵