Stanford University Machine Learning (Andrew Ng @ 2014) -- Self-Study Notes

Today I studied lecture 6-6, Advanced Optimization, of Andrew Ng's Machine Learning course; my notes are below:

Minimizing a cost function with the fminunc function takes two steps:

1. Define the cost function

% Cost function for fminunc: returns the cost jVal and its gradient vector.
% J(theta) = (theta(1)-5)^2 + (theta(2)-5)^2
function [jVal, gradient] = costFunction(theta)
  jVal = (theta(1)-5)^2 + (theta(2)-5)^2;
  gradient = zeros(2,1);
  gradient(1) = 2*(theta(1)-5);   % dJ/dtheta(1)
  gradient(2) = 2*(theta(2)-5);   % dJ/dtheta(2)
end
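
For reference, the gradient entries are just the partial derivatives of the cost: with J(theta) = (theta(1)-5)^2 + (theta(2)-5)^2, we get dJ/dtheta(1) = 2*(theta(1)-5) and dJ/dtheta(2) = 2*(theta(2)-5). Setting both to zero gives the minimum at theta = [5; 5], where J = 0, which is the answer fminunc should return in step 2.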

2. Set the options and call fminunc
>> options = optimset('GradObj', 'on', 'MaxIter', 100);
>> initialTheta = zeros(2,1);
>> [optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options)
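
Running these commands in Octave should converge in a few iterations. The output below is only indicative (exact values and formatting depend on the Octave/MATLAB version), but optTheta should land at the analytic minimizer derived above:

% optTheta    ~ [5.0000; 5.0000]   (the minimizer of the cost)
% functionVal ~ 0                  (cost at the minimum; a tiny floating-point value in practice)
% exitFlag    = 1                  (fminunc reports normal convergence)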

Screenshots of the lecture slides are attached below:

[Slide: cost function for neural-network classification]

[Slide: backpropagation algorithm]

posted @ 2015-11-18 20:21  Bob.Guo