Advanced Optimization

Advanced optimization algorithms, such as conjugate gradient, BFGS, and L-BFGS, can select the learning rate automatically and often converge faster than plain gradient descent.
Working through it in Octave: the example below minimizes J(θ) = (θ(1) − 5)^2 + (θ(2) − 5)^2, whose minimum is at θ(1) = θ(2) = 5, and supplies fminunc with both the cost value and its gradient.

>> function [jVal, gradient] = costFunction(theta)
jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;   % cost J(theta)
gradient = zeros(2, 1);                       % column vector of partial derivatives
gradient(1) = 2 * (theta(1) - 5);             % dJ/d(theta(1))
gradient(2) = 2 * (theta(2) - 5);             % dJ/d(theta(2))
end
>> options = optimset('GradObj', 'on', 'MaxIter', 100);   % use the supplied gradient; cap at 100 iterations
>> initialTheta = zeros(2,1);
>> [optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options)
optTheta =

   5
   5

functionVal = 7.8886e-31
exitFlag = 1
>>

costFunction is the cost function being minimized. fminunc searches for the θ that minimizes it: here it returns optTheta = (5, 5), a functionVal of essentially zero (7.8886e-31), and exitFlag = 1, which indicates the optimizer converged.
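The same pattern scales beyond this toy quadratic: write a cost function that returns both J(θ) and its gradient, then pass a handle to it to fminunc. Below is a minimal sketch of how a logistic regression cost could be plugged in; the names costFunctionLR, X (an m×n design matrix), y (an m×1 vector of 0/1 labels), and n are hypothetical, not from the original post.

function [jVal, gradient] = costFunctionLR(theta, X, y)
m = length(y);                                 % number of training examples
h = 1 ./ (1 + exp(-X * theta));                % sigmoid hypothesis h_theta(x)
jVal = (1/m) * sum(-y .* log(h) - (1 - y) .* log(1 - h));   % cross-entropy cost
gradient = (1/m) * (X' * (h - y));             % n-by-1 gradient vector
end

>> options = optimset('GradObj', 'on', 'MaxIter', 100);
>> initialTheta = zeros(n, 1);
>> [optTheta, functionVal, exitFlag] = fminunc(@(t) costFunctionLR(t, X, y), initialTheta, options);

The anonymous function @(t) costFunctionLR(t, X, y) is used because fminunc only varies θ; the training data X and y are captured from the workspace.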
