Neural Networks: Learning
1. Some basic notation
2. The cost function
================Backpropagation Algorithm=============
1. What needs to be computed
![](http://img-my.csdn.net/uploads/201302/08/1360287661_9760.png)
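The two quantities on the slide are the regularized cost and its partial derivatives. Written out from the lecture's standard definitions (m training examples, K output units, L layers, s_l units in layer l):

```latex
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}
  \Big[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k
      + \big(1-y_k^{(i)}\big)\log\Big(1-\big(h_\Theta(x^{(i)})\big)_k\Big) \Big]
  + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}}
    \big(\Theta_{ji}^{(l)}\big)^2
```

Minimizing over \Theta requires computing both J(\Theta) itself and every partial derivative \partial J(\Theta)/\partial\Theta_{ij}^{(l)}.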
2. Forward-propagation diagram; to compute the partial derivatives above, the backpropagation algorithm is needed
![](http://img-my.csdn.net/uploads/201302/08/1360288380_5861.png)
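The forward pass in the diagram can be sketched in NumPy for a 3-layer network (one hidden layer). The function name and shapes here are illustrative, not from the course:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, Theta1, Theta2):
    """Forward pass for a 3-layer network.
    Theta1: (s2, n+1), Theta2: (K, s2+1); a bias unit of 1 is
    prepended to each layer's activations before multiplying."""
    a1 = np.concatenate(([1.0], x))            # input layer + bias
    z2 = Theta1 @ a1
    a2 = np.concatenate(([1.0], sigmoid(z2)))  # hidden layer + bias
    z3 = Theta2 @ a2
    a3 = sigmoid(z3)                           # output h_Theta(x)
    return a1, z2, a2, z3, a3
```

The intermediate z and a values are returned because the backward pass reuses them.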
3. The backpropagation algorithm
![](http://img-my.csdn.net/uploads/201302/08/1360289837_2848.png)
![](http://img-my.csdn.net/uploads/201302/08/1360290349_2747.png)
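The δ recurrence on the slides, sketched for a single training example (a hypothetical helper: the full algorithm accumulates these Δ matrices over all m examples, divides by m, and adds the regularization term for non-bias weights):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_one_example(x, y, Theta1, Theta2):
    """Backpropagation for one example in a 3-layer network.
    Returns this example's contributions Delta1, Delta2 to the
    gradient accumulators."""
    # forward pass
    a1 = np.concatenate(([1.0], x))
    z2 = Theta1 @ a1
    a2 = np.concatenate(([1.0], sigmoid(z2)))
    a3 = sigmoid(Theta2 @ a2)
    # backward pass: output error, then hidden-layer error
    delta3 = a3 - y                                # output layer
    delta2 = (Theta2[:, 1:].T @ delta3) * sigmoid(z2) * (1 - sigmoid(z2))
    # gradient contributions: Delta^{(l)} = delta^{(l+1)} a^{(l)T}
    Delta1 = np.outer(delta2, a1)
    Delta2 = np.outer(delta3, a2)
    return Delta1, Delta2
```

Note that the bias column of Theta2 is skipped (`Theta2[:, 1:]`) when propagating the error backwards, since the bias unit has no incoming error.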
4. A short quiz
![](http://img-my.csdn.net/uploads/201302/08/1360290592_8862.png)
==============Backpropagation Intuition==============
1. The forward and backward computations are structurally very similar
![](http://img-my.csdn.net/uploads/201302/08/1360303791_9559.png)
2. Considering just one example, the cost function simplifies
![](http://img-my.csdn.net/uploads/201302/08/1360304035_3064.png)
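With a single example and a single output unit, the cost reduces to the cross-entropy term; the slide's intuition is that it plays the same role as a squared error:

```latex
\mathrm{cost}(i) = -\,y^{(i)} \log h_\Theta\big(x^{(i)}\big)
  - \big(1-y^{(i)}\big)\log\Big(1 - h_\Theta\big(x^{(i)}\big)\Big)
  \;\approx\; \Big(h_\Theta\big(x^{(i)}\big) - y^{(i)}\Big)^2
```

so δ measures how much a node's activation should change to reduce this per-example cost.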
3. Computing the δ error terms backwards, layer by layer
![](http://img-my.csdn.net/uploads/201302/08/1360304589_4715.png)
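The backward recurrence starts at the output layer and works towards the input:

```latex
\delta^{(L)} = a^{(L)} - y, \qquad
\delta^{(l)} = \big(\Theta^{(l)}\big)^{T}\delta^{(l+1)} \odot g'\big(z^{(l)}\big),
\quad l = L-1,\dots,2
```

where g'(z^{(l)}) = a^{(l)} ⊙ (1 − a^{(l)}) for the sigmoid and ⊙ is elementwise multiplication; there is no \delta^{(1)}, since the inputs carry no error term.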
=======Implementation Note: Unrolling Parameters=======
1. Unrolling the parameters
![](http://img-my.csdn.net/uploads/201302/08/1360306972_1270.png)
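The course shows this with Octave's `Theta1(:)` and `reshape`; a NumPy equivalent (the shapes are illustrative):

```python
import numpy as np

# Weight matrices for a small network; optimizers such as fminunc
# expect a single flat parameter vector, so we unroll and re-roll.
Theta1 = np.ones((10, 11))   # hidden layer: 10 units, 10 inputs + bias
Theta2 = np.ones((1, 11))    # output layer: 1 unit, 10 hidden + bias

# unroll: concatenate the flattened matrices into one vector
thetaVec = np.concatenate([Theta1.ravel(), Theta2.ravel()])

# reshape back to recover the original matrices
Theta1_back = thetaVec[:110].reshape(10, 11)
Theta2_back = thetaVec[110:].reshape(1, 11)
```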
2. The learning algorithm
![](http://img-my.csdn.net/uploads/201302/08/1360307271_1026.png)
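In the lecture this is Octave's fminunc, given a costFunction that returns both the cost and the gradient of the unrolled vector. `scipy.optimize.minimize` with `jac=True` is an analogous Python choice; the toy quadratic cost below is mine, standing in for the neural-network cost:

```python
import numpy as np
from scipy.optimize import minimize

# Target vector for the toy cost J(theta) = ||theta - c||^2,
# whose minimizer is theta = c.
c = np.array([1.0, -2.0, 3.0])

def cost_and_grad(theta):
    """Returns (cost, gradient), the same pairing fminunc expects."""
    return np.sum((theta - c) ** 2), 2.0 * (theta - c)

# jac=True tells the optimizer the function returns the gradient too
res = minimize(cost_and_grad, x0=np.zeros(3), jac=True, method='CG')
```

For the real network, `cost_and_grad` would unroll theta into the Θ matrices, run forward and backward propagation, and return the unrolled gradient.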
============Gradient Checking====================
1. Numerical estimation of the gradient
![](http://img-my.csdn.net/uploads/201302/08/1360308451_8919.png)
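The two-sided difference from the slide, looped over each component of the unrolled parameter vector (a sketch; names are mine):

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    """Approximate each partial derivative of J at theta with the
    two-sided difference (J(theta+eps) - J(theta-eps)) / (2*eps)."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus = theta.copy();  plus[i] += eps
        minus = theta.copy(); minus[i] -= eps
        grad[i] = (J(plus) - J(minus)) / (2.0 * eps)
    return grad
```

Each component costs two full evaluations of J, which is why this is used only to check backpropagation, not to train.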
2. Approximating all of the gradients numerically
![](http://img-my.csdn.net/uploads/201302/08/1360308632_9597.png)
![](http://img-my.csdn.net/uploads/201302/08/1360308843_4503.png)
3. The essential reason for using backpropagation rather than numerical gradient computation
![](http://img-my.csdn.net/uploads/201302/08/1360310572_2352.png)
4. Implementation notes
![](http://img-my.csdn.net/uploads/201302/08/1360310625_8308.png)
============Random Initialization=============
1. Zero initialization is unsuitable for neural networks
![](http://img-my.csdn.net/uploads/201302/08/1360312970_4725.png)
2. Random initialization to break symmetry
![](http://img-my.csdn.net/uploads/201302/08/1360313186_2803.png)
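A NumPy sketch of the initialization on the slide: each weight is drawn uniformly from [-ε_init, ε_init], so no two hidden units start out identical (with zeros, every hidden unit would compute the same function and receive identical gradient updates). The shapes and the value 0.12 are illustrative:

```python
import numpy as np

eps_init = 0.12   # a common heuristic value, not mandated by the course
rng = np.random.default_rng(0)

# small random weights in [-eps_init, eps_init] break symmetry
Theta1 = rng.uniform(-eps_init, eps_init, size=(10, 11))
Theta2 = rng.uniform(-eps_init, eps_init, size=(1, 11))
```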
==========Putting It Together==============
1. More hidden layers mean more computation; using the same number of units in every hidden layer is a reasonable default.
![](http://img-my.csdn.net/uploads/201302/09/1360373142_6515.png)
2. The steps for training a neural network, which closely mirror those for regression; the key difference is computing the partial derivatives with backpropagation.
![](http://img-my.csdn.net/uploads/201302/09/1360373729_4414.png)
![](http://img-my.csdn.net/uploads/201302/09/1360374039_7863.png)
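The steps above can be put together in one sketch: random initialization, a forward and backward pass over all examples, and a gradient step. Plain batch gradient descent stands in for fminunc here, and all names and hyperparameters are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, hidden=4, alpha=2.0, iters=10000, seed=0):
    """Train a 1-hidden-layer network with batch gradient descent."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    T1 = rng.uniform(-0.12, 0.12, (hidden, n + 1))   # random init
    T2 = rng.uniform(-0.12, 0.12, (1, hidden + 1))
    Xb = np.hstack([np.ones((m, 1)), X])             # add bias column
    for _ in range(iters):
        # forward pass, vectorized over all m examples
        Z2 = Xb @ T1.T
        A2 = np.hstack([np.ones((m, 1)), sigmoid(Z2)])
        A3 = sigmoid(A2 @ T2.T)
        # backward pass: output error, then hidden-layer error
        d3 = A3 - y.reshape(-1, 1)
        d2 = (d3 @ T2[:, 1:]) * sigmoid(Z2) * (1 - sigmoid(Z2))
        # gradient step (unregularized, for brevity)
        T2 -= alpha / m * d3.T @ A2
        T1 -= alpha / m * d2.T @ Xb
    return T1, T2

# usage: XOR, a function no network without a hidden layer can represent
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
T1, T2 = train(X, y)
```

A gradient check against the numerical estimate from the previous section, run once before the loop and then disabled, is the standard sanity test for this code.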