Post category: Deep Learning
Summary: ## [VGG](http://www.robots.ox.ac.uk/~vgg/publications/) - [Andrea Vedaldi](http://www.robots.ox.ac.uk/~vedaldi/) ## [Berkeley](http://bvlc.eecs.berkeley...
Summary:
- vanishing gradient problem
- multi-dimensional LSTM
Summary:
- `cross entropy` loss is not quite the same as optimizing classification accuracy, although the two are correlated.
- It's not necessarily true tha...
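The point in the summary above can be made concrete with a minimal sketch (the function name and probabilities are my own, not from the post): two classifiers that both predict the correct class have identical accuracy, yet very different cross-entropy loss, because cross-entropy also rewards confidence.

```python
import math

def cross_entropy(p_true_class):
    """Negative log-likelihood assigned to the correct class."""
    return -math.log(p_true_class)

# Both classifiers get the example right (same accuracy),
# but the confident one incurs far less cross-entropy loss.
confident = cross_entropy(0.99)  # small loss
hesitant = cross_entropy(0.51)   # much larger loss
assert confident < hesitant
```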
Summary: The tutorial from Andrew Ng's group is really accessible; today I finally understood the autoencoder. In essence, what an autoencoder does is dimensionality reduction. What struck me most was using the KL divergence (\ref{kl}) as a constraint to achieve sparsity: both $\rho$ and $\hat{\rho}$ are treated as distributions, which reflects statistical thinking nicely. It plays a role much like the L1 norm, and I suspect its analytic properties are even better than the L1 norm's!
\begin{equation}\label{kl}
\mathbf{KL}(\rho || \hat\rho_j) = \rho \log \frac{\rho}{\hat\rho_j} + (1 - \rho) \log \frac{1 - \rho}{1 - \hat\rho_j}
\end{equation}
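The sparsity penalty described above can be sketched directly from the equation: it is the KL divergence between two Bernoulli distributions with means $\rho$ (the sparsity target) and $\hat\rho_j$ (the average activation of hidden unit $j$). A minimal sketch, with function and variable names of my own choosing:

```python
import math

def kl_sparsity(rho, rho_hat):
    """KL divergence between Bernoulli(rho) and Bernoulli(rho_hat),
    used as a per-unit sparsity penalty on a hidden layer."""
    return (rho * math.log(rho / rho_hat)
            + (1 - rho) * math.log((1 - rho) / (1 - rho_hat)))

# The penalty vanishes when the average activation hits the target...
assert abs(kl_sparsity(0.05, 0.05)) < 1e-12
# ...and grows monotonically as the activation drifts away from it.
assert kl_sparsity(0.05, 0.2) < kl_sparsity(0.05, 0.5)
```

Unlike an L1 penalty, this term is smooth in $\hat\rho_j$ away from the boundaries, which is the "better analytic properties" intuition in the note above.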