Abstract:
1) The advantage of deep learning over traditional methods
First, a very intuitive plot: as the amount of training data grows, traditional methods quickly hit a ceiling, while deep learning keeps improving. In the Q&A session a student asked whether performance would improve indefinitely; Andrew Ng candidly answered that it depends on the problem at hand, and that every method has a ceiling. Read more
posted @ 2015-04-21 23:11 张旭龙 · Views (180) · Comments (0) · Recommendations (0)
Abstract:
Reflections on Andrew Ng's talk at Tsinghua
(2013-03-26 23:05:40)
Andrew Ng gave a talk at Tsinghua today; here are my understanding of and thoughts on a few of the key points. Read more
posted @ 2015-04-21 23:05 张旭龙 · Views (208) · Comments (0) · Recommendations (0)
Abstract:
How exactly does a "deep neural network" work? Read more
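The linked post explains this in depth; as a bare-bones illustration only (my own sketch, not code from the post), a deep network is a stack of affine maps interleaved with nonlinearities, and "working" at inference time means propagating an input through that stack:

```python
import numpy as np

def relu(x):
    # Elementwise rectifier, the nonlinearity between layers
    return np.maximum(0.0, x)

def forward(x, layers):
    """Propagate x through a list of (W, b) layers.

    Hidden layers apply an affine map followed by ReLU; the
    number of stacked layers is what makes the network "deep".
    """
    h = x
    for W, b in layers[:-1]:
        h = relu(W @ h + b)
    W_out, b_out = layers[-1]
    return W_out @ h + b_out  # raw scores; apply softmax for classification

# Toy sizes chosen for illustration: 4 inputs -> 8 -> 8 -> 3 outputs
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((8, 4)), np.zeros(8)),
          (rng.standard_normal((8, 8)), np.zeros(8)),
          (rng.standard_normal((3, 8)), np.zeros(3))]
print(forward(rng.standard_normal(4), layers))
```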
posted @ 2015-04-21 22:45 张旭龙 · Views (141) · Comments (0) · Recommendations (0)
Abstract:
Unsupervised learning has drawn much attention in recent years and has been applied to problems in computer vision, audio classification, NLP, and beyond; features learned by a machine without supervision usually yield markedly better accuracy than those produced by other training approaches. This post gives an introductory overview of Andrew Ng's work on unsupervised learning, following his video "Unsupervised Feature Learning by Andrew Ng".
Keywords: unsupervised learning, feature extraction, feature learning, Sparse Coding, Sparse DBN, Sparse Matrix, Computer Vision, Audio Classification, NLP. Read more
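As a rough companion to the sparse-coding keyword above (my own sketch, not material from the post or the talk; the dictionary, lambda, and iteration count are illustrative assumptions), this infers a sparse code for a signal via ISTA, minimizing ||x - Da||^2 / 2 + lambda * ||a||_1:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the L1 penalty
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(x, D, lam=0.1, n_iters=100):
    """Infer a sparse code a for x under dictionary D via ISTA."""
    L = np.linalg.norm(D, 2) ** 2       # Lipschitz constant of the smooth part
    a = np.zeros(D.shape[1])
    for _ in range(n_iters):
        a = soft_threshold(a + (D.T @ (x - D @ a)) / L, lam / L)
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))       # overcomplete dictionary
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x = 2.0 * D[:, 3] - 1.5 * D[:, 17]      # signal built from two atoms
a = sparse_code(x, D)
print(np.nonzero(np.abs(a) > 1e-3)[0])  # recovers mostly atoms 3 and 17
```

In full sparse coding the dictionary D is itself learned by alternating this inference step with dictionary updates; the sketch shows only the coding half.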
posted @ 2015-04-21 22:42 张旭龙 · Views (180) · Comments (0) · Recommendations (0)
Abstract:
Preliminary stuff that can be useful, depending on your background
About neural networks
About distributed representations
About learning distributed representations for words
About auto-encoders
Learning about relations between symbols
About Monte-Carlo methods
About graphical models
About Boltzmann machines and related energy-based models
About Products of Experts, Restricted Boltzmann Machines and Contrastive Divergence (a minimal CD-1 sketch follows this list)
About deep belief networks as such
Early version: the wake-sleep algorithm. Read more
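The RBM and contrastive-divergence items in this list are the most algorithmic; as a hedged illustration (my own sketch, not code from the reading list; sizes, learning rate, and sampling details are assumptions), one CD-1 update for a binary RBM:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_update(v0, W, b, c, lr=0.1):
    """One CD-1 step for a binary RBM.

    W: (n_visible, n_hidden) weights; b, c: visible and hidden biases.
    """
    # Positive phase: hidden probabilities driven by the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to a reconstruction
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Approximate log-likelihood gradient: data statistics minus model statistics
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)
    return W, b, c

# Toy usage with illustrative sizes
n_v, n_h = 6, 4
W = 0.01 * rng.standard_normal((n_v, n_h))
b, c = np.zeros(n_v), np.zeros(n_h)
v = rng.integers(0, 2, size=n_v).astype(float)
W, b, c = cd1_update(v, W, b, c)
```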
posted @ 2015-04-21 14:52 张旭龙 · Views (353) · Comments (0) · Recommendations (0)
Abstract:
Welcome to the Public web of LISA
NEW: a presentation by Yoshua Bengio, available from videolectures.net.
This website serves as an introduction to our research projects, ideas, papers, and datasets that we make available to the public. It is a complement to our publications, available there. Read more
posted @ 2015-04-21 14:50 张旭龙 · Views (132) · Comments (0) · Recommendations (0)
Abstract:
ICML 2009 Workshop on Learning Feature Hierarchies
June 18, 2009 in Montreal, Canada
REFERENCES
In the following, we first list some papers published since 2008, reflecting the new research activity since the last deep learning workshop, held at NIPS in December 2007, and then list some earlier papers as well. Read more
posted @ 2015-04-21 14:49 张旭龙 · Views (221) · Comments (0) · Recommendations (0)
Abstract:
Table Of Contents
Introduction to Deep Learning Algorithms
Depth
Motivations for Deep Architectures
Insufficient depth can hurt
The brain has a deep architecture
Cognitive processes seem deep
Breakthrough in Learning Deep Architectures
Introduction to Deep Learning Algorithms
See the following article for a recent survey of deep learning:
Yoshua Bengio, Learning Deep Architectures for AI, Foundations and Trends in Machine Learning. Read more
posted @ 2015-04-21 14:43 张旭龙 · Views (177) · Comments (0) · Recommendations (0)
