Article category - Deep Learning (reposts)

Abstract: [Perspective] An article on how to choose a good "learning rate". Reposted from: Zhuanzhi (专知); source: the Zhuanzhi content team (ed.). [Introduction] Recently, data scientist Hafidz Zulkifli published an article explaining the "learning rate" in deep learning, and how to use it to improve a model's performance and reduce training time. Starting from the learning rate, the author peels back the layers to help us understand deep learning in depth. Read full text
posted @ 2018-07-11 10:10 菜鸡一枚 Views(1029) Comments(0) Recommend(0)
Abstract: The little matter of the learning rate. 1. Learning Rate Finder. Deep learning models are typically trained by a stochastic gradient descent optimizer. There are many variat Read full text
posted @ 2018-07-11 09:56 菜鸡一枚 Views(4072) Comments(0) Recommend(0)
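The teaser above introduces the learning-rate finder. A minimal sketch of the idea, assuming a toy quadratic loss and illustrative names (`lr_finder`, `grad_fn` are not from the original posts): sweep the learning rate exponentially, take one SGD step per value, and record the loss; the largest rate at which the loss still falls steadily is a reasonable candidate.

```python
import numpy as np

def lr_finder(grad_fn, w0, lr_min=1e-5, lr_max=1.0, steps=50):
    """Sweep the learning rate exponentially from lr_min to lr_max,
    taking one SGD step per setting and recording the resulting loss."""
    lrs = np.geomspace(lr_min, lr_max, steps)
    losses = []
    w = np.array(w0, dtype=float)
    for lr in lrs:
        _, grad = grad_fn(w)
        w = w - lr * grad              # one SGD step at this learning rate
        losses.append(grad_fn(w)[0])   # loss reached after the step
    return lrs, np.array(losses)

# Toy quadratic loss L(w) = 0.5 * ||w||^2, whose gradient is w itself.
quad = lambda w: (0.5 * float(w @ w), w)
lrs, losses = lr_finder(quad, w0=[1.0, -2.0])
```

In practice one plots `losses` against `lrs` on a log scale and picks a rate just below where the curve stops decreasing.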
Abstract: Error backpropagation of different loss functions under different network architectures. Preface: we derive the derivatives of the mean squared error (MSE) and cross-entropy (CE) loss functions and check whether they still satisfy the error-backpropagation principle, and what the differences are. What does this look like in a generic network architecture? How is it derived for a CNN? And for an RNN (LSTM)? We discuss the multi-class case directly; readers can derive the binary case themselves. Read full text
posted @ 2018-07-02 19:53 菜鸡一枚 Views(716) Comments(0) Recommend(0)
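The comparison the teaser above refers to can be summarized, for a single sigmoid output with target y, as follows (a sketch of the standard single-output result, not the post's full multi-class derivation):

```latex
% Sigmoid output a = \sigma(z), target y.
% MSE: L_{\mathrm{MSE}} = \tfrac{1}{2}(a - y)^2
\frac{\partial L_{\mathrm{MSE}}}{\partial z} = (a - y)\,\sigma'(z) = (a - y)\,a(1 - a)
% CE: L_{\mathrm{CE}} = -\bigl[y \ln a + (1 - y)\ln(1 - a)\bigr]
\frac{\partial L_{\mathrm{CE}}}{\partial z} = a - y
```

The MSE gradient carries the extra factor $a(1-a)$, which vanishes when the sigmoid saturates, so learning stalls; the cross-entropy gradient does not, which is one reason CE is preferred for classification.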
Abstract: Transposed Convolution, Fractionally Strided Convolution or Deconvolution. The concept of deconvolution first appeared in Zeiler's 2010 paper Deconvolutional Networks, but it did not spec Read full text
posted @ 2017-05-16 11:04 菜鸡一枚 Views(197) Comments(0) Recommend(0)
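The "fractionally strided" view of transposed convolution mentioned above can be sketched in 1-D: insert stride−1 zeros between the input samples, then apply an ordinary convolution. A minimal illustration (the function name and stride value are illustrative, not from the original post):

```python
import numpy as np

def conv_transpose1d(x, k, stride=2):
    """Transposed (fractionally strided) convolution in 1-D:
    upsample x by zero-insertion, then convolve with kernel k."""
    up = np.zeros(len(x) * stride - (stride - 1))
    up[::stride] = x                       # zero-insertion upsampling
    return np.convolve(up, k, mode="full") # ordinary convolution

x = np.array([1.0, 2.0, 3.0])
k = np.array([1.0, 1.0])
y = conv_transpose1d(x, k, stride=2)  # output is longer than the input
```

With stride 2 the 3-sample input becomes a 6-sample output, showing how transposed convolution upsamples rather than downsamples.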
Abstract: Lessons learned from manually classifying CIFAR-10. CIFAR-10. Note, this post is from 2011 and slightly outdated in some places. Statistics. CIFAR-10 co Read full text
posted @ 2017-01-11 11:09 菜鸡一枚 Views(120) Comments(0) Recommend(0)
This post is password-protected.
posted @ 2016-10-23 19:14 菜鸡一枚 Views(0) Comments(0) Recommend(0)
Abstract: Preliminary Note on the Complexity of a Neural Network. This post is a preliminary note on the "complexity" of neural networks. It's a topic that has n Read full text
posted @ 2016-08-31 17:01 菜鸡一枚 Views(149) Comments(0) Recommend(0)
Abstract: Where does the Sigmoid in Logistic Regression come from? « A Note on the Graph Laplacian. Where does the Sigmoid in Log Read full text
posted @ 2016-06-05 20:14 菜鸡一枚 Views(176) Comments(0) Recommend(0)
Abstract: Deep Learning in Neural Networks: An Overview. Deep Learning in Neural Networks: An Overview – Schmidhuber 2014. What a wonderful treasure trove this pa Read full text
posted @ 2016-04-25 08:32 菜鸡一枚 Views(844) Comments(0) Recommend(0)
Abstract: How can I know if Deep Learning works better for a specific problem than SVM or random forest? If we tackle a supervised learning problem, my advice i Read full text
posted @ 2016-04-25 08:31 菜鸡一枚 Views(169) Comments(0) Recommend(0)
This post is password-protected.
posted @ 2016-04-09 19:30 菜鸡一枚 Views(0) Comments(0) Recommend(0)
Abstract: Machine Learning & Deep Learning resources (Chapter 2). 《Image Scaling using Deep Convolutional Neural Networks》 Introduction: image scaling using convolutional neural networks. 《Proceedings of The Read full text
posted @ 2016-04-08 10:10 菜鸡一枚 Views(526) Comments(0) Recommend(0)
Abstract: Machine Learning & Deep Learning resources (Chapter 1). 《Brief History of Machine Learning》 Introduction: a comprehensive article on the history of machine learning, covering the perceptron, neural networks, decision trees, SVM, and AdaBoost, through to random forests and D Read full text
posted @ 2016-04-08 10:07 菜鸡一枚 Views(940) Comments(0) Recommend(0)
Abstract: Compilation of Useful Deep Learning Resources. Note: this list is incomplete and open for updates. If you know some useful resource which has not been Read full text
posted @ 2016-04-08 09:54 菜鸡一枚 Views(154) Comments(0) Recommend(0)
This post is password-protected.
posted @ 2016-03-01 19:31 菜鸡一枚 Views(0) Comments(0) Recommend(0)
This post is password-protected.
posted @ 2016-02-20 21:32 菜鸡一枚 Views(1) Comments(0) Recommend(0)
This post is password-protected.
posted @ 2016-01-07 20:11 菜鸡一枚 Views(0) Comments(0) Recommend(0)
Abstract: Understanding Convolution in Deep Learning. Posted by Tim Dettmers on Mar 26, 2015 | 61 comments. Convolution is probably the most important concept in deep ... Read full text
posted @ 2015-12-20 18:10 菜鸡一枚 Views(389) Comments(0) Recommend(0)
Abstract: Top 10 Deep Learning Tips & Tricks. Deep Learning has been at the forefront of data science innovations throughout 2015. Dr. Arno Candel offers help thr... Read full text
posted @ 2015-12-15 20:17 菜鸡一枚 Views(463) Comments(0) Recommend(0)
Abstract: Deep Learning's Accuracy. Deep learning has knocked down one record after another on benchmark dataset after benchmark dataset since 2006. In many compe... Read full text
posted @ 2015-11-25 20:24 菜鸡一枚 Views(250) Comments(0) Recommend(0)
