Post category: Hyperparameter Tuning (reposts)

Summary: Competing in a Data Science Contest Without Reading the Data. Machine learning competitions have become an extremely popular format for solving predicti... Read more
posted @ 2015-06-13 12:26 菜鸡一枚 Views(285) Comments(0) Recommended(0)
Summary: State of Hyperparameter Selection, by Daniel Saltiel. Historically, hyperparameter determination has been a woefully forgotten aspect of machine l... Read more
posted @ 2015-06-09 19:04 菜鸡一枚 Views(550) Comments(0) Recommended(0)
Summary: How to Evaluate Machine Learning Models, Part 4: Hyperparameter Tuning. In the realm of machine learning, hyperparameter tuning is a “meta” learning tas... Read more
posted @ 2015-05-28 19:21 菜鸡一枚 Views(1400) Comments(0) Recommended(0)
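The tuning loop that the article frames as a “meta” learning task can be sketched roughly as follows. This is not code from the reposted post, just a minimal illustration assuming scikit-learn's GridSearchCV, its bundled iris dataset, and arbitrary grid values.

```python
# Minimal sketch: hyperparameter tuning as an outer loop around ordinary training.
# Assumes scikit-learn is available; the grid values are arbitrary illustrations.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Inner task: fit an RBF-kernel SVM. Outer ("meta") task: search over C and gamma,
# scoring each candidate by 5-fold cross-validated accuracy.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)
```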
Summary: How to choose hyperparameters in machine learning algorithms: learning rate, regularization coefficient, and minibatch size. This post is part of Chapter 3 of an overview of Neural Networks and Deep Learning, and discusses how to pick initial hyperparameter values for machine learning algorithms. (The post will be updated over time.) Learning rate (η): when running the gradient descent algorithm... Read more
posted @ 2015-05-19 20:39 菜鸡一枚 Views(694) Comments(0) Recommended(0)
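As a rough illustration of the learning-rate discussion (not taken from the post), the toy gradient descent below shows how an η that is too small converges slowly while an η that is too large diverges; the quadratic objective and the specific η values are assumptions made only for this example.

```python
# Minimal sketch: effect of the learning rate eta on gradient descent
# applied to the one-dimensional quadratic f(w) = w^2 (so f'(w) = 2w).
def gradient_descent(eta, steps=20, w=5.0):
    for _ in range(steps):
        grad = 2 * w          # gradient of f at the current w
        w = w - eta * grad    # update rule: w <- w - eta * grad
    return w

# Small eta: slow progress. Moderate eta: quick convergence toward 0.
# eta > 1.0 here: the iterates overshoot and diverge.
for eta in (0.01, 0.1, 1.1):
    print(eta, gradient_descent(eta))
```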
Summary: Regularization methods: L1 and L2 regularization, dataset augmentation, and dropout. This post is part of Chapter 3 of an overview of Neural Networks and Deep Learning, and covers regularization methods commonly used in machine learning / deep learning algorithms. (The post will be updated over time.) Regularization methods prevent overfitting and improve generalization. When there is not enough training data, or... Read more
posted @ 2015-05-19 20:36 菜鸡一枚 Views(3123) Comments(0) Recommended(0)
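For reference, the L1 and L2 penalties named in the title amount to adding a weight-norm term to the training loss. The NumPy sketch below is an illustration with made-up data and an arbitrary λ, not code from the post.

```python
# Minimal sketch: squared-error loss plus an L1 or L2 penalty on the weights.
import numpy as np

def regularized_loss(w, X, y, lam=0.1, kind="l2"):
    residual = X @ w - y
    data_term = 0.5 * np.mean(residual ** 2)      # ordinary data-fitting term
    if kind == "l1":
        penalty = lam * np.sum(np.abs(w))         # L1: lambda * ||w||_1, encourages sparsity
    else:
        penalty = 0.5 * lam * np.sum(w ** 2)      # L2: (lambda/2) * ||w||_2^2, shrinks weights
    return data_term + penalty

# Made-up data just to show the call.
rng = np.random.default_rng(0)
X, y, w = rng.normal(size=(20, 3)), rng.normal(size=20), rng.normal(size=3)
print(regularized_loss(w, X, y, kind="l1"), regularized_loss(w, X, y, kind="l2"))
```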
Summary: @G_Auss: I have always felt that unsupervised learning with sparsity as the objective makes little sense. Sparse representation is a property of biological neural systems, but whether it is merely a by-product of the tasks the nervous system performs or an actual optimization objective has no supporting theory, so there is a gap in the reasoning. In practice, a sparsity objective only gives good results in the first layer, and layer-wise training has never succeeded on large datasets. @南大周志华: Sparse coding is one way of obtaining sparse representations. Read more
posted @ 2015-05-14 19:06 菜鸡一枚 Views(375) Comments(0) Recommended(0)
Summary: Overfitting & Regularization. The problem of overfitting: a common issue in machine learning or mathematical modeling is overfitting, which occurs when yo... Read more
posted @ 2015-05-14 18:57 菜鸡一枚 Views(524) Comments(0) Recommended(0)
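The classic symptom the post opens with, near-zero training error alongside poor held-out error, can be reproduced with the small polynomial-fitting sketch below; the sine target, noise level, and polynomial degrees are assumptions made only for illustration.

```python
# Minimal sketch: low-degree vs. high-degree polynomial fits to noisy samples of
# sin(2*pi*x); the high-degree fit has lower training error but higher test error.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 12)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.2, size=x_train.size)
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

for degree in (3, 11):
    fit = Polynomial.fit(x_train, y_train, degree)   # least-squares polynomial fit
    train_err = np.mean((fit(x_train) - y_train) ** 2)
    test_err = np.mean((fit(x_test) - y_test) ** 2)
    print(degree, round(train_err, 4), round(test_err, 4))
```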
Summary: A Brief Overview of Deep Learning. (This is a guest post by Ilya Sutskever on the intuition behind deep learning as well as some very useful practical adv... Read more
posted @ 2015-05-13 19:25 菜鸡一枚 Views(381) Comments(0) Recommended(0)