Algorithm Study Notes

Advice on applying algorithms and problem-solving approaches: https://blog.csdn.net/han_xiaoyang/article/details/50469334

Jupyter basics: https://blog.csdn.net/WilsonSong1024/article/details/81252362
Jupyter magic commands: https://blog.csdn.net/sinat_22840937/article/details/80315378


NumPy compendium: https://blog.csdn.net/qq_31813549/article/details/79954437
NumPy matrix operations: https://www.cnblogs.com/chamie/p/4870078.html
Aggregation operations: https://blog.csdn.net/qq_34734683/article/details/79462025
arg operations (argmax, argsort, etc.): https://www.jianshu.com/p/9927d79a79c0
Comparisons in NumPy: https://www.jianshu.com/p/c2e3bf711a63
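To tie the NumPy links above together, a small sketch of aggregation, arg*, and comparison operations (the array values are an assumed toy example):

```python
import numpy as np

x = np.array([[3, 1, 4], [1, 5, 9]])

total = x.sum()            # aggregate over all elements
col_max = x.max(axis=0)    # column-wise maximum
flat_argmax = x.argmax()   # index of the max in the flattened array
order = np.argsort(x[0])   # indices that would sort the first row
mask = x > 3               # element-wise comparison gives a boolean array
big = x[mask]              # boolean mask selects the elements greater than 3
```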


matplotlib (basics): https://blog.csdn.net/cymy001/article/details/78344316

Math fundamentals summary: http://www.sohu.com/a/249497084_787107
Chi-square test: https://blog.csdn.net/ludan_xia/article/details/81737669?tdsourcetag=s_pcqq_aiomsg

First algorithm: KNN
KNN explained in detail: http://cuijiahua.com/blog/2017/11/ml_1_knn.html
Wrapping machine-learning algorithms scikit-learn style: https://www.jianshu.com/p/a6a1175cb5b4

Training datasets: https://blog.csdn.net/sparkapi/article/details/79761283
Understanding recall, precision, and accuracy: https://blog.csdn.net/colourful_sky/article/details/72810363
Accuracy and recall for classification problems: https://blog.csdn.net/Yan456jie/article/details/47022693

Hyperparameters: https://blog.csdn.net/shenxiaoming77/article/details/76849929

Efficient hyperparameter tuning with grid search: https://blog.csdn.net/JasonDing1354/article/details/50562522

Data normalization: https://blog.csdn.net/leiting_imecas/article/details/54986045
Why normalize: https://blog.csdn.net/wuxiaosi808/article/details/78059051

Iris classification (KNN): https://blog.csdn.net/qq_39422642/article/details/77618375

Data preprocessing in scikit-learn: https://blog.csdn.net/u010472823/article/details/53509658

A comprehensive look at KNN: https://blog.csdn.net/yexudengzhidao/article/details/81067412
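Putting the KNN links together, a minimal scikit-learn sketch that normalizes features and then grid-searches k; the iris dataset, split ratio, and candidate k values are illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Scaling lives inside the pipeline so it is re-fit on each CV fold
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("knn", KNeighborsClassifier()),
])
grid = GridSearchCV(pipe, {"knn__n_neighbors": [3, 5, 7, 9]}, cv=5)
grid.fit(X_train, y_train)
acc = grid.score(X_test, y_test)   # accuracy of the best k on held-out data
```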


Second algorithm: linear regression

Derivation series:
Linear regression: https://blog.csdn.net/yjp19871013/article/details/79827047
Cost function derivation: https://blog.csdn.net/qq_39494028/article/details/81611217
Gradient descent derivation: https://blog.csdn.net/u012421852/article/details/79558833
Normal equation derivation: https://blog.csdn.net/perfect_accepted/article/details/78383434
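The derivations above can be sketched as batch gradient descent on a simple linear model y = a·x + b, minimizing the mean squared error; the synthetic data and learning rate are assumptions for illustration:

```python
import numpy as np

# Synthetic data: y ≈ 2x + 1 with small noise (an assumed example)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 100)

a, b, eta = 0.0, 0.0, 0.01        # initial parameters and learning rate
for _ in range(5000):
    err = a * x + b - y            # residuals
    grad_a = 2 * np.mean(err * x)  # dJ/da for J = mean((ax+b-y)^2)
    grad_b = 2 * np.mean(err)      # dJ/db
    a -= eta * grad_a
    b -= eta * grad_b
```

After enough iterations a and b approach the true slope and intercept.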

Simple linear regression: https://blog.csdn.net/Wing_93/article/details/78916547
Least squares derivation: https://blog.csdn.net/winone361/article/details/50713088
Implementation + evaluation metrics: https://blog.csdn.net/Wing_93/article/details/78916547
Measuring results with R-squared: https://blog.csdn.net/qq_37610062/article/details/82532243
Ridge regression: https://blog.csdn.net/google19890102/article/details/27228279
Ridge regression vs. least squares: https://blog.csdn.net/dang_boy/article/details/78504258

LASSO:
The underlying problem: https://blog.csdn.net/xidianzhimeng/article/details/20856047
Implementation: https://blog.csdn.net/mousever/article/details/50513409

Covariance matrix: https://blog.csdn.net/shenziheng1/article/details/52955687
RMSE, MAE, SD (root mean square error, mean absolute error, standard deviation):
https://blog.csdn.net/cqfdcw/article/details/78173839
https://blog.csdn.net/wordwarwordwar/article/details/61446257
Three flavors of gradient descent: https://blog.csdn.net/UESTC_C2_403/article/details/74910107
Stochastic gradient descent: https://www.cnblogs.com/volcao/p/9144362.html
sklearn implementation: https://blog.csdn.net/wong2016/article/details/80712406
Gradient descent in sklearn, summarized: https://blog.csdn.net/sinat_23338865/article/details/80630918
Multiple linear regression: https://blog.csdn.net/itJed/article/details/77879002
Implementation: https://blog.csdn.net/LULEI1217/article/details/49386295
Normal equation solution (with derivation): https://blog.csdn.net/melon__/article/details/80589759
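The closed-form normal equation theta = (XᵀX)⁻¹Xᵀy, sketched on assumed synthetic data (solving the linear system rather than forming the inverse, which is numerically safer):

```python
import numpy as np

# Synthetic multi-feature data with a known coefficient vector (assumed)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_theta = np.array([1.5, -2.0, 0.5])
y = X @ true_theta + 3.0                 # intercept of 3, no noise

Xb = np.hstack([np.ones((200, 1)), X])   # prepend a bias column
theta = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)  # [intercept, coefficients...]
```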
Practical examples:
House price prediction: https://blog.csdn.net/hb707934728/article/details/70054620
https://www.cnblogs.com/pinard/p/6016029.html
Boston housing price prediction: https://blog.csdn.net/weixin_36627946/article/details/70240328?locationNum=10&fps=1

Gradient descent in detail: https://blog.csdn.net/Chenyukuai6625/article/details/74398043
Vectorization (with derivation): https://www.cnblogs.com/zhongmiaozhimen/p/6134609.html
Why vectorization works: https://blog.csdn.net/keypig_zz/article/details/80146914
Deeper dive: https://blog.csdn.net/wangjian1204/article/details/50284455
https://www.jianshu.com/p/c7e642877b0e

Dimensionality reduction series:
Understanding PCA: https://blog.csdn.net/xizhibei/article/details/7536022
PCA principles: https://blog.csdn.net/daaikuaichuan/article/details/53444639
Principal component analysis: https://blog.csdn.net/u013159040/article/details/45645729
Mapping high-dimensional data to low dimensions: http://www.cnblogs.com/volcao/p/9221515.html
PCA in scikit-learn: https://blog.csdn.net/xlinsist/article/details/51332074
PCA + KNN (MNIST handwriting recognition): https://www.cnblogs.com/princecoding/p/6043658.html
Visualizing MNIST with PCA: https://blog.csdn.net/u010099080/article/details/53560426
PCA summary (MNIST dataset): https://blog.csdn.net/Liukx940818/article/details/64922419
PCA image denoising example: https://blog.csdn.net/weixin_42039090/article/details/80518628
Feature extraction with PCA: https://blog.csdn.net/u010182633/article/details/45918737
Eigenfaces for face recognition: https://blog.csdn.net/zouxy09/article/details/45276053
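A minimal scikit-learn PCA sketch, using the bundled digits dataset as a stand-in for MNIST (an assumption for illustration); reducing to 95% retained variance and inverse-transforming back is the idea behind the denoising link above:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, _ = load_digits(return_X_y=True)     # 1797 images, 64 pixel features each

pca = PCA(n_components=0.95)            # keep enough components for 95% variance
X_low = pca.fit_transform(X)            # high-dimensional -> low-dimensional
X_back = pca.inverse_transform(X_low)   # map back: an approximate reconstruction
```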

Polynomial regression:
What polynomial regression is (theory): https://blog.csdn.net/fanfan4569/article/details/81273774
Introduction + principles + sklearn implementation: https://blog.csdn.net/weixin_42039090/article/details/80740330
Converting it to linear regression: https://blog.csdn.net/guoyunfei20/article/details/78552892
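"Converting to linear regression" in code: expand the features with PolynomialFeatures, then fit an ordinary linear model; the synthetic quadratic data below is an assumption for illustration:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Assumed data: y is quadratic in x, plus small noise
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(-3, 3, 100)).reshape(-1, 1)
y = 0.5 * x.ravel() ** 2 + x.ravel() + 2 + rng.normal(0, 0.1, 100)

# Degree-2 expansion turns [x] into [1, x, x^2]; the rest is linear regression
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(x, y)
r2 = model.score(x, y)   # R-squared of the fit
```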

Overfitting (causes, solutions, principles): https://blog.csdn.net/a819825294/article/details/51239686
Causes of and remedies for over- and underfitting: https://blog.csdn.net/qq_18254385/article/details/78428887

Diagnosing overfitting (learning curves):
A gentle take: https://blog.csdn.net/tdkcs/article/details/38318143
Identifying overfitting: https://blog.csdn.net/aliceyangxi1987/article/details/73598857
Plotting learning curves: https://blog.csdn.net/yangzhiyouvl/article/details/53955332
Including validation curves: https://blog.csdn.net/ChenVast/article/details/79257387
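A sketch of computing learning-curve data with scikit-learn's learning_curve utility; the decision-tree estimator and train-size grid are illustrative assumptions. A large gap between training and validation scores signals overfitting:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import learning_curve
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Train on growing subsets; score each size on both train and CV folds
sizes, train_scores, val_scores = learning_curve(
    DecisionTreeClassifier(random_state=0), X, y,
    train_sizes=np.linspace(0.2, 1.0, 5), cv=5)

train_mean = train_scores.mean(axis=1)  # one mean score per training size
val_mean = val_scores.mean(axis=1)
```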


Cross-validation (combating overfitting):
Why use it: https://blog.csdn.net/aliceyangxi1987/article/details/73532651
Summary of methods: https://blog.csdn.net/linkin1005/article/details/42869331
Choosing the right method: https://blog.csdn.net/yueguizhilin/article/details/77711789
scikit-learn implementation: https://blog.csdn.net/jasonding1354/article/details/50562513/
Python implementation: https://blog.csdn.net/Chaolei3/article/details/79270939
https://blog.csdn.net/Dream_angel_Z/article/details/47110077
In depth: https://blog.csdn.net/lhx878619717/article/details/49079785
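A minimal cross_val_score sketch (KNN on iris and 5 folds are assumed choices):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# 5-fold cross-validation: five accuracy scores, one per held-out fold
scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5)
mean_score = scores.mean()
```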


Training, test, and validation sets:
Why they are needed: https://blog.csdn.net/Neleuska/article/details/73193096
How to split them: https://blog.csdn.net/UESTC_C2_403/article/details/77745788
How the three relate: https://blog.csdn.net/Losteng/article/details/50766252
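One common way to get all three sets, sketched with two calls to train_test_split; the 60/20/20 ratio is an assumption:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)

# First split off the test set, then split the rest into train and validation
X_tmp, X_test, y_tmp, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X_tmp, y_tmp, test_size=0.25, random_state=0)  # 0.25 of 80% = 20% overall
```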


Bias-variance tradeoff: https://blog.csdn.net/qq_30490125/article/details/52401773
Model generalization: https://blog.csdn.net/tiankong_/article/details/78361496

 


Third algorithm: logistic regression
Plain-language introduction: https://blog.csdn.net/pakko/article/details/37878837
Detailed yet accessible: https://blog.csdn.net/wangbowj123/article/details/79332853
Full formula derivation: https://blog.csdn.net/u012421852/article/details/79614417
Logistic regression: https://www.2cto.com/kf/201611/566894.html
https://blog.csdn.net/ligang_csdn/article/details/53838743
The cost function is derived via maximum likelihood estimation.
Maximum likelihood estimation explained: https://blog.csdn.net/zengxiantao1994/article/details/72787849
Detailed gradient descent derivation: https://blog.csdn.net/wgdzz/article/details/48816307
Algorithm idea + implementation: https://blog.csdn.net/lanyanchenxi/article/details/77996271
Cost function in detail: https://blog.csdn.net/bitcarmanlee/article/details/51165444
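The sigmoid and the maximum-likelihood (cross-entropy) cost from the derivations above, with an assumed toy check that a confidently correct model scores a low cost:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J = -mean( y*log(h) + (1-y)*log(1-h) ), with h = sigmoid(X @ theta)
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

# Assumed toy data: a bias column plus one feature, two labeled points
X = np.array([[1.0, 2.0], [1.0, -2.0]])
y = np.array([1.0, 0.0])
good = cost(np.array([0.0, 5.0]), X, y)    # separates correctly -> low cost
bad = cost(np.array([0.0, -5.0]), X, y)    # separates wrongly -> high cost
```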

Decision boundaries:
Understanding: https://blog.csdn.net/xinzhi8/article/details/67639031/
Hands-on: https://blog.csdn.net/zhangyingchengqi/article/details/54808853
With vs. without polynomial features: https://www.cnblogs.com/volcao/p/9385930.html
scikit-learn in practice: https://github.com/AugusXJ/scikit_learn/tree/master/LogisticRegression
Library parameters: https://blog.csdn.net/sun_shengyun/article/details/53811483
Binary vs. multiclass classification: https://blog.csdn.net/u011734144/article/details/79717470

Evaluating classification results:
The accuracy trap and the confusion matrix: https://blog.csdn.net/qq_34374664/article/details/80358916
Precision and recall: https://blog.csdn.net/lz_peter/article/details/78133069
Implementing a confusion matrix: https://blog.csdn.net/xyisv/article/details/80456649
ROC curve: https://blog.csdn.net/u010707315/article/details/78950860
Picking the best threshold from the ROC curve: https://blog.csdn.net/sunxingxingtf/article/details/42751295
Confusion matrices for multiclass problems: https://blog.csdn.net/GarfieldEr007/article/details/51050195
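The metrics above on an assumed toy prediction:

```python
from sklearn.metrics import confusion_matrix, precision_score, recall_score

y_true = [0, 0, 0, 1, 1, 1, 1, 1]
y_pred = [0, 0, 1, 1, 1, 1, 0, 0]

cm = confusion_matrix(y_true, y_pred)        # rows: true class, cols: predicted
precision = precision_score(y_true, y_pred)  # TP / (TP + FP)
recall = recall_score(y_true, y_pred)        # TP / (TP + FN)
```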


Fourth algorithm: support vector machines (SVM)
The perceptron, explained plainly: https://www.jianshu.com/p/565ce8f237a6#
Getting to know the perceptron: https://blog.csdn.net/yxhlfx/article/details/79093456
An accessible primer: https://blog.csdn.net/v_JULY_v/article/details/7624837
Connected reading 1: https://blog.csdn.net/han_xiaoyang/article/details/52678373
Connected reading 2: https://blog.csdn.net/The_lastest/article/details/78513158?locationNum=9&fps=1
SVM through a story: https://blog.csdn.net/weixin_35909255/article/details/72902401

Interview question roundup: https://blog.csdn.net/szlcw1/article/details/52259668
Comprehensive introduction: https://blog.csdn.net/AMDS123/article/details/53696027
Intuitive understanding: https://blog.csdn.net/macyang/article/details/38782399
Derivation and solution: https://blog.csdn.net/American199062/article/details/51322852
https://blog.csdn.net/lisi1129/article/details/70209945
The three key SVM questions (margin, dual problem, KKT conditions):
https://blog.csdn.net/u014472643/article/details/79612204

Optimization problems:
Margins: https://blog.csdn.net/ljp812184246/article/details/49361473
Duality:
Understanding: https://blog.csdn.net/chengleisheng/article/details/40822313
Derivation: https://blog.csdn.net/HappyRocking/article/details/80772283
Model: https://blog.csdn.net/diligent_321/article/details/53396682
https://blog.csdn.net/dcrmg/article/details/53000150
KKT conditions: https://blog.csdn.net/xianlingmao/article/details/7919597

Lagrange multipliers (detailed derivation + intuition) and KKT:
https://blog.csdn.net/xiaoxiaoley/article/details/80761880

Soft-margin SVM: https://blog.csdn.net/robin_xu_shuai/article/details/77051258


SVM in scikit-learn: https://blog.csdn.net/gamer_gyt/article/details/51265347
scikit-learn parameter reference: https://blog.csdn.net/szlcw1/article/details/52336824

Using polynomial features with SVM: https://blog.csdn.net/weixin_39881922/article/details/80251055

Common SVM kernels: https://blog.csdn.net/batuwuhanpei/article/details/52354822
Choosing a kernel: https://blog.csdn.net/leonis_v/article/details/50688766
The Gaussian kernel in depth: https://blog.csdn.net/jorg_zhao/article/details/52687448
The gamma parameter in RBF kernels: https://blog.csdn.net/ITpfzl/article/details/82831301

Solving regression problems with SVM: https://blog.csdn.net/m0_37725003/article/details/81094448
SVM vs. SVR: https://blog.csdn.net/liulina603/article/details/8556009
SVR code implementation: https://blog.csdn.net/weixin_39881922/article/details/80256615
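A minimal RBF-kernel SVC sketch; the gamma and C values are illustrative assumptions, and features are scaled first because kernel SVMs are distance-based:

```python
from sklearn.datasets import load_iris
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# gamma controls how local the Gaussian kernel is; C controls the soft margin
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma=0.5, C=1.0))
clf.fit(X, y)
train_acc = clf.score(X, y)
```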


Fifth algorithm: decision trees
Recommended reading: https://blog.csdn.net/gumpeng/article/details/51397737

First, an overview: https://blog.csdn.net/HerosOfEarth/article/details/52347820
Entropy through a story: https://blog.csdn.net/qq_40875866/article/details/79438384
Understanding information entropy: https://blog.csdn.net/qq_39521554/article/details/79078917
Why entropy is defined as -Σp*log(p): https://blog.csdn.net/taoqick/article/details/72852255
Understanding conditional entropy: https://blog.csdn.net/xwd18280820053/article/details/70739368
Understanding relative entropy: https://blog.csdn.net/guo1988kui/article/details/78427409
Entropy roundup: https://blog.csdn.net/marsggbo/article/details/77533194
Pruning principles: https://blog.csdn.net/zhengzhenxian/article/details/79083643
Pruning in practice: https://blog.csdn.net/u014688145/article/details/53326910
The ID3 algorithm: https://blog.csdn.net/ACdreamers/article/details/44661149
The C4.5 algorithm: https://blog.csdn.net/zjsghww/article/details/51638126
Characteristics of decision trees: https://blog.csdn.net/xuxiatian/article/details/54340428
Summaries: https://blog.csdn.net/zhurui_idea/article/details/54646932
https://blog.csdn.net/App_12062011/article/details/52136117
A complete construction walkthrough: https://www.cnblogs.com/yonghao/p/5061873.html
Worked example with code: https://blog.csdn.net/zx10212029/article/details/49617179
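The entropy definition H = -Σ p*log2(p) behind ID3's information gain, sketched directly (the label lists are assumed toy examples):

```python
import numpy as np

def entropy(labels):
    # Class probabilities from label counts, then H = -sum(p * log2(p))
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

pure = entropy([1, 1, 1, 1])    # a single class: zero uncertainty
mixed = entropy([0, 0, 1, 1])   # a 50/50 split: maximal uncertainty, 1 bit
```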


Extension: ensemble learning
A first taste: https://www.cnblogs.com/qwj-sysu/p/5945370.html
Principles in detail: https://blog.csdn.net/qq_36330643/article/details/77621232
Code implementation: https://blog.csdn.net/shine19930820/article/details/75209021
Understanding the common methods: https://blog.csdn.net/Chenyukuai6625/article/details/73692347

Extension: random forests
Recommended reading: https://www.cnblogs.com/maybe2030/p/4585705.html
Plain-language take: https://blog.csdn.net/class_brick/article/details/78778786
Studying the algorithm: https://blog.csdn.net/qq547276542/article/details/78304454
Summary: https://blog.csdn.net/y0367/article/details/51501780
Classification/regression code: https://blog.csdn.net/jiede1/article/details/78245597
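A random-forest sketch mirroring the classification links above; the dataset and hyperparameters are illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# An ensemble of 100 depth-limited trees, each trained on a bootstrap sample
rf = RandomForestClassifier(n_estimators=100, max_depth=4, random_state=0)
cv_acc = cross_val_score(rf, X, y, cv=5).mean()
```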

Sixth algorithm: naive Bayes
Introduction: https://blog.csdn.net/carson2005/article/details/6854005
Understanding through a story: https://blog.csdn.net/fisherming/article/details/79502740
Understanding through pictures: https://blog.csdn.net/wg95272003/article/details/51554891
Going deeper: https://blog.csdn.net/fisherming/article/details/79509025
Further analysis: https://blog.csdn.net/weixin_40759186/article/details/79386447
Full derivation of the algorithm: https://blog.csdn.net/anneqiqi/article/details/59666980
Summary: https://blog.csdn.net/yanghonker/article/details/51505068
Spam classification: https://blog.csdn.net/u013634684/article/details/49669081
Spell checker: https://blog.csdn.net/wenyichuan/article/details/78572007
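A naive-Bayes spam-classification sketch in the spirit of the spam link above; the tiny corpus and labels are assumed examples:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

texts = ["win money now", "free prize win", "meeting at noon", "lunch at noon"]
labels = [1, 1, 0, 0]                 # 1 = spam, 0 = ham

vec = CountVectorizer()               # bag-of-words counts
X = vec.fit_transform(texts)
clf = MultinomialNB().fit(X, labels)  # multinomial NB with Laplace smoothing
pred = clf.predict(vec.transform(["free money"]))[0]
```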

Association analysis: https://www.sohu.com/a/167122772_197042
FP-growth for association rules: https://www.jianshu.com/p/68caf209376b
FP-growth in detail: https://blog.csdn.net/huagong_adu/article/details/17739247
The Apriori algorithm in detail: https://blog.csdn.net/lizhengnanhua/article/details/9061755
The difference between FP-growth and Apriori: https://zhidao.baidu.com/question/690616149640014084.html
Understanding itemsets: http://www.cnblogs.com/beaver-sea/p/4740774.html
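The support-counting core behind Apriori, sketched in plain Python on an assumed toy transaction set (candidate generation and pruning are omitted for brevity):

```python
from itertools import combinations

transactions = [
    {"milk", "bread"},
    {"milk", "eggs"},
    {"milk", "bread", "eggs"},
    {"bread"},
]
min_support = 2   # minimum number of transactions an itemset must appear in

# Count every 1- and 2-itemset across all transactions
counts = {}
for t in transactions:
    for r in (1, 2):
        for itemset in combinations(sorted(t), r):
            counts[itemset] = counts.get(itemset, 0) + 1

# Keep only the frequent itemsets
frequent = {k: v for k, v in counts.items() if v >= min_support}
```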


Ensemble learning:
The bootstrap method in detail: https://blog.csdn.net/baimafujinji/article/details/50554664
Plain-language take: https://blog.csdn.net/wangqi880/article/details/49765673
Bagging vs. boosting: https://www.cnblogs.com/earendil/p/8872001.html
AdaBoost principles with a worked example: https://blog.csdn.net/guyuealian/article/details/70995333
Derivation of the principles: https://blog.csdn.net/v_july_v/article/details/40718799
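The bootstrap resampling behind bagging, sketched directly: sampling n points with replacement leaves roughly 1 - 1/e (about 63.2%) distinct originals in each resample, which is where the out-of-bag points come from:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.arange(1000)

# A bootstrap resample: same size as the data, drawn with replacement
sample = rng.choice(data, size=1000, replace=True)
unique_frac = len(np.unique(sample)) / 1000   # expected ~0.632
```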

 


The Fourier transform (made easy): https://blog.csdn.net/guyuealian/article/details/72817527

 

posted @ 2020-02-29 19:31 by smile七友