Summary: 1 Recall 1.1 ItemCF Recall: cosine similarity. 1.2 Swing: another ItemCF variant that reweights different users. When two items cannot calculate… Read more
posted @ 2026-05-06 06:48 ylxn Views (5) Comments (0) Recommended (0)
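The cosine-similarity step of ItemCF can be sketched as follows; the toy interaction matrix `R` and the helper name `item_cosine_similarity` are assumptions for illustration, not code or data from the post:

```python
import numpy as np

# Hypothetical toy user-item interaction matrix (rows: users, cols: items).
R = np.array([
    [1, 1, 0, 1],
    [0, 1, 1, 0],
    [1, 0, 1, 1],
], dtype=float)

def item_cosine_similarity(R):
    """ItemCF: cosine similarity between item interaction columns."""
    norms = np.linalg.norm(R, axis=0)          # per-item vector norms
    sim = (R.T @ R) / np.outer(norms, norms)   # cos(i, j) for all item pairs
    np.fill_diagonal(sim, 0.0)                 # ignore self-similarity
    return sim

sim = item_cosine_similarity(R)
# Items 0 and 3 are consumed by exactly the same users, so sim[0, 3] is 1.0.
```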
Summary: A book I bought in a supermarket in Canada. A way to kill time and learn English. Read more
posted @ 2026-05-05 10:49 ylxn Views (6) Comments (0) Recommended (0)
Summary: Contents: Introduction; The World Comes to the White Mountains; The Improbable Rise of Harry White; Maynard Keynes; The Monetary Menace; The Most Unsordid Ac… Read more
posted @ 2026-01-30 16:16 ylxn Views (11) Comments (0) Recommended (0)
Summary: 1 BGD (Batch Gradient Descent). BGD computes the gradient over the entire training dataset in each iteration, making one update per epoch: $$\theta_{j+1}=\theta_{j}-\alpha\nabla J(\theta_{j})$$ Read more
posted @ 2026-01-18 12:43 ylxn Views (5) Comments (0) Recommended (0)
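A minimal BGD sketch on an assumed toy least-squares problem; the data, learning rate `alpha`, and iteration count are illustrative choices, not taken from the post:

```python
import numpy as np

# Toy linear-regression data: y = 1 + 2x (bias column + one feature).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])

theta = np.zeros(2)
alpha = 0.1
for _ in range(2000):
    # BGD: gradient of the mean-squared error over the ENTIRE dataset,
    # then exactly one parameter update per pass (epoch).
    grad = X.T @ (X @ theta - y) / len(y)
    theta = theta - alpha * grad

# theta converges toward the true parameters [1, 2].
```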
Summary: 1 Decision Tree 1.1 Information content. This measures the surprise, or information gained, when an event x occurs: $$I(x)=-\log_{2}P(x)$$ If P(x) is high… Read more
posted @ 2026-01-16 18:09 ylxn Views (16) Comments (0) Recommended (0)
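The information-content formula above can be checked numerically; `information_content` is a hypothetical helper name, not the post's code:

```python
import math

def information_content(p):
    """I(x) = -log2 P(x): the rarer the event, the more information it carries."""
    return -math.log2(p)

information_content(1.0)   # a certain event carries 0 bits
information_content(0.25)  # a 1-in-4 event carries 2 bits
```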
Summary: All principles. General Principles: KISS → start simple, add complexity only when needed; DRY → reduce duplication, simplify maintenance; YAGNI → build fo… Read more
posted @ 2026-01-13 11:47 ylxn Views (15) Comments (0) Recommended (0)
Summary: 1 Up sampling: increase the sample ratio. 2 Down sampling: decrease the sample ratio. 3 Overfitting. Recognize: training accuracy >> validation accuracy; pe… Read more
posted @ 2025-12-17 15:08 ylxn Views (11) Comments (0) Recommended (0)
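Down sampling can be sketched as follows, assuming a hypothetical imbalanced label set (the `pos`/`neg` data is invented for illustration):

```python
import random

random.seed(0)
# Hypothetical imbalanced dataset: 10 positives vs 90 negatives.
data = [("pos", i) for i in range(10)] + [("neg", i) for i in range(90)]

pos = [d for d in data if d[0] == "pos"]
neg = [d for d in data if d[0] == "neg"]

# Down sampling: shrink the majority class to the minority class's size,
# lowering the majority's sample ratio so the classes are balanced.
neg_down = random.sample(neg, len(pos))
balanced = pos + neg_down
```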
Summary: 1. Introduction. Classify the images (dataset link) by building a CNN network (ResNet can also be used directly). CNN architecture: 1.1 Pooling. Pooling is the core operation in a CNN, specifically designed to reduce the spat… Read more
posted @ 2025-12-09 14:23 ylxn Views (16) Comments (0) Recommended (0)
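A minimal sketch of 2×2 max pooling, the spatial-reduction operation described above; `max_pool2x2` is a hypothetical helper, not the post's code:

```python
import numpy as np

def max_pool2x2(x):
    """2x2 max pooling with stride 2: halves each spatial dimension."""
    h, w = x.shape
    # Crop to even dimensions, group into 2x2 blocks, take each block's max.
    blocks = x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.max(axis=(1, 3))

x = np.arange(16, dtype=float).reshape(4, 4)
max_pool2x2(x)  # → [[5., 7.], [13., 15.]]
```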
Summary: 1. Lagrange multiplier method (equality constraints). Problem form: minimize an objective f(x) subject to h(x) = 0. Core idea: at the optimum x∗, the gradient ∇f of the objective must be parallel to the normal of the constraint surface; if it were not, we could "slide" along the constraint surface to decrease f further. Construct the Lagrangian: L(x,λ)=f(x)+λh(x), where… Read more
posted @ 2025-12-08 18:42 ylxn Views (60) Comments (0) Recommended (0)
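A worked instance of the method above, on an assumed toy problem (not from the post): minimize f(x,y)=x²+y² subject to h(x,y)=x+y−1=0.

$$L(x,y,\lambda)=x^{2}+y^{2}+\lambda(x+y-1)$$

$$\frac{\partial L}{\partial x}=2x+\lambda=0,\quad \frac{\partial L}{\partial y}=2y+\lambda=0,\quad x+y=1$$

The first two conditions give x=y, and the constraint then gives x=y=1/2 with λ=−1. At that point ∇f=(1,1) and ∇h=(1,1), so the objective's gradient is indeed parallel to the constraint surface's normal, as the core idea requires.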
Summary: 1 Introduction. Unlike Dijkstra, A* finds the shortest path from a source to a target, whereas Dijkstra finds shortest paths from the source to all nodes. The core of A* is the evaluation function f(n) = g(n) + h(n): g(n) is the actual cost from the start to node n; h(n) is the estimated cost from node n to the goal (heuristic… Read more
posted @ 2025-12-08 17:55 ylxn Views (24) Comments (0) Recommended (0)
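The evaluation function f(n) = g(n) + h(n) can be sketched on a toy grid; the grid, the Manhattan-distance heuristic, and the `a_star` helper are illustrative assumptions, not the post's code:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid (0 = free, 1 = wall). Returns path cost or None."""
    def h(p):  # Manhattan distance: admissible estimate of cost to the goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]        # entries ordered by f = g + h
    best_g = {start: 0}                       # best known actual cost g(n)
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g
        if g > best_g.get(node, float("inf")):
            continue                          # stale heap entry, skip
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
a_star(grid, (0, 0), (2, 0))  # cost 6: the path must detour around the wall
```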