Abstract: pytorch transpose
>>> x = torch.randn(2, 3)
>>> x
tensor([[ 1.0028, -0.9893,  0.5809],
        [-0.1669,  0.7299,  0.4942]])
>>> torch.transpose(x, 0, 1)
tensor([[ 1 Read more
posted @ 2023-10-08 07:52 emanlee Views(64) Comments(0) Recommend(0)
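The REPL snippet above swaps dimensions 0 and 1 of a 2×3 tensor. A minimal pure-Python sketch of that semantics (plain nested lists, no torch dependency assumed — element (i, j) of the input becomes element (j, i) of the output):

```python
def transpose_2d(x):
    """Swap the two dimensions of a 2-D list, like torch.transpose(x, 0, 1)."""
    rows, cols = len(x), len(x[0])
    return [[x[i][j] for i in range(rows)] for j in range(cols)]

x = [[1.0028, -0.9893, 0.5809],
     [-0.1669, 0.7299, 0.4942]]
y = transpose_2d(x)
print(y)  # 3 rows of 2 elements each
```

Note that torch.transpose returns a view sharing storage with the original tensor; this list-based sketch copies instead.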
Abstract: Full text: https://ieeexplore.ieee.org/document/7526959 Soft Exponential Activation Function A Soft Exponential Activation Function is a parametric neuron act Read more
posted @ 2023-10-08 07:50 emanlee Views(33) Comments(0) Recommend(0)
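The soft exponential is a single-parameter activation that interpolates between logarithmic, identity, and exponential behaviour as its parameter α moves from negative through zero to positive. A sketch of the piecewise definition as given in the linked paper (scalar form; the domain restriction for α < 0 is noted but not enforced):

```python
import math

def soft_exponential(alpha, x):
    """Soft exponential activation: logarithmic for alpha < 0,
    identity for alpha == 0, exponential for alpha > 0."""
    if alpha < 0:
        # requires 1 - alpha * (x + alpha) > 0
        return -math.log(1 - alpha * (x + alpha)) / alpha
    if alpha == 0:
        return x
    return (math.exp(alpha * x) - 1) / alpha + alpha

print(soft_exponential(0.0, 2.5))  # identity case: 2.5
print(soft_exponential(1.0, 1.0))  # exponential case: e
```

In practice α is a trainable per-neuron parameter, so the network can learn where on the log–linear–exp continuum each unit should sit.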
Abstract: Temporal Convolutional Networks (TCN) https://blog.csdn.net/hotpants/article/details/129624190 https://baijiahao.baidu.com/s?id=1677236455062512984&wfr=spider&for=pc https://unit8.c Read more
posted @ 2023-10-07 16:28 emanlee Views(36) Comments(0) Recommend(0)
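The building block of a TCN is the dilated causal 1-D convolution: the output at time t depends only on inputs at times t, t−d, t−2d, …, never on the future. A minimal plain-Python sketch (illustrative weights; real TCNs stack such layers with growing dilation, residual connections, and learned kernels):

```python
def causal_conv1d(x, w, dilation=1):
    """Dilated causal 1-D convolution: y[t] uses only x[t], x[t-d], x[t-2d], ...
    Taps that fall before the start of the sequence are treated as zero."""
    y = []
    for t in range(len(x)):
        acc = 0.0
        for k, wk in enumerate(w):
            idx = t - k * dilation
            if idx >= 0:
                acc += wk * x[idx]
        y.append(acc)
    return y

x = [1.0, 2.0, 3.0, 4.0]
print(causal_conv1d(x, [0.5, 0.5]))               # causal moving average
print(causal_conv1d(x, [1.0, -1.0], dilation=2))  # dilated difference
```

Doubling the dilation at each layer lets the receptive field grow exponentially with depth, which is how TCNs cover long histories with few layers.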
Abstract: pytorch torch.nn.BatchNorm1d — nn.BatchNorm1d is not a function that directly maps an input matrix to its normalized output; it constructs a module, and that module is then applied to perform the normalization. An example follows. import torch import numpy as np from torch import nn Read more
posted @ 2023-10-07 16:27 emanlee Views(1901) Comments(0) Recommend(0)
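The core computation that the BatchNorm1d module performs is normalizing each feature column over the batch dimension with (x − mean) / sqrt(var + eps). A torch-free sketch of that step (biased variance as used at training time; the learnable scale γ = 1 and shift β = 0 are assumed, and running statistics are omitted):

```python
import math

def batchnorm1d(batch, eps=1e-5):
    """Normalize each feature column of a batch (list of rows),
    mirroring nn.BatchNorm1d in training mode with gamma=1, beta=0."""
    n = len(batch)
    n_features = len(batch[0])
    out = [row[:] for row in batch]
    for j in range(n_features):
        col = [row[j] for row in batch]
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / n  # biased variance
        for i in range(n):
            out[i][j] = (batch[i][j] - mean) / math.sqrt(var + eps)
    return out

out = batchnorm1d([[1.0, 10.0], [3.0, 30.0]])
print(out)  # each column now has mean ~0 and unit scale
```

This also makes clear why the module form matters in the real API: nn.BatchNorm1d additionally carries the learnable γ/β and the running mean/variance used at eval time, which a plain function could not retain between calls.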
Abstract: A comprehensive overview of deep learning algorithms for time-series forecasting https://blog.csdn.net/qq_34160248/article/details/131349551 https://it.sohu.com/a/690057464_121124360 https://zhuanlan.zhihu.com/p/393 Read more
posted @ 2023-10-07 16:22 emanlee Views(35) Comments(0) Recommend(0)
Abstract: During neural network training, why is the loss value unstable and the test-set accuracy fluctuating? https://www.zhihu.com/question/600770126/answer/3027268624 An unstable loss during neural network training is usually due to the following causes: 1. Noise and uncertainty in the dataset introduce randomness into training, which in turn Read more
posted @ 2023-10-07 13:45 emanlee Views(2364) Comments(0) Recommend(0)
Abstract: Derivative Calculator https://www.derivative-calculator.net/ a*e^x/(1+abs(x)) Read more
posted @ 2023-10-07 11:49 emanlee Views(629) Comments(0) Recommend(0)
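For the expression entered into the calculator, f(x) = a·e^x/(1 + |x|), the quotient rule gives f′(x) = a·e^x·x/(1 + x)² on the branch x > 0. A quick sketch verifying that against a central finite difference (a = 1 chosen for illustration):

```python
import math

def f(x, a=1.0):
    return a * math.exp(x) / (1 + abs(x))

def df_analytic(x, a=1.0):
    """Derivative for x > 0, from the quotient rule on a*e^x/(1+x)."""
    return a * math.exp(x) * x / (1 + x) ** 2

x, h = 1.0, 1e-5
central = (f(x + h) - f(x - h)) / (2 * h)  # O(h^2) approximation
print(central, df_analytic(x))  # both approximately e/4
```

At x = 0 the |x| term makes f non-differentiable in the classical sense (the one-sided derivatives differ), which is worth keeping in mind when such a function is used as an activation.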
Abstract: Read more
posted @ 2023-10-07 11:44 emanlee Views(13) Comments(0) Recommend(0)
Abstract: https://blog.csdn.net/qq_45549605/article/details/126761439 https://m.thepaper.cn/baijiahao_8690116 https://www.zhihu.com/question/289666926/answer/29 Read more
posted @ 2023-10-07 11:30 emanlee Views(76) Comments(0) Recommend(0)
Abstract: python TCP Server https://blog.csdn.net/weixin_45707610/article/details/131511896 For the code below, turn off the firewall first, then start the server. from socketserver import BaseRequestHandler, TCP Read more
posted @ 2023-10-07 11:29 emanlee Views(191) Comments(0) Recommend(0)
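The truncated import suggests socketserver's BaseRequestHandler and TCPServer. A minimal echo-server sketch along those lines, with a client round-trip to exercise it (the handler behaviour is illustrative; port 0 picks an ephemeral port, and as the note above says, a local firewall may otherwise block the connection):

```python
import socket
import threading
from socketserver import BaseRequestHandler, TCPServer

class EchoHandler(BaseRequestHandler):
    """Echo each received chunk back to the client until it disconnects."""
    def handle(self):
        while True:
            data = self.request.recv(4096)
            if not data:
                break
            self.request.sendall(data)

# Bind to 127.0.0.1 with an ephemeral port so the sketch runs anywhere,
# and serve in a daemon thread so the main thread can act as the client.
server = TCPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with socket.create_connection(server.server_address) as conn:
    conn.sendall(b"hello")
    reply = conn.recv(4096)
print(reply)  # b'hello'

server.shutdown()
server.server_close()
```

For a long-running server one would typically use ThreadingTCPServer (or set a handler per connection as here) so that one slow client does not block the rest.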