Abstract: Contents — Outline; Auto-Encoder; building the encoder and decoder; training. Covers Auto-Encoders and Variational Auto-Encoders; the code begins with import os, import tensorflow as tf, import numpy as …
posted @ 2020-12-11 23:58 ABDM
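For the encoder/decoder entry above, a minimal TF2 sketch of an auto-encoder on flattened 784-dim inputs. The layer sizes (256/128) and the 20-dim code are my own assumptions, not taken from the truncated post:

```python
import tensorflow as tf
from tensorflow.keras import Sequential, layers

h_dim = 20  # assumed size of the hidden code

class AE(tf.keras.Model):
    def __init__(self):
        super(AE, self).__init__()
        # encoder: 784 -> 256 -> 128 -> h_dim
        self.encoder = Sequential([
            layers.Dense(256, activation=tf.nn.relu),
            layers.Dense(128, activation=tf.nn.relu),
            layers.Dense(h_dim),
        ])
        # decoder: h_dim -> 128 -> 256 -> 784
        self.decoder = Sequential([
            layers.Dense(128, activation=tf.nn.relu),
            layers.Dense(256, activation=tf.nn.relu),
            layers.Dense(784),
        ])

    def call(self, inputs, training=None):
        h = self.encoder(inputs)          # compress to the hidden code
        x_hat = self.decoder(h)           # reconstruct the input (logits)
        return x_hat

model = AE()
out = model(tf.random.normal([4, 784]))
print(out.shape)                          # (4, 784)
```
Training would then minimize a reconstruction loss (e.g. MSE or sigmoid cross entropy) between x and x_hat.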
Abstract: Contents — AE v.s. VAE; Generative model; VAE v.s. GAN.
posted @ 2020-12-11 23:57 ABDM
Abstract: Contents — Sample() is not differentiable; Reparameterization trick; Too Complex. What the encoder now produces is not a vector but a distribution, and sampling from a distribution cannot be optimized with gradient descent; hence the reparameteriza…
posted @ 2020-12-11 23:55 ABDM
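A minimal sketch of the reparameterization trick mentioned above (batch size 4 and latent dim 10 are assumed for illustration): instead of sampling z directly, sample eps from a fixed standard normal and write z as a deterministic function of the encoder outputs.

```python
import tensorflow as tf

mu = tf.zeros([4, 10])                  # assumed encoder output: mean
log_var = tf.zeros([4, 10])             # assumed encoder output: log variance

eps = tf.random.normal(tf.shape(mu))    # all randomness is isolated in eps ~ N(0, 1)
z = mu + tf.exp(log_var * 0.5) * eps    # z ~ N(mu, sigma^2); gradients flow through mu, log_var
print(z.shape)                          # (4, 10)
```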
Abstract: Contents — Another approach: q(z) -> p(z); intuitively comprehending KL(p|q); minimizing the KL divergence; how to compute the KL between q(z) and p(z); distribution of the hidden code…
posted @ 2020-12-11 23:53 ABDM
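For the "how to compute KL between q(z) and p(z)" point, the usual VAE setting (an assumption here, since the abstract is truncated) takes q(z|x) to be a diagonal Gaussian N(μ, σ²) and the prior p(z) to be N(0, 1), which gives the closed form:

```latex
\mathrm{KL}\big(\mathcal{N}(\mu,\sigma^{2})\,\|\,\mathcal{N}(0,1)\big)
  = \frac{1}{2}\sum_{i}\left(\mu_i^{2} + \sigma_i^{2} - \log\sigma_i^{2} - 1\right)
```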
Abstract: Contents — PCA v.s. Auto-Encoders; Denoising AutoEncoders; Dropout AutoEncoders. A deep autoencoder is built from a deep neural network, so its dimensionality reduction loses less information (left: PCA; right: auto-encoder). Den…
posted @ 2020-12-11 23:51 ABDM
Abstract: Contents — Auto-Encoders; How to Train?
posted @ 2020-12-11 23:49 ABDM
Abstract: Contents — Supervised Learning; Massive Unlabeled Data; Unsupervised Learning; Why needed.
posted @ 2020-12-11 23:48 ABDM
Abstract: Contents — Sentiment Analysis; Two approaches; Single layer; Multi-layers. The approaches cover SimpleRNNCell (single layer and multi-layers) and RNNCell. Singl…
posted @ 2020-12-11 23:45 ABDM
Abstract: Contents — Recap: input dim, hidden dim; SimpleRNNCell; Single-layer RNN Cell; Multi-Layers RNN; RNN Layer. Begins with from tensorflow.keras import layer…
posted @ 2020-12-11 23:44 ABDM
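A minimal sketch of the SimpleRNNCell usage the entry above introduces. The shapes (batch 4, 80 time steps, 100-dim embeddings, hidden dim 64) are illustrative assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal([4, 80, 100])    # [batch, time steps, embedding dim]
xt0 = x[:, 0, :]                      # input at the first time step, [4, 100]

cell = layers.SimpleRNNCell(64)       # hidden dim = 64
state = [tf.zeros([4, 64])]           # initial hidden state h0

out, state = cell(xt0, state)         # one step: out is [4, 64], state is [h1]
print(out.shape, state[0].shape)

# the layer form consumes the whole sequence at once and returns the last state
rnn = layers.SimpleRNN(64)
print(rnn(x).shape)                   # (4, 64)
```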
Abstract: Contents — Recap; Sentiment Analysis; Proposal; S1. Weight sharing; Naive version; Weight share; S2. Consistent memory; Unfolded model; Formulation; Overall Diagram; One more thing; H…
posted @ 2020-12-11 23:41 ABDM
Abstract: Contents — Res Block; ResNet18; Out of memory. The code (Resnet.py) begins with #!/usr/bin/env python, # -*- coding:utf-8 -*-, import tensorflow as tf, from tensorflow import keras, from …
posted @ 2020-12-11 23:39 ABDM
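A minimal sketch of the Res Block named above, written as a Keras layer; the exact hyperparameters in the original Resnet.py are not visible in the truncated abstract, so this is only an illustrative version:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class BasicBlock(layers.Layer):
    """Two 3x3 convolutions plus an identity (or 1x1-projected) shortcut."""
    def __init__(self, filter_num, stride=1):
        super(BasicBlock, self).__init__()
        self.conv1 = layers.Conv2D(filter_num, (3, 3), strides=stride, padding='same')
        self.bn1 = layers.BatchNormalization()
        self.conv2 = layers.Conv2D(filter_num, (3, 3), strides=1, padding='same')
        self.bn2 = layers.BatchNormalization()
        if stride != 1:
            # a 1x1 convolution so the shortcut matches the main path's spatial size
            self.downsample = keras.Sequential([layers.Conv2D(filter_num, (1, 1), strides=stride)])
        else:
            self.downsample = lambda x: x

    def call(self, inputs, training=None):
        out = tf.nn.relu(self.bn1(self.conv1(inputs), training=training))
        out = self.bn2(self.conv2(out), training=training)
        # residual connection: add the shortcut, then apply the final ReLU
        return tf.nn.relu(out + self.downsample(inputs))
```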
Abstract: Contents — Res Block; ResNet18; Out of memory. The code (Resnet.py) begins with #!/usr/bin/env python, # -*- coding:utf-8 -*-, import tensorflow as tf, from tensorflow import keras, from …
posted @ 2020-12-11 23:38 ABDM
Abstract: Contents — ResNet; BOOM; Why call it Residual?; history; Basic Block; Res Block; ResNet-18; DenseNet. ResNet's idea: once 20 layers are guaranteed to train well, add 8 more; then, once 28 layers train well, keep stacking 8 more at a time…
posted @ 2020-12-11 23:35 ABDM
Abstract: Contents — ImageNet; LeNet-5; LeNet-5 Demo; AlexNet; VGG; 1*1 Convolution; GoogLeNet; Stack more layers?
posted @ 2020-12-11 23:33 ABDM
Abstract: Contents — CIFAR100; 13 Layers; cafar100_train. The code begins with import tensorflow as tf, from tensorflow.keras import layers, optimizers, datasets, …
posted @ 2020-12-11 23:32 ABDM
Abstract: Contents — Outline; Reduce Dim; subsample; Max/Avg pooling; Strides; For instance; upsample; UpSampling2D; ReLU. The outline covers pooling, upsampling and ReLU…
posted @ 2020-12-11 23:29 ABDM
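A short illustrative sketch of the three operations listed above (the input shape is an assumption):

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal([1, 14, 14, 4])

pool = layers.MaxPool2D(pool_size=2, strides=2)   # subsample: halves height and width
print(pool(x).shape)                              # (1, 7, 7, 4)

up = layers.UpSampling2D(size=3)                  # upsample: repeats each pixel 3x3 times
print(up(x).shape)                                # (1, 42, 42, 4)

print(tf.nn.relu(tf.constant([-1., 0., 2.])))     # ReLU clips negatives to 0 -> [0. 0. 2.]
```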
Abstract: Contents — 2D Convolution; Kernel size; Padding & Stride; Channels; For instance; LeNet-5; Pyramid Architecture; layers.Conv2D; weight & bias; nn.conv2d; Gradient?; For instance. 2D …
posted @ 2020-12-11 23:27 ABDM
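A small sketch contrasting the layers.Conv2D and tf.nn.conv2d forms named above (kernel size 5 and 4 output channels are illustrative assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal([1, 32, 32, 3])               # NHWC input

# high-level layer: 4 output channels, 5x5 kernel, stride 1, zero padding
layer = layers.Conv2D(4, kernel_size=5, strides=1, padding='same')
out = layer(x)
print(out.shape)                                   # (1, 32, 32, 4)
print(layer.kernel.shape, layer.bias.shape)        # (5, 5, 3, 4) (4,)

# low-level op with an explicit weight tensor
w = tf.random.normal([5, 5, 3, 4])
out2 = tf.nn.conv2d(x, w, strides=1, padding='SAME')
print(out2.shape)                                  # (1, 32, 32, 4)
```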
Abstract: Contents — Feature maps; Why not Linear; 335k or 1.3M; em...; Receptive Field; Fully connected; Partially connected; Locally connected; Rethink the Linear layer; Fully VS Locally; Wei…
posted @ 2020-12-11 23:25 ABDM
Abstract: For details, see https://www.zhihu.com/question/32246256
posted @ 2020-12-11 23:23 ABDM
Abstract: Contents — CIFAR10; MyDenseLayer. The code begins with import os, import tensorflow as tf, from tensorflow.keras import datasets, layers, optimizers, Sequential, …
posted @ 2020-12-11 23:21 ABDM
Abstract: Contents — Outline; Save/load weights; Save/load entire model; saved_model. save/load weights # saves only part of the information (the parameters); save/load entire model # saves everything; saved_model # a generic format, including…
posted @ 2020-12-11 23:19 ABDM
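A minimal sketch of the three saving mechanisms listed above; the file names and the throwaway one-layer model are assumptions for illustration:

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([keras.layers.Dense(10)])
model(tf.zeros([1, 784]))                    # build the weights by calling the model once

# 1) weights only: lightest option; the network must be re-created in code before loading
model.save_weights('weights.ckpt')
model.load_weights('weights.ckpt')

# 2) entire model: architecture + weights (+ optimizer state when compiled)
model.save('model.h5')
restored = keras.models.load_model('model.h5')

# 3) SavedModel: a language-neutral export intended for serving/deployment
tf.saved_model.save(model, 'saved_model_dir')
loaded = tf.saved_model.load('saved_model_dir')
```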
Abstract: Contents — Outline; keras.Sequential; Layer/Model; MyDense; MyModel. Covers keras.Sequential, keras.layers.Layer, keras.Model, model.trainable_varia…
posted @ 2020-12-11 23:18 ABDM
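An illustrative sketch of the MyDense/MyModel pattern mentioned above (the 784-256-10 sizes are my assumption, not taken from the post):

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class MyDense(layers.Layer):
    """A minimal Dense replacement: y = x @ w + b."""
    def __init__(self, inp_dim, outp_dim):
        super(MyDense, self).__init__()
        self.kernel = self.add_weight('w', [inp_dim, outp_dim])
        self.bias = self.add_weight('b', [outp_dim])

    def call(self, inputs, training=None):
        return inputs @ self.kernel + self.bias

class MyModel(keras.Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.fc1 = MyDense(784, 256)
        self.fc2 = MyDense(256, 10)

    def call(self, inputs, training=None):
        x = tf.nn.relu(self.fc1(inputs))
        return self.fc2(x)

model = MyModel()
out = model(tf.random.normal([4, 784]))
print(out.shape)                          # (4, 10)
print(len(model.trainable_variables))     # 4 tensors: two kernels and two biases
```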
Abstract: Contents — Keras != tf.keras; Outline 1: Metrics (Step 1. Build a meter; Step 2. Update data; Step 3. Get average data; Clear buffer); Outline 2: Compile + Fit; Individual loss…
posted @ 2020-12-11 23:15 ABDM
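A minimal sketch of the three metric steps listed above (the sample values fed to the meters are arbitrary):

```python
import tensorflow as tf
from tensorflow.keras import metrics

# Step 1: build a meter
loss_meter = metrics.Mean()
acc_meter = metrics.Accuracy()

# Step 2: update data (typically once per batch)
loss_meter.update_state(0.3)
acc_meter.update_state([0, 1, 1], [0, 1, 0])

# Step 3: read the running average
print(loss_meter.result().numpy(), acc_meter.result().numpy())

# clear the buffer before the next epoch / evaluation round
loss_meter.reset_states()
acc_meter.reset_states()
```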
Abstract: Contents — Keras != tf.keras; Outline 1: Metrics (Step 1. Build a meter; Step 2. Update data; Step 3. Get average data; Clear buffer); Outline 2: Compile + Fit; Individual loss…
posted @ 2020-12-11 23:14 ABDM
Abstract: FashionMNIST. The code begins with import tensorflow as tf, from tensorflow import keras, from tensorflow.keras import datasets, layers, optimizers, Sequential, metrics, def p…
posted @ 2020-12-11 23:11 ABDM
Abstract: Contents — Himmelblau function; Minima; Plot; Gradient Descent. Himmelblau function: f(x, y) = (x^2 + y − 11)^2 + (x + y^2 − 7)^2. Minima: f(3.0, 2.0) = 0.0, f…
posted @ 2020-12-11 23:10 ABDM
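A small sketch of gradient descent on the Himmelblau function above with tf.GradientTape; the starting point and learning rate are assumptions, and which of the four minima is reached depends on where you start:

```python
import tensorflow as tf

def himmelblau(x):
    # f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2
    return (x[0] ** 2 + x[1] - 11) ** 2 + (x[0] + x[1] ** 2 - 7) ** 2

x = tf.Variable([4., 0.])                  # assumed starting point
for step in range(200):
    with tf.GradientTape() as tape:
        y = himmelblau(x)
    grads = tape.gradient(y, [x])[0]
    x.assign_sub(0.01 * grads)             # plain gradient descent, lr = 0.01

print(x.numpy(), himmelblau(x).numpy())    # converges to one of the minima, e.g. (3, 2)
```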
Abstract: Contents — Chain rule; Multi-output Perceptron; Multi-Layer Perceptron. For a neural network with multiple hidden layers, the nodes of a hidden layer can be treated as output-layer nodes…
posted @ 2020-12-11 23:09 ABDM
Abstract: Contents — Derivative Rules; Chain rule. The code begins with import tensorflow as tf; x = tf.constant(1.); w1 = tf.constant(2.); b1 = tf.constant(1.); w2 = …
posted @ 2020-12-11 23:08 ABDM
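A completed sketch in the spirit of the truncated snippet above, verifying the chain rule numerically with a persistent GradientTape; the values for w2 and b2 are my own assumption since the abstract cuts off:

```python
import tensorflow as tf

x = tf.constant(1.)
w1, b1 = tf.constant(2.), tf.constant(1.)
w2, b2 = tf.constant(2.), tf.constant(1.)   # assumed continuation of the snippet

with tf.GradientTape(persistent=True) as tape:
    tape.watch([w1, b1, w2, b2])
    y1 = x * w1 + b1            # first (hidden) layer
    y2 = y1 * w2 + b2           # second (output) layer

dy2_dy1 = tape.gradient(y2, [y1])[0]
dy1_dw1 = tape.gradient(y1, [w1])[0]
dy2_dw1 = tape.gradient(y2, [w1])[0]

# chain rule: dy2/dw1 == dy2/dy1 * dy1/dw1
print(dy2_dy1 * dy1_dw1, dy2_dw1)   # both evaluate to 2.0
```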
Abstract: Contents — Multi-output Perceptron. E = 1/2 Σ_i (O_i^1 − t_i)^2. For a multi-output perceptron, each output node depends only on the x, w and σ connected to that node. The code begins with import tensorflow as tf; x = tf.random. …
posted @ 2020-12-11 23:06 ABDM
Abstract: Contents — recap; Perceptron; Derivative. Recap: y = XW + b, i.e. y = Σ_i x_i * w_i + b. Perceptron notation: x_i^0, where i denotes the i-th node; w_ij^0, where i is the i-th node of the current layer and j the j-th node of the next hidden layer; σ denotes the node after the activation function…
posted @ 2020-12-11 23:03 ABDM
Abstract: Contents — Typical Loss; MSE; Derivative; MSE Gradient; Softmax; Derivative. Typical losses: Mean Squared Error; Cross Entropy Loss (binary, multi-class, + softmax). MSE: loss = …
posted @ 2020-12-11 23:02 ABDM
Abstract: Contents — Activation Functions and their derivatives; Sigmoid/Logistic; tf.sigmoid; Tanh; tf.tanh; Rectified Linear Unit; tf.nn.relu.
posted @ 2020-12-11 23:00 ABDM
Abstract: Contents — Outline; What's Gradient; What does it mean?; How to search; For instance; AutoGrad; 2nd-order.
posted @ 2020-12-11 22:58 ABDM
Abstract: Contents — Outline; MSE; Entropy; Cross Entropy; Binary Classification; Single output; Classification; Why not MSE?; logits --> CrossEntropy. The outline covers MSE, Cross Entropy Loss, Hinge…
posted @ 2020-12-11 22:56 ABDM
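A short sketch of the "logits --> CrossEntropy" point above: pass raw logits and let the loss apply softmax internally, which is numerically more stable than calling softmax yourself. The shapes and labels are illustrative assumptions:

```python
import tensorflow as tf

x = tf.random.normal([1, 784])
w = tf.random.normal([784, 2])
b = tf.zeros([2])

logits = x @ w + b                        # do not apply softmax here
y = tf.constant([1])

# recommended: raw logits + from_logits=True
loss = tf.losses.categorical_crossentropy(tf.one_hot(y, depth=2), logits, from_logits=True)
print(loss)

# discouraged: softmax first, then cross entropy on the probabilities
prob = tf.nn.softmax(logits, axis=1)
loss2 = tf.losses.categorical_crossentropy(tf.one_hot(y, depth=2), prob)
```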
Abstract: Contents — Outline; y ∈ R^d; y_i ∈ [0, 1]; y_i ∈ [0, 1] with Σ_i y_i = 1; y_i ∈ [−1, 1]. For y ∈ R^d in multi-class classification the outputs are usually probabilities: y_i ∈ [0, 1], i = 0, 1, ⋯, d−1…
posted @ 2020-12-11 22:55 ABDM
Abstract: Contents — Outline; Recap; Neural Network; Here comes Deep Learning; Heroes; Fully connected layer; Multi-Layers. The outline covers Matmul, Neural Network, Deep Learning, Multi-Layer. Rec…
posted @ 2020-12-11 22:53 ABDM
Abstract: The code begins with import tensorflow as tf, from tensorflow import keras, from tensorflow.keras import datasets, import os, # do not print irrelevant information, # os.env…
posted @ 2020-12-11 22:52 ABDM
Abstract: Contents — Outline; keras.datasets; MNIST; CIFAR10/100; tf.data.Dataset; .shuffle; .map; .batch; .repeat(); For example. Covers keras.datasets and tf.data.Dataset.from_ten…
posted @ 2020-12-11 22:51 ABDM
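A minimal sketch of the Dataset pipeline named above, applied to MNIST; the buffer, batch and repeat sizes are illustrative assumptions:

```python
import tensorflow as tf
from tensorflow.keras import datasets

(x, y), _ = datasets.mnist.load_data()

def preprocess(x, y):
    x = tf.cast(x, tf.float32) / 255.     # scale pixel values to [0, 1]
    y = tf.cast(y, tf.int32)
    return x, y

db = tf.data.Dataset.from_tensor_slices((x, y))
db = db.shuffle(10000).map(preprocess).batch(128).repeat(2)

for step, (x_batch, y_batch) in enumerate(db):
    print(step, x_batch.shape, y_batch.shape)   # (128, 28, 28) (128,)
    break
```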
Abstract: Contents — Outline; Where; where(tensor); where(cond, A, B); scatter_nd (1-D, 2-D); meshgrid; Points; numpy implementation; tensorflow2 implementation. Where: where(te…
posted @ 2020-12-11 22:49 ABDM
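A short sketch of the three ops listed above; the tensors and grid range are arbitrary examples:

```python
import tensorflow as tf

a = tf.random.normal([3, 3])
mask = a > 0
print(tf.boolean_mask(a, mask))                     # the positive entries
idx = tf.where(mask)                                # where(tensor): coordinates of True entries
print(tf.gather_nd(a, idx))                         # same positive entries, recovered via indices
print(tf.where(mask, a, tf.zeros_like(a)))          # where(cond, A, B)

# scatter_nd: write updates into a zero tensor at the given indices
indices = tf.constant([[4], [3], [1]])
updates = tf.constant([9, 10, 11])
print(tf.scatter_nd(indices, updates, shape=[8]))   # [0 11 0 10 9 0 0 0]

# meshgrid: build a grid of (x, y) points
x = tf.linspace(-2., 2., 5)
y = tf.linspace(-2., 2., 5)
px, py = tf.meshgrid(x, y)
print(px.shape, py.shape)                           # (5, 5) (5, 5)
```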
Abstract: Contents — Outline; clip_by_value; relu; clip_by_norm; gradient clipping. clip_by_value: begins with import tensorflow…
posted @ 2020-12-11 22:47 ABDM
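An illustrative sketch of the clipping ops above; the values and norm thresholds are arbitrary:

```python
import tensorflow as tf

a = tf.constant([-3., -1., 0., 2., 5.])
print(tf.clip_by_value(a, 0., 3.))        # clip into [0, 3] -> [0. 0. 0. 2. 3.]
print(tf.nn.relu(a))                      # relu is clipping from below at 0
print(tf.maximum(a, 0.))                  # same effect

g = tf.random.normal([3, 3]) * 10
print(tf.norm(g).numpy())
g_clipped = tf.clip_by_norm(g, 5.)        # rescale so the L2 norm is at most 5 (direction kept)
print(tf.norm(g_clipped).numpy())

# gradient clipping over a whole list of gradients, preserving their relative scale
grads = [tf.random.normal([4, 4]), tf.random.normal([4])]
clipped, global_norm = tf.clip_by_global_norm(grads, 15.)
```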
Abstract: Contents — Outline; pad; Image padding; tile; tile VS broadcast_to. pad examples: a shape [3] tensor with paddings [[1,2]] becomes [6]; a shape [2,2] tensor with paddings [[0,1],[1,1]] (# [rows, columns]) becomes [3,4].
posted @ 2020-12-11 22:46 ABDM
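A short sketch of the three ops above; the example tensors are arbitrary:

```python
import tensorflow as tf

a = tf.reshape(tf.range(9), [3, 3])
# paddings is [[rows before, rows after], [cols before, cols after]]
print(tf.pad(a, [[1, 0], [0, 1]]).shape)             # (4, 4)

b = tf.constant([[1, 2], [3, 4]])
print(tf.tile(b, [1, 2]))                            # real copies: columns repeated twice
print(tf.broadcast_to(tf.constant([1, 2]), [3, 2]))  # virtual expansion, no data copied
```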
Abstract: Contents — Outline; Sort/argsort (1-D, 2-D); Top_k; Top_one; Top-k accuracy; example. Covers Sort/argsort, Topk, Top-5 Acc. Sort/argsort in 1-D: import tensorflow as tf; a = tf.rand…
posted @ 2020-12-11 22:44 ABDM
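A sketch of sort/argsort and top_k, plus one way to compute top-k accuracy (the target labels are arbitrary assumptions):

```python
import tensorflow as tf

a = tf.random.shuffle(tf.range(5))
print(tf.sort(a, direction='DESCENDING'))
print(tf.argsort(a, direction='DESCENDING'))   # indices that would sort a

logits = tf.random.normal([4, 10])
res = tf.math.top_k(logits, k=3)
print(res.values.shape, res.indices.shape)     # (4, 3) (4, 3)

# top-k accuracy: a prediction counts as correct if the true class
# appears anywhere among the k highest-scoring classes
target = tf.constant([2, 7, 1, 5])
pred = tf.transpose(res.indices)               # [k, batch]
correct = tf.reduce_any(tf.equal(pred, target), axis=0)
print(tf.reduce_mean(tf.cast(correct, tf.float32)))
```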
Abstract: Contents — Outline; Vector norm; Euclidean norm; L1 norm; reduce_min/max/mean; argmax/argmin; tf.equal; Accuracy; tf.unique. Covers tf.norm, tf.reduce_min/max/mean, tf.argm…
posted @ 2020-12-11 22:42 ABDM
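A compact sketch of the statistics ops listed above; the labels used for the accuracy line are arbitrary:

```python
import tensorflow as tf

a = tf.ones([2, 2])
print(tf.norm(a))                        # Euclidean (L2) norm: sqrt(4) = 2
print(tf.norm(a, ord=1))                 # L1 norm: 4
print(tf.norm(a, ord=2, axis=1))         # row-wise L2 norms

b = tf.random.normal([4, 10])
print(tf.reduce_min(b), tf.reduce_max(b), tf.reduce_mean(b))
print(tf.argmax(b, axis=1).shape)        # (4,): index of the max per row

# accuracy from equal + cast
pred = tf.argmax(b, axis=1, output_type=tf.int32)
y = tf.constant([0, 1, 2, 3])
acc = tf.reduce_mean(tf.cast(tf.equal(pred, y), tf.float32))

print(tf.unique(tf.constant([4, 2, 2, 4, 3])))   # unique values + index mapping
```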
Abstract: Contents — Merge and split; concat; Along distinct dim/axis; stack: create new dim; Dim mismatch; Unstack; Split. Covers tf.concat, tf.split, tf.stack, tf.unstack. con…
posted @ 2020-12-11 22:40 ABDM
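A minimal sketch of the merge/split ops above; the [classes, students, subjects]-style shapes are illustrative:

```python
import tensorflow as tf

a = tf.ones([4, 35, 8])
b = tf.ones([2, 35, 8])
print(tf.concat([a, b], axis=0).shape)     # (6, 35, 8): joins along an existing axis

c = tf.ones([4, 35, 8])
s = tf.stack([a, c], axis=0)               # creates a new axis: (2, 4, 35, 8)
print(s.shape)

x, y = tf.unstack(s, axis=0)               # splits into length-1 pieces along that axis
print(x.shape, y.shape)                    # (4, 35, 8) (4, 35, 8)

parts = tf.split(s, num_or_size_splits=[1, 1], axis=0)   # keeps the split axis
print(parts[0].shape)                      # (1, 4, 35, 8)
```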
Abstract: Contents — Handwritten digit recognition workflow; forward propagation (tensors) in practice. The MNIST handwritten digit set has 7000*10 images: 60k for training and 10k for testing. Each image is 28*28 (a color image would be 28*28*3); pixel values 0-255 are grayscale, where 0 is pure white and 255 is pure black. Flattening the 28*28 matrix gives 28*28 = 784…
posted @ 2020-12-11 22:38 ABDM
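A minimal forward-pass sketch matching the description above (flatten 28*28 to 784, then dense layers); the 784-256-128-10 sizes and the batch of 128 are assumptions for illustration:

```python
import tensorflow as tf
from tensorflow.keras import datasets

(x, y), _ = datasets.mnist.load_data()
x = tf.convert_to_tensor(x, dtype=tf.float32) / 255.    # [60k, 28, 28], scaled to [0, 1]
x = tf.reshape(x, [-1, 28 * 28])                        # flatten to [60k, 784]

# parameters for 784 -> 256 -> 128 -> 10
w1 = tf.Variable(tf.random.truncated_normal([784, 256], stddev=0.1))
b1 = tf.Variable(tf.zeros([256]))
w2 = tf.Variable(tf.random.truncated_normal([256, 128], stddev=0.1))
b2 = tf.Variable(tf.zeros([128]))
w3 = tf.Variable(tf.random.truncated_normal([128, 10], stddev=0.1))
b3 = tf.Variable(tf.zeros([10]))

h1 = tf.nn.relu(x[:128] @ w1 + b1)    # one batch of 128 images
h2 = tf.nn.relu(h1 @ w2 + b2)
out = h2 @ w3 + b3
print(out.shape)                      # (128, 10): one logit per digit class
```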
Abstract: Contents — Outline; Operation type; + - * / % //; tf.math.log, tf.exp; pow, sqrt; @, matmul; With broadcasting; Y = X@W + b. Covers + - * /, **, pow, square, sqrt, //, %, exp, log, @, …
posted @ 2020-12-11 22:37 ABDM
Abstract: Contents — Broadcasting; Key idea; How to understand?; Why broadcasting?; Broadcastable?; Broadcast VS Tile. Broadcasting expands data without copying it. tf.broa…
posted @ 2020-12-11 22:35 ABDM
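A short sketch of the broadcast-vs-tile contrast above; the NHWC shape and the per-channel bias are illustrative:

```python
import tensorflow as tf

x = tf.random.normal([4, 32, 32, 3])
bias = tf.random.normal([3])

# implicit broadcasting: [3] is treated as [1, 1, 1, 3] and expanded without copying
print((x + bias).shape)                            # (4, 32, 32, 3)

# the same expansion, made explicit (still no copy)
print(tf.broadcast_to(bias, [4, 32, 32, 3]).shape)

# tile materializes the copies, so it really consumes the extra memory
b2 = tf.tile(tf.reshape(bias, [1, 1, 1, 3]), [4, 32, 32, 1])
print(b2.shape)
```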
Abstract: Contents — TensorFlow2 dimension transformations; Outline; image views; First Reshape (reshaping the view); Second Reshape (restoring the view); Transpose; Expand_dims (adding a dimension); Squeeze (removing a dimension). Starts from shape…
posted @ 2020-12-11 22:33 ABDM
Abstract: Contents — Indexing and slicing; numpy-style [ ] indexing; numpy-style : indexing; slicing (1-D and multi-dimensional); step size ::step; reverse order ::-1; ellipsis ...; Selective Indexing: gather, gather_nd, boolean_mask. Begins with import…
posted @ 2020-12-11 22:32 ABDM
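A compact sketch of the indexing forms above; the [4, 35, 8] shape (e.g. classes, students, subjects) is an illustrative assumption:

```python
import tensorflow as tf

a = tf.random.normal([4, 35, 8])

print(a[0].shape, a[0, 1].shape, a[0, 1, 2].shape)    # basic [ ] indexing
print(a[:, :5, :].shape)                              # slicing: first 5 along axis 1
print(a[:, ::2, :].shape)                             # step 2 along axis 1
print(a[..., 0].shape)                                # ellipsis covers the leading axes

print(tf.gather(a, [2, 1, 0], axis=0).shape)          # pick entries of one axis in a given order
print(tf.gather_nd(a, [[0, 1], [1, 2]]).shape)        # (2, 8): full index tuples
print(tf.boolean_mask(a, [True, False, True, False], axis=0).shape)  # (2, 35, 8)
```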
Abstract: Contents — Creating Tensors; from numpy, list; zeros, ones, fill; random (after shuffling idx, the pairing between a and b stays aligned); constant; loss computation; loss without bias. Creating Tensors: from numpy, list; zeros, ones, f…
posted @ 2020-12-11 22:31 ABDM
Abstract: Contents — Tensor data types; attributes; checking the data type; data type conversion; tensor to numpy. Tensor data types: a Python list like [1, 1.2, 'hello'] uses far too much memory for storing images; np.array stores a static array, but numpy predates deep learning and so is not well suited to it; tf.Tensor was created to make up for nump…
posted @ 2020-12-11 22:29 ABDM
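A short sketch of the attributes, type checks and conversions listed above; the array contents are arbitrary:

```python
import numpy as np
import tensorflow as tf

a = np.arange(5)
aa = tf.convert_to_tensor(a, dtype=tf.int32)   # numpy -> Tensor

print(aa.dtype, aa.shape, aa.device)           # type, shape and device placement attributes
print(tf.is_tensor(aa), isinstance(aa, tf.Tensor))

b = tf.cast(aa, dtype=tf.float32)              # data type conversion
print(tf.cast(b, dtype=tf.bool))               # 0 -> False, non-zero -> True

print(b.numpy(), int(aa[0]))                   # back to numpy / Python scalars
```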