Building a simple neural network with TensorFlow

Following Mofan's (莫凡) TensorFlow tutorial: https://www.bilibili.com/video/av16001891/?p=16
We build a 3-layer network: 1 input node, 10 hidden nodes, and 1 output node.
The input data has shape [n, 1] (n samples, 1 feature), the hidden layer's output has shape [n, 10], and the final output has shape [n, 1].
So the shapes of the weight and bias matrices are:
weight1 = [1, 10]
bias1 = [1, 10]
weight2 = [10, 1]
bias2 = [1, 1]
(A quick shape check is sketched right below.)
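As a sanity check on those shapes, here is a small NumPy sketch of my own (not part of the tutorial code), just multiplying zero matrices of the stated sizes:

import numpy as np

n = 5                                     # pretend we have 5 samples
x = np.zeros((n, 1))                      # input:  [n, 1]
W1, b1 = np.zeros((1, 10)), np.zeros((1, 10))
W2, b2 = np.zeros((10, 1)), np.zeros((1, 1))

h = x @ W1 + b1                           # hidden layer output: [n, 10]
y = h @ W2 + b2                           # network output:      [n, 1]
print(h.shape, y.shape)                   # (5, 10) (5, 1)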
Straight to the code:
import tensorflow as tf
import numpy as np

def add_layer(inputs, in_size, out_size, activation_function=None):
    # One fully connected layer: inputs * W + b, optionally passed through an activation
    Weights = tf.Variable(tf.random_normal([in_size, out_size]))
    biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)   # small positive bias initialization
    plus = tf.matmul(inputs, Weights) + biases
    if activation_function is None:
        outputs = plus
    else:
        outputs = activation_function(plus)
    return outputs

# Build the dataset
x_data = np.linspace(-1, 1, 300)[:, np.newaxis]     # 300 samples, 1 column
noise = np.random.normal(0, 0.05, x_data.shape)     # Gaussian noise
y_data = np.square(x_data) - 0.5 + noise            # target: a quadratic function of x

# Placeholders for the inputs and targets
xs = tf.placeholder(tf.float32, [None, 1])
ys = tf.placeholder(tf.float32, [None, 1])

# Hidden layer: 1 -> 10, with ReLU activation
l1 = add_layer(xs, 1, 10, activation_function=tf.nn.relu)

# Output layer: 10 -> 1, no activation
out = add_layer(l1, 10, 1, activation_function=None)

# Mean squared error loss, optimized with plain gradient descent
# (compare against the placeholder ys, not the NumPy array, so the fed labels are actually used)
loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - out), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)

    for i in range(1000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if i % 10 == 0:
            print("loss :", sess.run(loss, feed_dict={xs: x_data, ys: y_data}))

Results:

loss : 1.4877208
loss : 0.019836485
loss : 0.012518918
loss : 0.010255886
loss : 0.008874094
loss : 0.007963703
loss : 0.007332341
loss : 0.006866762
loss : 0.00650165
loss : 0.006215293
loss : 0.00597182
loss : 0.0057607642
loss : 0.00557566
loss : 0.00541009
loss : 0.0052635786
loss : 0.005134573
loss : 0.005019676
loss : 0.0049204617
loss : 0.004835516
loss : 0.0047629676
loss : 0.0046962486
loss : 0.0046354295
loss : 0.004577974
loss : 0.004523345
loss : 0.0044735144
loss : 0.0044238633
loss : 0.0043758764
loss : 0.004328832
loss : 0.004284634
loss : 0.0042413296
loss : 0.004201067
loss : 0.0041636135
loss : 0.0041273576
loss : 0.004090899
loss : 0.0040571215
loss : 0.0040248944
loss : 0.0039935075
loss : 0.00396269
loss : 0.00393342
loss : 0.003904486
loss : 0.0038752127
loss : 0.0038460745
loss : 0.0038179613
loss : 0.003790757
loss : 0.0037632163
loss : 0.0037360494
loss : 0.0037103533
loss : 0.0036865429
loss : 0.0036634353
loss : 0.0036409772
loss : 0.0036192313
loss : 0.003596954
loss : 0.0035748594
loss : 0.0035531824
loss : 0.0035320013
loss : 0.0035111476
loss : 0.0034906054
loss : 0.0034705445
loss : 0.0034508999
loss : 0.0034321418
loss : 0.0034140071
loss : 0.0033953292
loss : 0.003376886
loss : 0.00335881
loss : 0.0033409866
loss : 0.0033231534
loss : 0.003306121
loss : 0.0032898246
loss : 0.0032742782
loss : 0.003259402
loss : 0.0032452985
loss : 0.0032321964
loss : 0.0032193994
loss : 0.0032069928
loss : 0.0031946145
loss : 0.0031817306
loss : 0.0031688544
loss : 0.003156051
loss : 0.0031430128
loss : 0.0031302636
loss : 0.003117249
loss : 0.0031045172
loss : 0.003090829
loss : 0.0030770234
loss : 0.0030633674
loss : 0.0030499054
loss : 0.0030367174
loss : 0.003023865
loss : 0.003011504
loss : 0.003000096
loss : 0.0029886991
loss : 0.0029776876
loss : 0.0029670242
loss : 0.002956797
loss : 0.002945893
loss : 0.002935217
loss : 0.0029246148
loss : 0.0029140003
loss : 0.0029035627
loss : 0.0028942765

You can see that the loss keeps decreasing.
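To see the fit visually, here is a matplotlib sketch of my own (not in the original post); it has to run before the session closes, e.g. inside the with-block after the training loop:

import matplotlib.pyplot as plt

prediction = sess.run(out, feed_dict={xs: x_data})
plt.scatter(x_data, y_data, s=5, label="noisy data")
plt.plot(x_data, prediction, "r-", lw=2, label="network fit")
plt.legend()
plt.show()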

 
