Neural Networks and Deep Learning (1.1) Using neural networks to recognize handwritten digits

1. Perceptrons

A perceptron takes several binary inputs, x1, x2, ..., and produces a single binary output.

weights: real numbers expressing the importance of the respective inputs to the output.

The neuron's output, 0 or 1, is determined by whether the weighted sum Σj wj·xj is less than or greater than some threshold value. Just like the weights, the threshold is a real number which is a parameter of the neuron.
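This decision rule can be sketched directly (a minimal illustration; the inputs, weights, and threshold below are made-up values, not from the text):

```python
def perceptron(inputs, weights, threshold):
    """Return 1 if the weighted sum of the inputs exceeds the threshold, else 0."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else 0

# Hypothetical neuron: two inputs, both weighted 0.6, threshold 1.0.
print(perceptron([1, 1], [0.6, 0.6], 1.0))  # both inputs on -> 1.2 > 1.0 -> fires: 1
print(perceptron([1, 0], [0.6, 0.6], 1.0))  # one input on  -> 0.6 <= 1.0 -> off: 0
```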

In algebraic terms:

output = 0 if Σj wj·xj ≤ threshold
output = 1 if Σj wj·xj > threshold

Rewritten with the bias b ≡ -threshold:

output = 0 if w·x + b ≤ 0
output = 1 if w·x + b > 0

A NAND gate

Example: a perceptron with two inputs, weights w1 = w2 = -2, and bias b = 3 (the values used in the book). Then z = w·x + b for each input pair is:

0,0 --> positive (z = 3),  output 1
0,1 --> positive (z = 1),  output 1
1,0 --> positive (z = 1),  output 1
1,1 --> negative (z = -1), output 0
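Using the book's NAND weights (w1 = w2 = -2, bias b = 3), the truth table above can be verified in a few lines:

```python
def nand_perceptron(x1, x2, w=(-2, -2), b=3):
    """Perceptron with weights -2, -2 and bias 3: outputs 1 unless both inputs are 1."""
    z = w[0] * x1 + w[1] * x2 + b
    return 1 if z > 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, '->', nand_perceptron(x1, x2))
# 0 0 -> 1, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0: exactly a NAND gate
```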

we can use perceptrons to compute simple logical functions.

In fact, we can use networks of perceptrons to compute any logical function at all.

  

2. Sigmoid neurons

The key property: small changes in any weight (or bias) cause only a small corresponding change in the output.

    

By changing the weights and biases over and over to produce better and better output, the network learns.

The sigmoid function:

σ(z) = 1 / (1 + e^(-z))

The output of a sigmoid neuron with inputs x1, x2, ..., weights w1, w2, ..., and bias b is:

output = σ(w·x + b) = 1 / (1 + exp(-Σj wj·xj - b))

z = w·x + b            output (i.e. σ(z))
large and positive     ≈ 1
large and negative     ≈ 0
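This behaviour is easy to check numerically (a minimal sketch; `sigmoid_output` is an illustrative helper name, not from the book):

```python
import math

def sigmoid(z):
    """The sigmoid function: sigma(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_output(xs, ws, b):
    """Output of a sigmoid neuron: sigma(w.x + b)."""
    z = sum(w * x for w, x in zip(ws, xs)) + b
    return sigmoid(z)

print(sigmoid(0))    # 0.5, exactly halfway
print(sigmoid(10))   # ~0.99995: large positive z gives output near 1
print(sigmoid(-10))  # ~0.0000454: large negative z gives output near 0
```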

In fact, the exact form of the function isn't so important - what really matters is the shape of the function when plotted.

  

This shape is a smoothed-out version of a step function:

step(z) = 0 if z < 0, 1 if z ≥ 0

If σ were actually a step function, the sigmoid neuron would be a perceptron; for large |z|, σ(z) is well approximated by the step function.


3. The architecture of neural networks

input layer, output layer, hidden layer (which means nothing more than "not an input or an output")


Feedforward networks:

  There are no loops in the network - information is always fed forward, never fed back.
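A feedforward pass is just "weighted sum, then activation" repeated layer by layer, each layer's output feeding the next (a minimal sketch with made-up weights; a real network learns these values):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def feedforward(x, layers):
    """Propagate input x through a list of layers.

    Each layer is (weights, biases), where weights[j] is the weight
    vector of neuron j. Information only flows forward: the output of
    one layer is the input of the next, and there are no loops.
    """
    activation = x
    for weights, biases in layers:
        activation = [sigmoid(sum(w * a for w, a in zip(ws, activation)) + b)
                      for ws, b in zip(weights, biases)]
    return activation

# Hypothetical 2-3-1 network with arbitrary fixed weights.
hidden = ([[0.1, 0.4], [-0.2, 0.3], [0.5, -0.1]], [0.0, 0.1, -0.1])
output = ([[0.3, -0.6, 0.2]], [0.05])
print(feedforward([1.0, 0.5], [hidden, output]))  # a single value in (0, 1)
```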

Recurrent neural networks:

  The idea in these models is to have neurons which fire for some limited duration of time, before becoming quiescent. That firing can stimulate other neurons, which may fire a little while later, also for a limited duration. That causes still more neurons to fire, and so over time we get a cascade of neurons firing. Loops don't cause problems in such a model, since a neuron's output only affects its input at some later time, not instantaneously.

4. A simple network to classify handwritten digits

Two sub-problems:

(1) Breaking an image containing many digits into a sequence of separate images, each containing a single digit. For example, break a six-digit image into six separate images, one per digit.

(2) Classifying each individual digit, e.g. recognizing that a single-digit image is a 5.

 

A three-layer neural network:

with 784 neurons in the input layer (one per pixel of a 28×28 input image), 15 neurons in the hidden layer, and 10 neurons in the output layer (one per digit, 0 through 9).
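With those layer sizes, the weight-matrix shapes and total parameter count follow directly (a small sanity check, not code from the book):

```python
sizes = [784, 15, 10]  # input (28x28 pixels), hidden, output (digits 0-9)

# One weight per connection between adjacent layers,
# plus one bias per non-input neuron.
n_weights = sum(a * b for a, b in zip(sizes[:-1], sizes[1:]))
n_biases = sum(sizes[1:])
print(n_weights, n_biases, n_weights + n_biases)  # 11910 25 11935
```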

posted @ 2017-03-02 11:34 zhoulixue