rectified units

from: http://en.wikipedia.org/wiki/Rectifier_(neural_networks)

In the context of artificial neural networks, the rectifier is an activation function defined as

f(x) = max(0, x)
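As a minimal sketch, the rectifier can be written in a few lines of NumPy (the function name `relu` is our choice, not from the article):

```python
import numpy as np

def relu(x):
    # Rectifier activation: element-wise max(0, x).
    # Negative inputs are clamped to zero; non-negative inputs pass through.
    return np.maximum(0, x)

# Example: negative entries become 0, positive entries are unchanged.
print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```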

Noisy ReLUs

Rectified linear units can be extended to include Gaussian noise, making them noisy ReLUs, giving[3]

f(x) = max(0, x + N(0, σ(x)))

Noisy ReLUs have been used with some success in restricted Boltzmann machines for computer vision tasks.[3]
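A noisy ReLU can be sketched by sampling Gaussian noise before rectifying. For simplicity this sketch treats the noise scale `sigma` as a constant (an assumption; in the formula above σ(x) may depend on the input):

```python
import numpy as np

def noisy_relu(x, sigma=0.1, rng=None):
    # Noisy ReLU: max(0, x + noise), with Gaussian noise added to the
    # pre-activation. `sigma` is assumed constant here for illustration.
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(0.0, sigma, size=np.shape(x))
    return np.maximum(0, x + noise)
```

With `sigma=0`, this reduces exactly to the plain rectifier; the noise is typically only applied during training.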

posted @ 2014-10-28 15:29  caoeryingzi