“We propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function. The GELU nonlinearity is the expected transformation of a stochastic regularizer which randomly applies the identity or zero map to a neuron’s input. The GELU nonlinearity weights inputs by their magnitude, rather than gating inputs by their sign as in ReLUs.”

[1606.08415] Gaussian Error Linear Units (GELUs)
http://jhavelikes.tumblr.com/post/184177364175
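
For reference, the activation the abstract describes is GELU(x) = x · Φ(x), where Φ is the standard Gaussian CDF: multiplying the input by a Bernoulli(Φ(x)) mask and taking the expectation gives exactly x · Φ(x). A minimal Python sketch of the exact (erf-based) form, not code from the paper:

    import math

    def gelu(x):
        # GELU(x) = x * Phi(x), with Phi the standard normal CDF.
        # Phi(x) = 0.5 * (1 + erf(x / sqrt(2))), so the whole thing is closed-form.
        return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

    # Negative inputs are shrunk smoothly rather than hard-zeroed as in ReLU:
    # gelu(-1.0) ≈ -0.159, gelu(0.0) == 0.0, gelu(1.0) ≈ 0.841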
