Self-Normalizing Neural Networks
Paper summary

"Using the SELU activation function, you get better results than with other activation functions, and you don't have to do batch normalization. The SELU activation function is:

    if x < 0:  1.0507 * (1.6733 * e^x - 1.6733)
    if x >= 0: 1.0507 * x"

Source: narfon2, reddit

```
import numpy as np

def selu(x):
    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
    return scale * np.where(x >= 0.0, x, alpha * np.exp(x) - alpha)
```

Source: CaseOfTuesday, reddit

Discussion here: https://www.reddit.com/r/MachineLearning/comments/6g5tg1/r_selfnormalizing_neural_networks_improved_elu/
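A quick way to see the self-normalizing property the paper claims (this check is my own sketch, not part of the quoted summary): with LeCun-normal-initialized weights, stacking SELU layers keeps activations roughly zero-mean and unit-variance, which is why batch normalization becomes unnecessary.

```
import numpy as np

def selu(x):
    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
    return scale * np.where(x >= 0.0, x, alpha * np.exp(x) - alpha)

# Sketch: push standard-normal inputs through a few random dense layers with
# LeCun-normal weights (variance 1/fan_in, the initialization the paper assumes)
# and watch the mean/std of the activations stay close to 0/1.
rng = np.random.default_rng(0)
x = rng.standard_normal((1024, 256))
for layer in range(5):
    w = rng.standard_normal((256, 256)) / np.sqrt(256)  # LeCun normal init
    x = selu(x @ w)
    print(f"layer {layer + 1}: mean={x.mean():+.3f}, std={x.std():.3f}")
```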
Self-Normalizing Neural Networks
Günter Klambauer and Thomas Unterthiner and Andreas Mayr and Sepp Hochreiter
arXiv e-Print archive - 2017 via arXiv
Keywords: cs.LG, stat.ML
