Understanding deep learning requires rethinking generalization
Paper summary

_Objective:_ Theoretical study of deep neural networks, their expressivity, and the role of regularization.

## Results:

The key findings of the article are:

### A. Deep neural networks easily fit random labels.

This holds whether the labels are randomized, the images are replaced with raw noise, or anything in between (a minimal sketch of this experiment follows the summary).

1. The effective capacity of neural networks is sufficient for memorizing the entire data set.
2. Even optimization on random labels remains easy. In fact, training time increases only by a small constant factor compared with training on the true labels.
3. Randomizing the labels is solely a data transformation, leaving all other properties of the learning problem unchanged.

### B. Explicit regularization may improve generalization performance, but is neither necessary nor by itself sufficient for controlling generalization error.

By explicit regularization they mean batch normalization, weight decay, dropout, data augmentation, etc.

### C. Generically, large neural networks can express any labeling of the training data.

More formally, a very simple two-layer ReLU network with `p = 2n + d` parameters can express any labeling of any sample of size `n` in `d` dimensions (see the construction sketch below).

### D. The optimization algorithm itself implicitly regularizes the solution.

SGD acts as an implicit regularizer, and models trained with SGD inherit this property; in the linear case, SGD provably converges to the minimum-norm solution (see the last sketch below).
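As a concrete illustration of point A, here is a minimal sketch of the randomization experiment. It assumes PyTorch; the pure-noise inputs, network size, and hyperparameters are illustrative stand-ins for the paper's CIFAR-10 setup, not the authors' code:

```python
# Minimal sketch: an over-parameterized MLP memorizes random labels on
# pure-noise inputs. Sizes and hyperparameters are illustrative only.
import torch
import torch.nn as nn

n, d, k = 512, 256, 10                    # samples, input dim, classes
x = torch.randn(n, d)                     # "images" of raw Gaussian noise
y = torch.randint(0, k, (n,))             # labels drawn uniformly at random

model = nn.Sequential(nn.Linear(d, 1024), nn.ReLU(), nn.Linear(1024, k))
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):                  # full-batch steps for simplicity
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()

with torch.no_grad():
    acc = (model(x).argmax(dim=1) == y).float().mean().item()
print(f"train accuracy on random labels: {acc:.3f}")  # typically reaches 1.0
```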
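Point C's claim is constructive (Theorem 1 in the paper). The sketch below builds one such interpolating network, `f(x) = sum_j w_j * relu(a·x - b_j)`: the projection `a` contributes `d` parameters, and the thresholds `b` and output weights `w` contribute `n` each, for `2n + d` total. The variable names and the NumPy setup are mine:

```python
# Sketch of the 2n + d construction: f(x) = sum_j w_j * relu(a.x - b_j)
# fits arbitrary labels y on any n distinct points in R^d.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 10
X = rng.standard_normal((n, d))           # any sample of n points in R^d
y = rng.standard_normal(n)                # arbitrary (here random) labels

a = rng.standard_normal(d)                # d parameters: random projection
z = X @ a                                 # projections are distinct w.p. 1
order = np.argsort(z)
X, y, z = X[order], y[order], z[order]

b = np.empty(n)                           # n parameters: thresholds that
b[0] = z[0] - 1.0                         # interleave the sorted projections
b[1:] = (z[:-1] + z[1:]) / 2.0

A = np.maximum(z[:, None] - b[None, :], 0.0)  # lower triangular, pos. diagonal
w = np.linalg.solve(A, y)                     # n parameters: output weights

f = np.maximum(X @ a - b, 0.0) @ w            # evaluate the two-layer network
print(np.abs(f - y).max())                    # ~0: every label fit exactly
```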
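Finally, the paper grounds point D in the linear case: with zero initialization, every SGD update lies in the row span of the data, so on an underdetermined least-squares problem SGD converges to the minimum-ℓ2-norm interpolating solution. A minimal sketch, assuming NumPy, with illustrative problem sizes and step size:

```python
# Sketch: SGD on an underdetermined linear regression (more dimensions than
# samples) converges to the minimum-norm solution, matching the pseudoinverse.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100                            # fewer samples than dimensions
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

w = np.zeros(d)                           # zero init keeps w in the row span
lr = 0.01
for epoch in range(5000):
    for i in rng.permutation(n):          # one-sample stochastic gradients
        w -= lr * (X[i] @ w - y[i]) * X[i]

w_min_norm = np.linalg.pinv(X) @ y        # minimum-norm interpolating solution
print(np.linalg.norm(X @ w - y))          # ~0: SGD interpolates the data
print(np.linalg.norm(w - w_min_norm))     # ~0: and picks the min-norm solution
```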
Understanding deep learning requires rethinking generalization
Chiyuan Zhang and Samy Bengio and Moritz Hardt and Benjamin Recht and Oriol Vinyals
arXiv e-Print archive - 2016
Keywords: cs.LG