DelugeNets: Deep Networks with Efficient and Flexible Cross-layer Information Inflows
Jason Kuen, Xiangfei Kong, Gang Wang, Yap-Peng Tan
arXiv e-Print archive - 2016
Keywords:
cs.CV, cs.LG, cs.NE
First published: 2016/11/17

Abstract: Deluge Networks (DelugeNets) are deep neural networks which efficiently
facilitate massive cross-layer information inflows from preceding layers to
succeeding layers. The connections between layers in DelugeNets are established
through cross-layer depthwise convolutional layers with learnable filters,
acting as a flexible yet efficient selection mechanism. DelugeNets can
propagate information across many layers with greater flexibility and utilize
network parameters more effectively compared to ResNets, whilst being more
efficient than DenseNets. Remarkably, a DelugeNet model with a model complexity
of just 4.31 GigaFLOPs and 20.2M network parameters achieves classification
errors of 3.76% and 19.02% on the CIFAR-10 and CIFAR-100 datasets,
respectively. Moreover, DelugeNet-122 performs competitively with ResNet-200 on
the ImageNet dataset, despite costing merely half of the computations needed by
the latter.
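To make the "cross-layer depthwise convolution" idea concrete, here is a minimal NumPy sketch (my own illustration, not the authors' code): each block's input channel `c` is a learned weighted sum of channel `c` across all `L` preceding layers, so the selection costs only `L*C` parameters instead of the `L*C*C` a full 1x1 convolution over concatenated features would need. The function name and shapes are my assumptions for illustration.

```python
import numpy as np

def cross_layer_depthwise(features, weights):
    """Hedged sketch of a cross-layer depthwise combination.

    features: list of L arrays, each of shape (C, H, W), from preceding layers.
    weights:  array of shape (L, C) -- one learnable scalar per (layer, channel).
    Returns an array of shape (C, H, W): per-channel weighted sum over layers.
    """
    stacked = np.stack(features)          # (L, C, H, W)
    w = weights[:, :, None, None]         # broadcast weights over H and W
    return (stacked * w).sum(axis=0)      # collapse the layer axis -> (C, H, W)

# Tiny usage example with random features from L=3 preceding layers.
rng = np.random.default_rng(0)
L, C, H, W = 3, 4, 8, 8
feats = [rng.standard_normal((C, H, W)) for _ in range(L)]
w = rng.standard_normal((L, C))
out = cross_layer_depthwise(feats, w)
assert out.shape == (C, H, W)
```

The depthwise restriction (channels never mix across the layer-selection step) is what keeps this cheaper than DenseNet-style dense connectivity while still letting each channel pick which earlier layers to draw from.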
It's not clear to me what exactly the difference between DenseNets and DelugeNets is; the abstract only claims DelugeNets are more efficient.
## Evaluation
* CIFAR-10: 3.76% error (DenseNet: )
* CIFAR-100: 19.02% error
## See also
* [reddit](https://www.reddit.com/r/MachineLearning/comments/5l0k6w/r_delugenets_deep_networks_with_massive_and/)
* [DenseNet](http://www.shortscience.org/paper?bibtexKey=journals/corr/1608.06993)