Dynamic Node Creation in Backpropagation Networks
Ash, Timur (1989)
Paper summary by martinthoma

Dynamic Node Creation (DNC) is a topology-learning method: it sequentially adds single nodes to the network until the desired accuracy is achieved.
DNC uses the logistic activation function and builds layered feed-forward architectures with a single hidden layer; training adds one neuron at a time to that hidden layer.
The authors expected this to be sufficient, since it had been shown that networks with only one hidden layer can model "any function of interest to an arbitrary selected precision". However, the number of neurons required may be very large, much larger than if one used more layers.
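The growth loop can be sketched as follows. This is not the paper's code: it is a minimal NumPy sketch under simplifying assumptions. In particular, the paper triggers node creation when the training-error curve flattens out, whereas this sketch uses a fixed epoch budget per network size; all hyperparameters (learning rate, epoch count, node cap) are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(W1, b1, W2, b2, X, y, lr, epochs):
    """Plain backprop on a one-hidden-layer logistic network; returns final MSE."""
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)            # hidden activations
        out = sigmoid(h @ W2 + b2)          # network output
        d_out = (out - y) * out * (1 - out) # logistic output delta
        d_h = (d_out @ W2.T) * h * (1 - h)  # logistic hidden delta
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

def dnc(X, y, target_mse=0.01, max_hidden=20, lr=0.5, epochs=2000, seed=0):
    """Grow the hidden layer one node at a time until the error target is met."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], y.shape[1]
    # start with a single hidden node
    W1 = rng.normal(scale=0.5, size=(n_in, 1)); b1 = np.zeros(1)
    W2 = rng.normal(scale=0.5, size=(1, n_out)); b2 = np.zeros(n_out)
    while True:
        mse = train(W1, b1, W2, b2, X, y, lr, epochs)
        if mse <= target_mse or W1.shape[1] >= max_hidden:
            return W1, mse
        # add one freshly initialised node; previously trained weights are kept
        W1 = np.hstack([W1, rng.normal(scale=0.5, size=(n_in, 1))])
        b1 = np.append(b1, 0.0)
        W2 = np.vstack([W2, rng.normal(scale=0.5, size=(1, n_out))])

# Toy usage: XOR, which a single hidden node cannot represent
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, mse = dnc(X, y)
n_hidden = W1.shape[1]
```

On XOR the network necessarily grows past one hidden node, illustrating the key point: capacity is added only when the current architecture cannot reach the error target.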
## Related Work
See [Meiosis Networks summary](http://www.shortscience.org/paper?bibtexKey=conf/nips/Hanson89#martinthoma) for many topology learning papers