Dynamic Node Creation in Backpropagation Networks
## Paper summary

Dynamic Node Creation (DNC) addresses topology learning: instead of fixing the architecture in advance, DNC sequentially adds single nodes to the network until the desired accuracy is achieved. DNC uses the logistic activation function and creates layered feed-forward architectures with exactly one hidden layer, so each step adds one neuron to that existing hidden layer. The authors expected this to be sufficient, as it had been shown that networks with only one hidden layer can model "any function of interest to an arbitrary selected precision". However, the required number of hidden neurons might be huge, and much larger than if one used more layers.

## Related Work

See the [Meiosis Networks summary](http://www.shortscience.org/paper?bibtexKey=conf/nips/Hanson89#martinthoma) for many topology-learning papers.
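The grow-until-accurate loop described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the helper names (`train_fixed`, `dnc`), the XOR task, and the stopping rule (a fixed target MSE instead of the paper's trigger-slope criterion on the error curve) are all assumptions made for the sketch. It trains a one-hidden-layer logistic network from scratch at each size and stops once the error target is met.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_fixed(X, y, n_hidden, rng, epochs=2000, lr=1.0):
    """Train a one-hidden-layer logistic network by batch backprop; return final MSE."""
    W1 = rng.normal(0, 0.5, (X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)        # hidden-layer activations
        out = sigmoid(h @ W2 + b2)      # network output
        err = out - y
        # Backpropagation with the logistic derivative s * (1 - s)
        d_out = err * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    return float(np.mean(err ** 2))

def dnc(X, y, target_mse=0.01, max_hidden=10, seed=0):
    """Grow the hidden layer one node at a time until target_mse is reached."""
    rng = np.random.default_rng(seed)
    for n_hidden in range(1, max_hidden + 1):
        mse = train_fixed(X, y, n_hidden, rng)
        if mse <= target_mse:
            break
    return n_hidden, mse
```

On XOR, for instance, a single hidden unit provably cannot reach a small error (the output is monotone in one linear projection of the input), so the loop must grow the hidden layer before it can stop.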
Dynamic Node Creation in Backpropagation Networks
Ash, Timur
Connection Science - 1989 via Bibsonomy
Keywords: nn

