The Neural Autoregressive Distribution Estimator
Larochelle, Hugo and Murray, Iain (2011)

Paper summary

#### Problem addressed:
Density estimation
#### Summary:
This paper presents a tractable density estimator inspired by the RBM. The inspiration comes from applying a mean-field approximation to the actual RBM probabilities: it turns out that one step of the mean-field approximation corresponds exactly to a one-hidden-layer feed-forward neural network with tied weights.
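The resulting model factorizes the joint probability autoregressively, with each conditional produced by the shared-weight hidden layer. A minimal sketch of this computation (variable names are illustrative, not from the paper's code) might look like:

```python
import numpy as np

def nade_log_prob(v, W, V, b, c):
    """Log-probability of a binary vector v under a NADE-style model.

    p(v) = prod_d p(v_d | v_<d), where each conditional comes from a
    one-hidden-layer network whose input weights W are tied across positions.
    v: (D,) binary; W: (H, D) input weights; V: (D, H) output weights;
    b: (D,) visible biases; c: (H,) hidden biases.
    """
    D = len(v)
    a = c.copy()  # running hidden pre-activation; tied weights let us update it incrementally
    log_p = 0.0
    for d in range(D):
        h = 1.0 / (1.0 + np.exp(-a))                      # hidden activations given v_<d
        p_d = 1.0 / (1.0 + np.exp(-(b[d] + V[d] @ h)))    # p(v_d = 1 | v_<d)
        log_p += np.log(p_d) if v[d] == 1 else np.log(1.0 - p_d)
        a += W[:, d] * v[d]                               # fold v_d into the shared pre-activation
    return log_p

# Toy usage with random parameters
rng = np.random.default_rng(0)
D, H = 8, 4
W = rng.normal(size=(H, D))
V = rng.normal(size=(D, H))
b, c = np.zeros(D), np.zeros(H)
v = rng.integers(0, 2, size=D).astype(float)
print(nade_log_prob(v, W, V, b, c))
```

Because the model is an explicit product of conditionals, the density is properly normalized by construction, and evaluating it costs O(HD) rather than requiring an intractable partition function as in the RBM.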
#### Novelty:
Identifies the link between one step of the mean-field approximation on an RBM and a one-hidden-layer neural network, and uses it to derive a tractable density estimator that performs well.
#### Drawbacks:
There is still a need to pre-select an ordering of the variables before running the algorithm.
#### Datasets:
MNIST, binary observation datasets from Larochelle 2010
#### Additional remarks:
The KL divergence minimization derivation is quite tricky (Eqs. 7 and 8).
#### Resources:
#### Presenter:
Yingbo Zhou