Particle Gibbs for Infinite Hidden Markov Models
Paper summary

The paper proposes a sampler for iHMMs that, as the authors show, has improved mixing properties and outperforms existing state-of-the-art sampling methods on posterior inference problems. An existing Gibbs sampler is turned into a particle Gibbs sampler by using a conditional SMC step to sample the latent sequence of states. The paper exploits conjugacy to derive optimal SMC proposals and uses ancestor sampling to improve the performance of the conditional SMC step. The result is more efficient sampling of the latent states, making the sampler robust to spurious states and yielding faster convergence.
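To make the conditional SMC step concrete, here is a minimal sketch of particle Gibbs with ancestor sampling for a *finite*-state HMM, not the paper's iHMM sampler: it uses a bootstrap (transition-kernel) proposal rather than the optimal proposal the paper derives, and all names (`conditional_smc`, `trans`, `emit`, `init`, `ref_path`) are illustrative assumptions. One particle is clamped to the retained reference trajectory, and its ancestor is resampled at each step in proportion to the weight times the transition probability into the reference state.

```python
import numpy as np

def conditional_smc(y, ref_path, trans, emit, init, n_particles=10, rng=None):
    """Conditional SMC with ancestor sampling for a finite-state HMM.

    Illustrative sketch only (not the paper's iHMM sampler):
    `trans` is a K x K transition matrix, `emit[k, y_t]` the emission
    likelihood, `init` the initial-state distribution, and particle 0
    is clamped to the reference path, as in particle Gibbs.
    """
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    K = trans.shape[0]
    particles = np.zeros((n_particles, T), dtype=int)

    # Initialise; particle 0 is the retained reference trajectory.
    particles[:, 0] = rng.choice(K, size=n_particles, p=init)
    particles[0, 0] = ref_path[0]
    weights = emit[particles[:, 0], y[0]]
    weights /= weights.sum()

    for t in range(1, T):
        # Multinomial resampling of ancestors for the free particles.
        ancestors = rng.choice(n_particles, size=n_particles, p=weights)
        # Ancestor sampling for the reference particle: draw its ancestor
        # with probability proportional to w_i * p(ref_t | x_{t-1}^i).
        as_w = weights * trans[particles[:, t - 1], ref_path[t]]
        ancestors[0] = rng.choice(n_particles, p=as_w / as_w.sum())
        particles = particles[ancestors]
        # Propagate free particles from the transition kernel (bootstrap).
        for i in range(1, n_particles):
            particles[i, t] = rng.choice(K, p=trans[particles[i, t - 1]])
        particles[0, t] = ref_path[t]
        weights = emit[particles[:, t], y[t]]
        weights /= weights.sum()

    # Draw one trajectory as the new latent state sequence.
    idx = rng.choice(n_particles, p=weights)
    return particles[idx]
```

Within a Gibbs sweep, the returned trajectory would replace the reference path on the next iteration, with the remaining parameters resampled conditionally in between.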

Summary by NIPS Conference Reviews 5 years ago