Fast Algorithms for Gaussian Noise Invariant Independent Component Analysis
Paper summary

This paper presents a fast ICA algorithm that is invariant to additive Gaussian noise. This is demonstrated with components simulated from different univariate distributions under varying levels of Gaussian noise. The writing is clear. The paper is incremental in the sense that it builds on ideas from Belkin et al. (2013) but focuses on speeding up and improving their cumulant-based approach. This is achieved via:

1. a Hessian expansion of the cumulant-tensor-based quasi-orthogonalization;
2. gradient-based iterations that preserve quasi-orthogonalization of the latent factors in the noisy case, and whitening in the noiseless case.

This paper proposes a cumulant-based independent component analysis (ICA) algorithm for source separation in the presence of additive Gaussian noise. The algorithm is somewhat incremental, building upon Refs. [2] and [3], but appears technically correct, with experimental results confirming the claims made. The algorithms used for benchmarking assume no additive noise, but are, like Infomax, often quite robust to added noise.
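The two ingredients above can be sketched roughly as follows. This is a minimal illustration under my own assumptions (function names, the random probe direction, and the positive-kurtosis restriction are mine), not the paper's implementation. The key fact it relies on: the fourth cumulant of a projection u'x is unchanged by additive Gaussian noise, so the Hessian of that cumulant gives a noise-invariant matrix whose inverse square root can play the role of a whitening transform.

```python
import numpy as np

def k4_hessian(X, u):
    """Sample estimate of the Hessian of the fourth cumulant of u'x.

    For f(u) = E[(u'x)^4] - 3 (E[(u'x)^2])^2 the Hessian works out to
        12 E[(u'x)^2 xx'] - 12 E[(u'x)^2] E[xx'] - 24 m m',
    with m = E[(u'x) x]. Additive Gaussian noise leaves the fourth
    cumulant, and hence this Hessian, unchanged in expectation.
    """
    n = len(X)
    s = X @ u                                     # projections u'x per sample
    C = X.T @ X / n                               # E[xx']
    m = X.T @ s / n                               # E[(u'x) x]
    H = 12.0 * (X * (s ** 2)[:, None]).T @ X / n  # 12 E[(u'x)^2 xx']
    H -= 12.0 * np.mean(s ** 2) * C
    H -= 24.0 * np.outer(m, m)
    return H

def quasi_orthogonalize(X, seed=None):
    """Return (X W, W) with W = H^{-1/2}, H the k4 Hessian at a random u.

    Assumes positive source kurtoses so H is positive definite (the paper
    handles mixed-sign cumulants with extra bookkeeping).
    """
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(X.shape[1])
    u /= np.linalg.norm(u)
    H = k4_hessian(X, u)
    vals, vecs = np.linalg.eigh((H + H.T) / 2)    # symmetrize before eigh
    # abs() guards against small-sample sign flips of tiny eigenvalues
    W = vecs @ np.diag(np.abs(vals) ** -0.5) @ vecs.T
    return X @ W, W

def gi_update(X, u):
    """One normalized gradient step on k4, in the spirit of the paper's
    gradient iterations (grad f = 4 E[(u'x)^3 x] - 12 E[(u'x)^2] E[(u'x) x])."""
    n = len(X)
    s = X @ u
    g = 4.0 * X.T @ (s ** 3) / n - 12.0 * np.mean(s ** 2) * (X.T @ s / n)
    return g / np.linalg.norm(g)
```

In the noiseless case the covariance itself could be used for whitening; the point of taking the Hessian of the fourth cumulant instead is that the Gaussian noise covariance drops out.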
Fast Algorithms for Gaussian Noise Invariant Independent Component Analysis
Voss, James R. and Rademacher, Luis and Belkin, Mikhail
Neural Information Processing Systems Conference - 2013 via Bibsonomy
Keywords: dblp

