Variational Inference for Mahalanobis Distance Metrics in Gaussian Process Regression
Paper summary

In a GP regression model, the process outputs can be integrated out analytically, but the same is not true for (a) the inputs and (b) the kernel hyperparameters. Titsias et al. (2010) showed a very clever way to do (a) with a particular variational technique (there, the goal was density estimation). This paper tackles (b), which requires some nontrivial extensions of that earlier work. In particular, the authors show how to decouple the GP prior from the kernel hyperparameters; this is a simple trick, but very effective for what they want to do. They also handle the large number of kernel hyperparameters with an additional level of ARD, and show that the ARD hyperparameters can be solved for analytically, which is nice.
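To make the decoupling trick concrete (a sketch based on the paper's setup; the notation below is mine, not quoted from the paper): the Mahalanobis squared-exponential kernel

$$ k_W(\mathbf{x},\mathbf{x}') \;=\; \sigma_f^2 \exp\!\Big(-\tfrac{1}{2}\,\lVert W\mathbf{x} - W\mathbf{x}' \rVert^2\Big) $$

is exactly the kernel of the composite function $f(\mathbf{x}) = g(W\mathbf{x})$, where

$$ g \;\sim\; \mathcal{GP}\!\Big(0,\; \sigma_f^2\, e^{-\frac{1}{2}\lVert \mathbf{z}-\mathbf{z}' \rVert^2}\Big) $$

has a fixed, hyperparameter-free kernel. After this rewriting, the GP prior over $g$ no longer involves $W$ at all; the kernel hyperparameters $W$ now enter only as a (random) transformation of the inputs, which puts us back in the uncertain-inputs setting that the variational machinery of Titsias et al. (2010) was built for, so $W$ can be integrated out variationally.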
Titsias, Michalis K. and Lázaro-Gredilla, Miguel
Neural Information Processing Systems (NIPS), 2013