Efficient variational inference for Gaussian process regression networks

Authors

Nguyen, Trung V.
Bonilla, Edwin V.

Abstract

In multi-output regression applications the correlations between the response variables may vary with the input space and can be highly non-linear. Gaussian process regression networks (GPRNs) are flexible and effective models for representing such complex adaptive output dependencies. However, inference in GPRNs is intractable. In this paper we propose two efficient variational inference methods for GPRNs. The first method, gprn-mf, adopts a mean-field approach with full Gaussians over the GPRN's parameters as its factorizing distributions. The second method, gprn-npv, uses a nonparametric variational inference approach. We derive analytical forms for the evidence lower bound of both methods, which we use to learn the variational parameters and the hyperparameters of the GPRN model. We obtain closed-form updates for the parameters of gprn-mf and show that, despite having relatively complex approximate posterior distributions, our approximate methods require the estimation of only O(N) variational parameters rather than O(N²) for the parameters' covariances. Our experiments on real data sets show that gprn-npv may give a better approximation to the posterior distribution than gprn-mf, in terms of both predictive performance and stability.
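The GPRN model that the abstract builds on combines GP-valued mixing weights with latent GP functions, y(x) = W(x) f(x) + noise, where each entry of the P × Q matrix W(x) and each of the Q latent functions is an independent GP. A minimal sketch of sampling from such a prior with NumPy (the kernel choice, function names, and settings here are illustrative, not taken from the paper):

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix for 1-D inputs X of shape (N,).
    d = X[:, None] - X[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gprn_prior(X, P=2, Q=2, noise=0.1, seed=0):
    """Draw one sample from a GPRN prior: y(x) = W(x) f(x) + eps.

    Each entry of the P x Q mixing matrix W(x) and each of the Q latent
    functions f(x) is an independent GP draw over the inputs X.
    """
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    K = rbf_kernel(X) + 1e-8 * np.eye(N)  # jitter for numerical stability
    L = np.linalg.cholesky(K)
    # W: shape (N, P, Q) -- one GP sample per mixing weight
    W = np.einsum('nm,mpq->npq', L, rng.standard_normal((N, P, Q)))
    # f: shape (N, Q) -- Q latent GP samples
    f = L @ rng.standard_normal((N, Q))
    # Pointwise matrix-vector product gives the P outputs at each input
    y = np.einsum('npq,nq->np', W, f) + noise * rng.standard_normal((N, P))
    return y

X = np.linspace(0.0, 1.0, 50)
y = sample_gprn_prior(X)
print(y.shape)  # (50, 2)
```

Because both W(x) and f(x) are random functions, the effective correlations between the P outputs change with x, which is the "adaptive output dependency" the abstract refers to; the paper's contribution is approximating the (intractable) posterior over W and f variationally.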

Source

Journal of Machine Learning Research
