Nguyen, Trung V.; Bonilla, Edwin V.
Date available: 2025-12-17
ISSN: 1532-4435
Handle: https://hdl.handle.net/1885/733795910

Abstract: In multi-output regression applications, the correlations between the response variables may vary with the input space and can be highly non-linear. Gaussian process regression networks (GPRNs) are flexible and effective models for representing such complex adaptive output dependencies. However, inference in GPRNs is intractable. In this paper we propose two efficient variational inference methods for GPRNs. The first method, gprn-mf, adopts a mean-field approach with full Gaussians over the GPRN's parameters as its factorizing distributions. The second method, gprn-npv, uses a nonparametric variational inference approach. We derive analytical forms for the evidence lower bound of both methods, which we use to learn the variational parameters and the hyperparameters of the GPRN model. We obtain closed-form updates for the parameters of gprn-mf and show that, while having relatively complex approximate posterior distributions, our approximate methods require the estimation of O(N) variational parameters rather than O(N²) for the parameters' covariances. Our experiments on real data sets show that gprn-npv may give a better approximation to the posterior distribution than gprn-mf, in terms of both predictive performance and stability.

Extent: 9 pages
Language: en
Publisher Copyright: Copyright 2013 by the authors.
Title: Efficient variational inference for Gaussian process regression networks
Year: 2013
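The abstract's mean-field recipe (a Gaussian variational factor, an analytic evidence lower bound, and closed-form updates for its parameters) can be illustrated on a much simpler conjugate toy model. The sketch below is purely illustrative and is not the GPRN model or the paper's algorithm: the model (a Gaussian prior over a scalar theta with Gaussian observations) and all function names are assumptions chosen so that both the ELBO and its optimum are available in closed form.

```python
import math

# Hypothetical toy model (illustration only, not the GPRN):
#   prior      theta ~ N(0, 1)
#   likelihood y_i   ~ N(theta, sigma2), i = 1..N
# Variational family: q(theta) = N(m, s^2), a single Gaussian factor.

def elbo(y, sigma2, m, s):
    """Analytic evidence lower bound: E_q[log p(y, theta)] + H[q]."""
    n = len(y)
    # E_q[log p(y | theta)]: expected Gaussian log-likelihood under q
    exp_loglik = (-0.5 * n * math.log(2 * math.pi * sigma2)
                  - sum((yi - m) ** 2 + s ** 2 for yi in y) / (2 * sigma2))
    # E_q[log p(theta)] for the N(0, 1) prior
    exp_logprior = -0.5 * math.log(2 * math.pi) - 0.5 * (m ** 2 + s ** 2)
    # Entropy of q(theta) = N(m, s^2)
    entropy = 0.5 * math.log(2 * math.pi * math.e * s ** 2)
    return exp_loglik + exp_logprior + entropy

def closed_form_update(y, sigma2):
    """Closed-form maximizer of the ELBO; for this conjugate toy model
    the optimal Gaussian factor is the exact posterior."""
    n = len(y)
    precision = 1.0 + n / sigma2       # prior precision + data precision
    m = (sum(y) / sigma2) / precision  # posterior mean
    s = math.sqrt(1.0 / precision)     # posterior standard deviation
    return m, s

y = [1.0, 2.0, 3.0]
m_star, s_star = closed_form_update(y, sigma2=1.0)
print(m_star, s_star)  # -> 1.5 0.5
```

Because the toy model is conjugate, the closed-form update lands exactly on the true posterior and the ELBO is maximized there; in the GPRN setting no such exact solution exists, which is why the paper's approximate posteriors and O(N) variational parameterizations matter.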