Representation learning of compositional data
Authors
Avalos-Fernandez, Marta
Nock, Richard
Ong, Cheng Soon
Rouar, Julien
Sun, Ke
Publisher
Neural Information Processing Systems Foundation
Abstract
We consider the problem of learning a low-dimensional representation for compositional data. Compositional data consist of collections of nonnegative components that sum to a constant value. Since the parts of a composition are statistically dependent, many standard tools cannot be applied directly; instead, compositional data must first be transformed before analysis. Focusing on principal component analysis (PCA), we propose an approach that allows low-dimensional representation learning directly from the original data. Our approach combines the benefits of the log-ratio transformation from compositional data analysis with those of exponential family PCA. A key tool in its derivation is a generalization of the scaled Bregman theorem, which relates the perspective transform of a Bregman divergence to the Bregman divergence of a perspective transform plus a remainder conformal divergence. Our proposed approach includes a convenient surrogate (upper-bound) loss for exponential family PCA that has an easy-to-optimize form. We also derive the corresponding form for nonlinear autoencoders. Experiments on simulated data and on microbiome data show the promise of our method.
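For readers unfamiliar with the objects named in the abstract, the following is a hedged sketch of the classical scaled Bregman theorem (Nock et al.) in standard notation; the paper's generalization, with its remainder conformal divergence for non-affine gauges, is only indicated here, not reproduced:

\[
D_\varphi(x \,\|\, y) = \varphi(x) - \varphi(y) - \langle \nabla\varphi(y),\, x - y \rangle,
\qquad
\check{\varphi}(x) = g(x)\,\varphi\!\left(\frac{x}{g(x)}\right),
\]
\[
D_{\check{\varphi}}(x \,\|\, y) = g(x)\, D_\varphi\!\left(\frac{x}{g(x)} \,\Big\|\, \frac{y}{g(y)}\right)
\quad \text{for affine gauge } g;
\]
for general \(g\), the paper's generalization adds a remainder conformal divergence to the right-hand side.

As a further point of reference, the classical compositional-data pipeline the abstract contrasts with is a log-ratio transform followed by ordinary PCA. The Python sketch below illustrates only that two-step baseline, not the paper's method; the centered log-ratio (CLR) transform, the pseudocount for zeros, and the simulated Dirichlet data are standard choices assumed for this example.

```python
import numpy as np
from sklearn.decomposition import PCA

def clr(X, pseudocount=1e-6):
    """Centered log-ratio (CLR) transform of compositional data.

    Each row of X holds nonnegative parts summing to a constant.
    Zeros are shifted by a small pseudocount before taking logs,
    and each row is re-closed to the simplex.
    """
    X = np.asarray(X, dtype=float) + pseudocount
    X = X / X.sum(axis=1, keepdims=True)            # re-close to sum to 1
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)  # center by log of row geometric mean

# Simulated compositions: 200 samples over 10 parts from a Dirichlet.
rng = np.random.default_rng(0)
X = rng.dirichlet(alpha=np.ones(10), size=200)

# Low-dimensional representation: CLR transform, then standard PCA.
Z = PCA(n_components=2).fit_transform(clr(X))
print(Z.shape)  # (200, 2)
```

CLR-then-PCA is the transform-first pipeline that the paper's single surrogate loss, acting directly on the original data, is designed to replace.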
Source
Advances in Neural Information Processing Systems
Access Statement
Free Access via publisher website
Restricted until
2099-12-31