Representation learning of compositional data

dc.contributor.author: Avalos-Fernandez, Marta
dc.contributor.author: Nock, Richard
dc.contributor.author: Ong, Cheng Soon
dc.contributor.author: Rouar, Julien
dc.contributor.author: Sun, Ke
dc.contributor.editor: K Grauman
dc.contributor.editor: N Cesa-Bianchi
dc.contributor.editor: R Garnett
dc.contributor.editor: S Bengio
dc.contributor.editor: H Larochelle
dc.contributor.editor: H Wallach
dc.coverage.spatial: Montreal, Canada
dc.date.accessioned: 2024-02-12T23:39:36Z
dc.date.created: December 2-8 2018
dc.date.issued: 2018
dc.date.updated: 2022-10-02T07:19:31Z
dc.description.abstract: We consider the problem of learning a low-dimensional representation for compositional data. Compositional data consist of a collection of nonnegative components that sum to a constant value. Since the parts of the collection are statistically dependent, many standard tools cannot be directly applied; instead, compositional data must first be transformed before analysis. Focusing on principal component analysis (PCA), we propose an approach that allows low-dimensional representation learning directly from the original data. Our approach combines the benefits of the log-ratio transformation from compositional data analysis and exponential family PCA. A key tool in its derivation is a generalization of the scaled Bregman theorem, which relates the perspective transform of a Bregman divergence to the Bregman divergence of a perspective transform and a remainder conformal divergence. Our proposed approach includes a convenient surrogate (upper bound) loss for the exponential family PCA which has an easy-to-optimize form. We also derive the corresponding form for nonlinear autoencoders. Experiments on simulated data and microbiome data show the promise of our method.
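The abstract contrasts the proposed method with the standard compositional-data pipeline of a log-ratio transform followed by ordinary PCA. A minimal sketch of that baseline pipeline (using the centred log-ratio transform and NumPy; this illustrates the classical approach the paper improves on, not the paper's own method, and the toy data below is invented for illustration):

```python
import numpy as np

def clr(x, eps=1e-9):
    """Centred log-ratio transform: log(x) minus the row-wise mean of log(x).

    Each row of x is a composition (nonnegative, summing to a constant);
    eps guards against log(0). CLR rows sum to (approximately) zero.
    """
    logx = np.log(x + eps)
    return logx - logx.mean(axis=1, keepdims=True)

# Toy compositional data: normalise nonnegative draws so each row sums to 1.
rng = np.random.default_rng(0)
raw = rng.gamma(shape=2.0, size=(100, 5))
comp = raw / raw.sum(axis=1, keepdims=True)

# Standard PCA on the transformed data, via SVD of the column-centred matrix.
z = clr(comp)
z_centred = z - z.mean(axis=0)
_, _, vt = np.linalg.svd(z_centred, full_matrices=False)
scores = z_centred @ vt[:2].T  # 2-D representation of each composition
```

The point of the paper is to avoid this two-step transform-then-analyse recipe and instead learn the low-dimensional representation directly from the raw compositions.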
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/1885/313431
dc.language.iso: en_AU
dc.publisher: Neural Information Processing Systems Foundation
dc.relation.ispartofseries: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018
dc.rights: © 2018 Neural Information Processing Systems Foundation
dc.source: Advances in Neural Information Processing Systems
dc.source.uri: https://proceedings.neurips.cc/paper_files/paper/2018/hash/664dd858db942cad06f24ff25df56716-Abstract.html
dc.title: Representation learning of compositional data
dc.type: Conference paper
dcterms.accessRights: Free Access via publisher website
local.bibliographicCitation.lastpage: 6689
local.bibliographicCitation.startpage: 6679
local.contributor.affiliation: Avalos-Fernandez, Marta, Université de Bordeaux
local.contributor.affiliation: Nock, Richard, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Ong, Cheng Soon, College of Engineering and Computer Science, ANU
local.contributor.affiliation: Rouar, Julien, Université de Bordeaux
local.contributor.affiliation: Sun, Ke, Data61
local.contributor.authoruid: Nock, Richard, u5647716
local.contributor.authoruid: Ong, Cheng Soon, u4028825
local.description.embargo: 2099-12-31
local.description.notes: Imported from ARIES
local.description.refereed: Yes
local.identifier.absfor: 461199 - Machine learning not elsewhere classified
local.identifier.ariespublication: u3102795xPUB1760
local.identifier.scopusID: 2-s2.0-85064842415
local.publisher.url: https://proceedings.neurips.cc/paper_files/paper/2018/hash/664dd858db942cad06f24ff25df56716-Abstract.html
local.type.status: Published Version

Downloads

Original bundle

Name: NeurIPS-2018-representation-learning-of-compositional-data-Paper.pdf
Size: 757.33 KB
Format: Adobe Portable Document Format