
Weakly supervised learning via statistical sufficiency

Date

2016

Authors

Patrini, Giorgio

Abstract

This thesis introduces a novel algorithmic framework for weakly supervised learning, namely, for any problem between supervised and unsupervised learning from the standpoint of labels. Weak supervision is the reality in many applications of machine learning, where training is performed with partially missing, aggregated-level and/or noisy labels. The approach is grounded in the concept of statistical sufficiency and its transposition to loss functions. Our solution is problem-agnostic yet constructive, as it boils down to a simple two-step procedure. First, estimate a sufficient statistic for the labels from weak supervision. Second, plug the estimate into a (newly defined) linear-odd loss function and learn the model with any gradient-based solver, with a simple adaptation. We apply the same approach to several challenging learning problems: (i) learning from label proportions, (ii) learning with noisy labels for both linear classifiers and deep neural networks, and (iii) learning from feature-wise distributed datasets where the entity-matching function is unknown.
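
For a concrete picture of the two-step procedure, the following is a minimal NumPy sketch, not the thesis's exact algorithm. It assumes the noisy-label setting with a known symmetric flip rate rho, takes the logistic loss as the linear-odd loss, and uses plain gradient descent as the solver; the synthetic data, the unbiased mean-operator estimator, and all names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: two Gaussian classes with labels y in {-1, +1}.
    n, d = 1000, 5
    y = rng.choice([-1, 1], size=n)
    X = rng.normal(size=(n, d)) + 0.8 * y[:, None]

    # Step 1: estimate the sufficient statistic for the labels, the "mean
    # operator" mu = E[y x], from weak supervision. Weak supervision is
    # simulated here as labels flipped with a known rate rho; dividing by
    # (1 - 2*rho) unbiases the naive estimate (assumes symmetric noise).
    rho = 0.2
    y_noisy = np.where(rng.random(n) < rho, -y, y)
    mu_hat = (y_noisy[:, None] * X).mean(axis=0) / (1.0 - 2.0 * rho)

    # Step 2: plug mu_hat into a linear-odd loss. The logistic loss
    # l(z) = log(1 + exp(-z)) satisfies l(z) - l(-z) = -z, so for
    # y in {-1, +1} the empirical risk factors into a label-free even
    # part plus a term that touches the labels only through mu:
    #   R(w) = (1/n) * sum_i even(<w, x_i>) - (1/2) * <w, mu>,
    #   even(z) = (l(z) + l(-z)) / 2.
    def risk_grad(w):
        z = X @ w
        sigma = 1.0 / (1.0 + np.exp(-z))
        grad_even = X.T @ (sigma - 0.5) / n  # d/dz even(z) = sigma(z) - 1/2
        return grad_even - 0.5 * mu_hat

    # Any gradient-based solver applies; plain gradient descent for brevity.
    w = np.zeros(d)
    for _ in range(500):
        w -= 0.5 * risk_grad(w)

    accuracy = (np.sign(X @ w) == y).mean()
    print(f"accuracy against the clean labels: {accuracy:.3f}")

Because the labels enter the risk only through mu_hat, any weak-supervision setting that yields an estimate of the mean operator, such as label proportions over bags, plugs into the same second step unchanged.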

Keywords

machine learning, weakly supervised learning, sufficient statistics, learning theory, noisy label, deep learning

Type

Thesis (PhD)

DOI

10.25911/5d723bc2607e3
