
Weakly supervised learning via statistical sufficiency

Patrini, Giorgio

Description

The Thesis introduces a novel algorithmic framework for weakly supervised learning, namely, for any problem in between supervised and unsupervised learning from the labels' standpoint. Weak supervision is the reality in many applications of machine learning where training is performed with partially missing, aggregated-level and/or noisy labels. The approach is grounded in the concept of statistical sufficiency and its transposition to loss functions.

dc.contributor.author: Patrini, Giorgio
dc.date.accessioned: 2017-05-29T01:57:23Z
dc.date.available: 2017-05-29T01:57:23Z
dc.identifier.other: b43751945
dc.identifier.uri: http://hdl.handle.net/1885/117067
dc.description.abstract: The Thesis introduces a novel algorithmic framework for weakly supervised learning, namely, for any problem in between supervised and unsupervised learning from the labels' standpoint. Weak supervision is the reality in many applications of machine learning where training is performed with partially missing, aggregated-level and/or noisy labels. The approach is grounded in the concept of statistical sufficiency and its transposition to loss functions. Our solution is problem-agnostic yet constructive, as it boils down to a simple two-step procedure. First, estimate a sufficient statistic for the labels from weak supervision. Second, plug the estimate into a (newly defined) linear-odd loss function and learn the model with any gradient-based solver, with a simple adaptation. We apply the same approach to several challenging learning problems: (i) learning from label proportions, (ii) learning with noisy labels for both linear classifiers and deep neural networks, and (iii) learning from feature-wise distributed datasets where the entity-matching function is unknown.
dc.language.iso: en
dc.subject: machine learning
dc.subject: weakly supervised learning
dc.subject: sufficient statistics
dc.subject: learning theory
dc.subject: noisy label
dc.subject: deep learning
dc.title: Weakly supervised learning via statistical sufficiency
dc.type: Thesis (PhD)
local.contributor.supervisor: Nock, Richard
local.contributor.supervisorcontact: richard.nock@data61.csiro.au
dcterms.valid: 2017
local.description.notes: the author deposited 29/05/17
local.type.degree: Doctor of Philosophy (PhD)
dc.date.issued: 2016
local.contributor.affiliation: ANU College of Engineering & Computer Science, The Australian National University
local.identifier.doi: 10.25911/5d723bc2607e3
local.mintdoi: mint
Collections: Open Access Theses
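
The two-step procedure named in the abstract (first estimate a sufficient statistic for the labels, then plug it into a linear-odd loss minimized by a gradient solver) can be illustrated in the symmetric label-noise setting with a linear classifier and logistic loss. The sketch below is only indicative of the general idea: the synthetic data, the variable names, and the specific noise model are illustrative assumptions, not taken from the thesis itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linearly separable data with clean labels y in {-1, +1}.
m, d = 2000, 5
X = rng.normal(size=(m, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true)

# Weak supervision: each label is flipped independently with rate rho.
rho = 0.2
flip = rng.random(m) < rho
y_noisy = np.where(flip, -y, y)

# Step 1: estimate the sufficient statistic, here the "mean operator"
# mu = E[y x]. Under symmetric noise E[y_noisy] = (1 - 2*rho) * y, so
# rescaling the noisy empirical mean debiases the estimate.
mu_hat = (y_noisy[:, None] * X).mean(axis=0) / (1 - 2 * rho)

# Step 2: logistic loss is linear-odd, so the average loss splits into a
# label-free even part plus a linear term -theta @ mu / 2. The even part's
# gradient at z = X @ theta is sigmoid(z) - 1/2.
def grad(theta, lam=1e-3):
    z = np.clip(X @ theta, -30.0, 30.0)  # avoid overflow in exp
    sig = 1.0 / (1.0 + np.exp(-z))
    return X.T @ (sig - 0.5) / m - mu_hat / 2 + lam * theta

theta = np.zeros(d)
for _ in range(500):
    theta -= 1.0 * grad(theta)  # plain gradient descent, fixed step

# Evaluate against the clean labels the learner never saw.
acc = np.mean(np.sign(X @ theta) == y)
```

Note that the labels enter the objective only through `mu_hat`: once the sufficient statistic is estimated, optimization proceeds as if it were ordinary (regularized) logistic regression, which is what makes the recipe problem-agnostic across the weak-supervision settings listed above.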

Download

File: Patrini Thesis 2017.pdf (4.63 MB, Adobe PDF)


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.

Updated: 17 November 2022 / Responsible Officer: University Librarian / Page Contact: Library Systems & Web Coordinator