SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives

Authors

Defazio, Aaron
Bach, Francis
Lacoste-Julien, Simon

Abstract

In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
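To make the abstract's description concrete, here is a minimal, non-authoritative Python sketch of a SAGA-style update applied to an L1-regularised least-squares problem, where the regulariser is handled through its proximal operator (soft-thresholding) as the abstract describes for composite objectives. The example objective, the step size gamma, and all function and variable names are illustrative assumptions, not details taken from the paper.

import numpy as np

def saga_l1(X, y, lam, gamma, n_iter, seed=0):
    """Sketch of a SAGA-style method for the composite objective
        (1/n) * sum_i 0.5 * (x_i @ w - y_i)**2 + lam * ||w||_1,
    i.e. least squares with an L1 regulariser handled via its prox."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    # Table of the last gradient seen for each data point, plus its average.
    grad_table = np.zeros((n, d))
    grad_avg = np.zeros(d)
    for _ in range(n_iter):
        j = rng.integers(n)
        # Fresh gradient of the j-th loss at the current iterate.
        g_new = (X[j] @ w - y[j]) * X[j]
        # SAGA-style gradient estimate: unbiased, with variance that
        # shrinks as the stored gradients approach their values at the optimum.
        v = g_new - grad_table[j] + grad_avg
        # Gradient step, then the proximal operator of gamma * lam * ||.||_1
        # (soft-thresholding), which keeps the regulariser out of the gradient.
        w = w - gamma * v
        w = np.sign(w) * np.maximum(np.abs(w) - gamma * lam, 0.0)
        # Update the stored gradient and its running average in O(d) time.
        grad_avg += (g_new - grad_table[j]) / n
        grad_table[j] = g_new
    return w

# Illustrative usage on synthetic data (values are arbitrary):
#   X = np.random.randn(200, 10); y = X @ np.ones(10)
#   w_hat = saga_l1(X, y, lam=0.01, gamma=0.01, n_iter=5000)

Each step computes only one fresh component gradient and updates the stored average incrementally, so the per-iteration cost matches plain stochastic gradient descent while the variance-reduced estimate is what allows the fast linear convergence rates the abstract refers to.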

Source

Advances in Neural Information Processing Systems

Entity type

Publication
