
Accelerated training of conditional random fields with stochastic gradient methods

Vishwanathan, S. V. N.; Schraudolph, Nicol N.; Schmidt, Mark W.; Murphy, Kevin P.


We apply Stochastic Meta-Descent (SMD), a stochastic gradient optimization method with gain vector adaptation, to the training of Conditional Random Fields (CRFs). On several large data sets, the resulting optimizer converges to the same quality of solution over an order of magnitude faster than limited-memory BFGS, the leading method reported to date. We report results for both exact and inexact inference techniques.
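For context, the core of SMD is an element-wise gain (step-size) vector adapted online with the help of a Hessian-vector product, so that each parameter effectively receives its own learning rate. The sketch below is a minimal, generic illustration of that idea in Python/NumPy; it is not taken from the paper itself, and the gain floor of 1/2, the decay factor lam, the meta-learning rate mu, and the helper functions grad_fn and hess_vec_fn are assumptions made here purely for illustration.

import numpy as np

def smd_step(theta, eta, v, grad_fn, hess_vec_fn, mu=0.05, lam=0.9):
    """One Stochastic Meta-Descent step (sketch, not the paper's exact code).

    theta       : parameter vector
    eta         : per-parameter gain (step size) vector
    v           : auxiliary vector tracking d(theta)/d(log eta)
    grad_fn     : theta -> stochastic gradient of the loss
    hess_vec_fn : (theta, v) -> stochastic Hessian-vector product H v
    """
    g = grad_fn(theta)            # gradient at the current parameters
    Hv = hess_vec_fn(theta, v)    # curvature information along v

    # Grow gains where the current gradient agrees with recent movement
    # (g * v < 0), shrink them where it conflicts; floor at 1/2.
    eta = eta * np.maximum(0.5, 1.0 - mu * g * v)

    # Element-wise scaled gradient step.
    theta = theta - eta * g

    # Update the auxiliary vector with exponential decay lam.
    v = lam * v - eta * (g + lam * Hv)
    return theta, eta, v

# Toy usage on a convex quadratic 0.5 * theta' A theta - b' theta
# (purely illustrative; the paper applies SMD to CRF log-likelihoods).
A = np.diag([1.0, 2.0, 4.0])
b = np.array([1.0, 2.0, 3.0])
theta, eta, v = np.zeros(3), np.full(3, 0.05), np.zeros(3)
for _ in range(500):
    theta, eta, v = smd_step(theta, eta, v,
                             lambda th: A @ th - b,
                             lambda th, u: A @ u)
print(theta)                  # should approach the exact solution below
print(np.linalg.solve(A, b))  # [1.0, 1.0, 0.75]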

Collections: ANU Research Publications
Date published: 2006
Type: Conference paper
Source: Proceedings of the 23rd International Conference on Machine Learning


File                                              Size       Format
01_Vishwanathan_Accelerated_training_of_2006.pdf  112.75 kB  Adobe PDF
02_Vishwanathan_Accelerated_training_of_2006.pdf  249.57 kB  Adobe PDF
03_Vishwanathan_Accelerated_training_of_2006.pdf  1.58 MB    Adobe PDF

Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
