MPhil Thesis Defence


Title: "Stochastic Alternating Direction Method of Multipliers"

By

Mr. Shuai ZHENG


Abstract

The alternating direction method of multipliers (ADMM) is an efficient
optimization solver for a wide variety of machine learning models.
Recently, stochastic ADMM has been integrated with variance reduction
techniques for the stochastic gradient, leading to the SAG-ADMM and
SDCA-ADMM algorithms, which enjoy fast convergence rates and low
iteration complexities. However, their space requirements can still be
high, particularly on large multiclass, multilabel, or multitask
learning problems. In this thesis, I propose a novel integration of ADMM
with the stochastic variance reduced gradient (SVRG) method. It retains
the benefits of SAG-ADMM and SDCA-ADMM, but its storage requirement is
much lower and, in particular, independent of the sample size n.
Experimental results demonstrate that it is as fast as SAG-ADMM and
SDCA-ADMM, but can be used on much larger data sets.
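
For context, the following minimal Python sketch illustrates the SVRG
gradient estimator that the proposed method builds on. It uses an assumed
toy least-squares problem (data, loss, and step size are illustrative,
not the thesis's formulation). The point to note is the memory cost: only
the snapshot iterate and its full gradient are stored, i.e. O(d)
independent of the sample size n, whereas SAG-style methods typically
cache a gradient per sample.

    import numpy as np

    # Toy least-squares problem: min_x (1/n) sum_i 0.5*(a_i^T x - b_i)^2
    # (illustrative data; not from the thesis)
    rng = np.random.default_rng(0)
    n, d = 1000, 20
    A = rng.standard_normal((n, d))
    x_true = rng.standard_normal(d)
    b = A @ x_true + 0.01 * rng.standard_normal(n)

    def grad_i(x, i):
        # Gradient of the i-th component f_i(x) = 0.5*(a_i^T x - b_i)^2.
        return (A[i] @ x - b[i]) * A[i]

    x = np.zeros(d)
    eta = 0.005  # step size chosen conservatively for this toy problem
    for epoch in range(15):
        x_snap = x.copy()                 # snapshot iterate: O(d) memory
        mu = A.T @ (A @ x_snap - b) / n   # full gradient at snapshot: O(d) memory
        for _ in range(n):
            i = rng.integers(n)
            # SVRG variance-reduced stochastic gradient estimator
            g = grad_i(x, i) - grad_i(x_snap, i) + mu
            x -= eta * g

    print("distance to x_true:", np.linalg.norm(x - x_true))

In SVRG-based ADMM, an estimator of this form replaces the gradient in
the x-update of each ADMM iteration, which is what keeps the storage
independent of n.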


Date:			Wednesday, 10 June 2015

Time:			3:00pm - 5:00pm

Venue:			Room 4483
 			Lifts 25-26

Committee Members:	Prof. James Kwok (Supervisor)
 			Dr. Brian Mak (Chairperson)
 			Dr. Raymond Wong


**** ALL are Welcome ****