PhD Thesis Proposal Defence
Title: "Efficient Sparse Modeling with Structured Regularization"
by
Mr. Wenliang ZHONG
Abstract:
Modern data arising from various domains, such as audio, image, text and
microarray data, are often high-dimensional and contain spurious features
with various structures. In most cases, a simple model learned from the data
is preferable to a complicated one, since it often provides better
generalization performance together with an intuitive interpretation.
Besides the pure sparsity induced by the l1-norm, sophisticated
structured-sparsity-inducing regularizers are highly desirable when the
features have intrinsic structures. In this proposal, we address three
aspects of sparse modeling:
1. Hierarchical feature selection: Hierarchical and structural relationships
among features are often used to constrain the search for the more important
interactions. We propose the use of the alternating direction method of
multipliers (ADMM) and accelerated gradient methods. In particular, we show
that ADMM can either directly solve the problem or serve as a key building
block.
2. Automatic cluster discovery: Traditional multitask learning (MTL) is
limited to modeling task relationships at the task level, which may be
restrictive in some applications. We propose a novel MTL formulation that
captures task relationships at the feature level. Depending on the
interactions among tasks and features, the proposed method constructs
different task clusters for different features. Computationally, the
proposed formulation is convex, and can be efficiently solved by accelerated
gradient methods.
3. Nonconvex regularization: Nonconvex regularizers can outperform their
convex counterparts in many situations. However, the resulting nonconvex
optimization problems are often challenging. By using a recent mathematical
tool known as the proximal average, we propose a novel proximal gradient
descent method for optimization with a wide class of nonconvex and composite
regularizers. This simple strategy has guaranteed convergence and low
per-iteration complexity.
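To make item 1 concrete, here is a minimal, illustrative ADMM sketch for the
plain lasso, min_x 0.5*||Ax - b||^2 + lam*||x||_1. This is only a toy
instance of the ADMM building block mentioned above, not the thesis's
hierarchical formulation; the diagonal design matrix is an assumption chosen
so the x-update needs no linear solver:

```python
# Toy ADMM for the lasso:  min_x 0.5*||A x - b||^2 + lam*||x||_1.
# Assumption for brevity: A^T A is diagonal, so the x-update
# (A^T A + rho I) x = A^T b + rho (z - u) is elementwise.

def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def admm_lasso_diag(ata_diag, atb, lam, rho=1.0, iters=300):
    """ADMM for the lasso when A^T A is diagonal (entries in ata_diag)."""
    n = len(atb)
    x = [0.0] * n
    z = [0.0] * n
    u = [0.0] * n  # scaled dual variable
    for _ in range(iters):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u)
        x = [(atb[i] + rho * (z[i] - u[i])) / (ata_diag[i] + rho)
             for i in range(n)]
        # z-update: proximal step on the l1 term
        z = [soft_threshold(x[i] + u[i], lam / rho) for i in range(n)]
        # dual update
        u = [u[i] + x[i] - z[i] for i in range(n)]
    return z

# Example: A = diag(2, 1), b = (2, 3), lam = 1.
# Then A^T A = diag(4, 1), A^T b = (4, 3); the minimizer is (0.75, 2.0).
print(admm_lasso_diag([4.0, 1.0], [4.0, 3.0], lam=1.0))
```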
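Item 3 builds on proximal gradient descent. As a hedged sketch of that
underlying primitive only (plain ISTA with the l1 proximal step, not the
thesis's proximal-average method for nonconvex regularizers), with the step
size 1/L supplied by the caller:

```python
# Toy proximal gradient (ISTA) for  min_x 0.5*||A x - b||^2 + lam*||x||_1.
# Each iteration takes a gradient step on the smooth term, then applies the
# proximal operator of the l1 term (soft-thresholding).

def soft_threshold(v, t):
    return v - t if v > t else (v + t if v < -t else 0.0)

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def ista(A, b, lam, step, iters=500):
    """step should be 1/L, where L is the largest eigenvalue of A^T A."""
    m, n = len(A), len(A[0])
    At = [[A[i][j] for i in range(m)] for j in range(n)]  # transpose of A
    x = [0.0] * n
    for _ in range(iters):
        r = [ri - bi for ri, bi in zip(matvec(A, x), b)]  # residual A x - b
        g = matvec(At, r)                                 # gradient A^T (A x - b)
        # gradient step, then the l1 proximal step
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# Example: A = [[2, 0], [0, 1]], b = (2, 3), lam = 1.
# Here L = 4, so step = 0.25; the minimizer is (0.75, 2.0).
print(ista([[2.0, 0.0], [0.0, 1.0]], [2.0, 3.0], lam=1.0, step=0.25))
```

Note that the only place the regularizer enters is the proximal step; this is
the structure that lets the proximal-average idea swap in more complex
composite regularizers while keeping the per-iteration cost low.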
Experimental results on a number of synthetic and real-world data sets
demonstrate that the proposed algorithms are efficient and flexible. Moreover,
the use of the novel models consistently improves generalization performance
and parameter estimation.
Date: Wednesday, 11 June 2014
Time: 3:00pm - 5:00pm
Venue: Room 4472 (lifts 25/26)
Committee Members: Prof. James Kwok (Supervisor)
Dr. Brian Mak (Chairperson)
Dr. Raymond Wong
Prof. Dit-Yan Yeung
**** ALL are Welcome ****