PhD Thesis Proposal Defence
Title: "Sparsity in Deep Learning: From the Efficiency and Generalization
Perspective"
by
Mr. Xiao ZHOU
Abstract:
Sparsification is a natural way to improve the inference efficiency, training
efficiency, and generalization performance of neural networks. For inference
efficiency, one can deploy a small sparse model with far fewer parameters and
lower computational cost while preserving comparable or even better
generalization performance. For training efficiency, the model size stays
constrained throughout the whole training process, with sparsified forward and
backward propagation. For generalization performance, beyond the standard IID
(Independent and Identically Distributed) setting, where sparsity is already
effective, we also present a novel view of utilizing sparsity to boost
generalization in the OOD (Out-of-Distribution) setting. We can further
sparsify the dataset itself to speed up training and improve OOD performance.
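To make the inference-efficiency idea concrete, here is a minimal sketch of one common sparsification approach, magnitude-based weight pruning: the smallest-magnitude weights are zeroed so the model keeps only a small fraction of its parameters. This is an illustrative example only (the function name and the 90% sparsity level are assumptions for the sketch, not the specific methods of the thesis).

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    # Zero out the smallest-magnitude entries, keeping a
    # (1 - sparsity) fraction of the weights nonzero.
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of entries to zero
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value acts as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Toy weight matrix standing in for one layer of a network
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
w_sparse = magnitude_prune(w, sparsity=0.9)
print("fraction zeroed:", np.mean(w_sparse == 0))
```

In practice the surviving weights are stored in a sparse format, which is where the memory and compute savings at inference time come from.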
Date: Monday, 16 October 2023
Time: 9:00am - 11:00am
Venue: Room 5510 (lifts 25/26)
Committee Members: Prof. Tong Zhang (Supervisor)
Prof. Ke Yi (Chairperson)
Dr. Qifeng Chen
Prof. Xiaofang Zhou
**** ALL are Welcome ****