A Survey of Sparse Neural Networks: From the Efficiency and Generalization Perspective
PhD Qualifying Examination

Title: "A Survey of Sparse Neural Networks: From the Efficiency and Generalization Perspective"

by

Mr. Xiao ZHOU

Abstract:

Sparsification is a natural idea for boosting the inference efficiency, training efficiency, and generalization performance of neural networks. For inference efficiency, one deploys a small sparse model with far fewer parameters and much less computational time while preserving comparable or even better generalization performance. For training efficiency, one trains a small sparse model whose size is constrained throughout the entire training process, with sparsified forward and backward propagation. For generalization performance, beyond the well-studied IID (Independently and Identically Distributed) setting, we also give a novel view of utilizing sparsity to boost generalization in the OOD (Out-of-Distribution) setting. The varieties of sparsification are numerous, and we deliberate on them to present a comprehensive view, including weight-level sparsification, channel-level sparsification, activation sparsification, and more. Approaches to training efficiency are likewise numerous; we summarize them comprehensively, covering weight-level and channel-level sparsification, and discuss the limitations of previous work. Since there has been little discussion of utilizing sparsification to boost generalization performance, we offer a novel view of this direction. We also present our own contributions: in variational weight-level sparsification for inference efficiency, addressing the gradient vanishing problem and the discrepancy between training and testing performance; in channel-level sparsification for training efficiency, proposing the first effective solution that makes sparse channel-level backward propagation practical; and in OOD generalization, greatly boosting testing performance, especially in the IRM (Invariant Risk Minimization) setting.
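For readers unfamiliar with the first topic above, weight-level sparsification can be illustrated with a minimal magnitude-pruning sketch in NumPy; this is a generic illustration chosen for simplicity, not the variational method the talk contributes, and the function name `magnitude_prune` is ours:

```python
# Minimal sketch of weight-level sparsification via global magnitude
# pruning: zero out the smallest-magnitude entries of a weight matrix.
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of entries with smallest |w|."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)       # number of entries to prune
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold  # keep only larger-magnitude weights
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
w_sparse = magnitude_prune(w, 0.75)
print(float((w_sparse == 0).mean()))   # fraction of zeroed entries, here 0.75
```

A mask like this keeps parameter counts (and, with sparse kernels, compute) low at inference time; the training-efficiency methods surveyed in the talk additionally keep the backward pass sparse.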
Date: Tuesday, 28 December 2021
Time: 9:00am - 11:00am
Zoom Meeting: https://hkust.zoom.us/j/94308389180?pwd=VTJWd2lzU1lBNzM1M3lDc0ljWjhrQT09

Committee Members:
Prof. Tong Zhang (Supervisor)
Dr. Qifeng Chen (Chairperson)
Prof. Raymond Wong
Prof. Ke Yi
Dr. Zhenguo Li (Huawei Noah's Ark Lab)

**** ALL are Welcome ****