Statistical Methods in Machine Learning Theory

Speaker: Dr. Zihan ZHANG
Assistant Professor
Department of Computer Science and Engineering
The Hong Kong University of Science and Technology

Title: Statistical Methods in Machine Learning Theory

Date: Monday, 17 November 2025

Time: 4:00pm - 5:00pm

Venue: Lecture Theater F
(Leung Yat Sing Lecture Theater), near lift 25/26, HKUST

Abstract:

The success of machine learning hinges on a model's ability to generalize from finite data to unseen examples. This talk will unpack the statistical foundations that make this possible. We will explore how core concepts such as uniform convergence, measures of model complexity (e.g., VC dimension and Rademacher complexity), and concentration inequalities provide rigorous performance guarantees for learning algorithms. We will then examine two compelling case studies: multidistribution learning, where these principles ensure robustness across diverse data environments, and reinforcement learning, where they underpin guarantees for policy optimization and value estimation. By understanding these statistical foundations, we can better design, analyze, and trust the machine learning systems that are shaping our world.
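To give a flavor of the concentration inequalities mentioned above, the following is a minimal illustrative Python sketch (not taken from the talk) that checks Hoeffding's inequality empirically: for n i.i.d. samples bounded in [0, 1], the sample mean deviates from the true mean by at least t with probability at most 2·exp(−2·n·t²).

```python
import math
import random

def hoeffding_bound(n, t):
    """Hoeffding upper bound on P(|mean_hat - mean| >= t)
    for n i.i.d. samples taking values in [0, 1]."""
    return 2 * math.exp(-2 * n * t * t)

def empirical_deviation_rate(n, t, trials=2000, seed=0):
    """Fraction of trials in which the sample mean of n Uniform(0, 1)
    draws deviates from the true mean 0.5 by at least t."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        mean_hat = sum(rng.random() for _ in range(n)) / n
        if abs(mean_hat - 0.5) >= t:
            bad += 1
    return bad / trials

# With n = 100 samples and tolerance t = 0.1, the bound is
# 2 * exp(-2) ~ 0.27, while the observed deviation rate is far smaller.
n, t = 100, 0.1
print(empirical_deviation_rate(n, t) <= hoeffding_bound(n, t))
```

The function names here are hypothetical, chosen for illustration; the point is that such finite-sample tail bounds are the building blocks behind the generalization guarantees discussed in the talk.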


Biography:

Dr. Zihan Zhang is an Assistant Professor in the Department of Computer Science and Engineering (CSE) at The Hong Kong University of Science and Technology (HKUST). Previously, he was a postdoctoral researcher at the Paul G. Allen School of Computer Science & Engineering, University of Washington, and the Department of Electrical and Computer Engineering, Princeton University. He obtained his Ph.D. and bachelor's degrees from the Department of Automation, Tsinghua University, in 2022 and 2017, respectively.

His research focuses mainly on machine learning theory, including reinforcement learning, online learning, and game theory. He has published in top journals and conferences such as JACM, COLT, NeurIPS, ICML, and ICLR.