COMP 5212: Machine Learning [Fall 2022]

Monday, Wednesday 12:00-13:20 @ Room 2503




In this course, we will cover classical and advanced algorithms in machine learning. Topics include: linear models (linear/logistic regression, support vector machines), non-linear models (tree-based methods, kernel methods, neural networks), and learning theory (hypothesis spaces, the bias-variance tradeoff, VC dimension). The course will also discuss advanced topics in machine learning such as testing-time integrity in trustworthy machine learning and neural architecture search in AutoML.


Prerequisites

Basic knowledge of numerical linear algebra, probability, and calculus.

Grading Policy

Late submission policy:

Late submissions are accepted up to 2 days after the due date, with a 10% penalty (of the item's total grade) per day.
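As a worked example of the policy above, a submission scored 85/100 and turned in 1 day late would receive 85 − 10 = 75. A minimal sketch of this calculation (the function name and the zero-credit rule after 2 days are assumptions for illustration, not official policy):

```python
def late_score(raw_score: float, max_score: float, days_late: int) -> float:
    """Apply the late policy: 10% of the item's total grade deducted per late day.

    Assumption: submissions more than 2 days late receive no credit,
    since they are not accepted under the policy.
    """
    if days_late <= 0:
        return raw_score
    if days_late > 2:
        return 0.0
    penalty = 0.10 * max_score * days_late
    return max(raw_score - penalty, 0.0)


# Example: 85/100 submitted 1 day late -> 75.0; 2 days late -> 65.0
print(late_score(85, 100, 1))
print(late_score(85, 100, 2))
```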

Term projects

Students will work on an open-topic research project in groups of at most 4 members (<=4). Feel free to discuss your topic choice with me offline.

Tentative Schedule and Material

Date Topic Slides Readings & Links Assignments
Mon 5/9 Overview of Machine Learning lecture_0    
Wed 7/9 Math Basics lecture_1 Matrix Calculus: Derivation and Simple Application (Hu Pili); DL Chapter 2.1, 2.2 & 2.3  
Wed 14/9 Linear models lecture_2    
Mon 19/9 Optimization lecture_3 Convex Optimization (Boyd and Vandenberghe) Chapter 3.1; Numerical Optimization (Nocedal and Wright) Chapter 3.1  
Wed 21/9 Stochastic gradient descent and its variants lecture_4   Written_HW1 out
Mon 26/9 Support Vector Machine, Polynomial Nonlinear Mapping, Kernel Method lecture_5 Stanford CS 229 notes  
Wed 28/9 Learning theory lecture_6   Programming_HW1 out
Mon 3/10 Uniform convergence, growth function lecture_7 Symmetrization  
Wed 5/10 VC Dimension lecture_8    
Mon 10/10 Regularization lecture_9    
Wed 12/10 Clustering lecture_10    
Mon 17/10 Tree-based methods lecture_11 XGBoost  
Wed 19/10 Neural networks lecture_12   Written_HW2 out
Mon 24/10 Neural networks for computer vision lecture_13    
Wed 26/10 Dropout, Batch Norm, ResNet, Neural networks for NLP: basic lecture_14   Programming_HW2 out
Mon 31/10 Neural networks for NLP: Model lecture_15    
Wed 2/11 Transformer & Unsupervised Pretraining for NLP lecture_16    
Mon 7/11 Vision Transformer lecture_17    
Wed 9/11 Semi-supervised learning, graph convolutional networks lecture_18   HW3 out
Mon 14/11 AutoML (Neural architecture search) lecture_19    
Wed 16/11 Recent progress in Neural architecture search lecture_20    
Mon 21/11 Limitations of deep learning: Testing-time integrity lecture_21    
Wed 23/11 Limitations of deep learning: Training-time integrity & Review lecture_22    
Mon 28/11 Final project presentation-part 1      
Wed 30/11 Final project presentation-part 2      


There is no required textbook for this course. Some recommended readings are: