COMP 5212: Machine Learning [Fall 2023]

Wednesday, Friday 16:30-17:50 @ Room 6591 (Lift 31-32)

Overview

Announcements

[8 Sep 2023] Today’s lecture will be held online over Zoom due to the heavy rain.
[1 Sep 2023] Welcome to COMP5212!

Description

In this course, we will cover some classical and advanced algorithms in machine learning. Topics include: linear models (linear/logistic regression, support vector machines), non-linear models (tree-based methods, kernel methods, neural networks), and learning theory (hypothesis spaces, bias/variance tradeoffs, VC dimension). The course will also discuss advanced topics in machine learning, such as test-time integrity in trustworthy machine learning and neural architecture search in AutoML.

Prerequisites

Basic knowledge in numerical linear algebra, probability, and calculus.

Grading Policy

Late submission policy:

Late submissions are accepted up to 2 days after the due date, with a 10% penalty (of the item's total grade) per day.
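As a worked example of the policy above, here is a minimal sketch (the function name and signature are hypothetical, not part of the course materials), assuming the penalty is 10% of the item's total grade per day late and that nothing is accepted after 2 days:

```python
def late_score(raw_score: float, total: float, days_late: int) -> float:
    """Hypothetical helper illustrating the late policy:
    10% of the item's total grade deducted per day late,
    submissions not accepted more than 2 days after the due date."""
    if days_late <= 0:
        return raw_score          # on time: no penalty
    if days_late > 2:
        return 0.0                # too late: not accepted
    penalty = 0.10 * total * days_late
    return max(raw_score - penalty, 0.0)

# e.g. an 85/100 homework submitted 1 day late scores 75.0
```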

Term projects

Students will work in groups on an open-topic research project. Each group may have at most 4 members (<=4). Feel free to discuss your topic choice with me offline.

Tentative Schedule and Material

Date Topic Slides Readings&links Assignments
Wed 6/9 Overview of Machine Learning lecture_0    
Fri 8/9 Math Basics lecture_1 Matrix Calculus: Derivation and Simple Application (HU Pili), DL Chapter 2.1, 2.2 & 2.3  
Wed 13/9 Linear models lecture_2    
Fri 15/9 Optimization lecture_3 Convex Optimization Boyd and Vandenberghe Chapter 3.1, Numerical Optimization Nocedal and Wright Chapter 3.1  
Wed 20/9 Stochastic gradient descent and its variants lecture_4   Written_hw1 out
Fri 22/9 Support Vector Machine, Polynomial nonlinear mapping, Kernel method lecture_5 Stanford CS 229 notes  
Wed 27/9 Polynomial nonlinear mapping, Kernel method lecture_6 Stanford CS 229 notes  
Fri 29/9 Learning theory lecture_7 Symmetrization  
Wed 4/10 Uniform convergence, growth function lecture_8 Bias/variance tradeoff Programming_HW1 out
Fri 6/10 VC Dimension lecture_9    
Wed 11/10 Regularization lecture_10    
Fri 13/10 Tree-based methods lecture_11 Xgboost  
Wed 18/10 Neural networks lecture_12    
Fri 20/10 Neural networks for computer vision, Dropout, Batch Norm, ResNet lecture_13   Written_hw2 out
Wed 25/10 Word embedding, RNN, LSTM lecture_14    
Fri 27/10 Transformer lecture_15    
Wed 1/11 NLP Pretraining, prompt lecture_16    
Fri 3/11 Clustering lecture_17   Programming_HW2 out
Wed 8/11 Limitations of deep learning: adversarial machine learning lecture_18    
Fri 10/11 Semi-supervised learning, graph convolution network lecture_19 Graph Laplacians  
Wed 15/11 Reinforcement Learning lecture_20 David Silver’s lecture  
Fri 17/11 AutoML (Neural architecture search) lecture_21    
Wed 22/11 Review lecture_22    
Fri 24/11 Final project presentation-part 1      
Wed 29/11 Final project presentation-part 2      

References

There is no required textbook for this course. Some recommended readings are: