Multivariate Time Series Analysis with Deep Learning

PhD Thesis Proposal Defence


Title: "Multivariate Time Series Analysis with Deep Learning"

by

Mr. Shuhan ZHONG


Abstract:

Multivariate time series (MTS) serves as a fundamental data modality across
numerous scientific domains and real-world applications, such as meteorology,
transportation, healthcare, and finance. Based on sampling regularity, MTS
can be categorized into regular (RMTS) and irregular (IMTS) types. MTS
analysis tasks, such as forecasting and classification, require deep
learning models to account for the characteristics of the task and to
capture the complex temporal and inter-channel dependencies in the data. For
IMTS, handling the sampling irregularity presents an additional challenge.
This thesis aims to design deep learning models for MTS analysis that
achieve both effectiveness (e.g., low forecasting error and high
classification accuracy) and efficiency (e.g., compact model size and fast
running time).

For RMTS analysis, the primary challenge lies in modeling the complex
composition and multi-scale variations of the data. To address this
challenge, we propose MSD-Mixer, a multi-scale decomposition MLP-Mixer that
learns to explicitly decompose the input RMTS into multi-scale temporal
patches for modeling. MSD-Mixer achieves new state-of-the-art performance
across multiple tasks, including forecasting, classification, imputation,
and anomaly detection.
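
As a rough illustration of the multi-scale patch decomposition idea, the
following minimal sketch (PyTorch assumed; names such as ScaleBlock and
MSDMixerSketch are hypothetical and do not reflect the actual MSD-Mixer
implementation) splits a regular series into patches at several temporal
scales, mixes each scale with MLPs, and passes the residual on to the next
scale:

    import torch
    import torch.nn as nn

    def mlp(d_in, d_out, hidden=64):
        return nn.Sequential(nn.Linear(d_in, hidden), nn.GELU(),
                             nn.Linear(hidden, d_out))

    class ScaleBlock(nn.Module):
        # length must be divisible by patch_size in this simplified sketch
        def __init__(self, patch_size, length):
            super().__init__()
            self.p = patch_size
            self.intra = mlp(patch_size, patch_size)      # mix within a patch
            self.inter = mlp(length // patch_size,
                             length // patch_size)        # mix across patches

        def forward(self, x):                  # x: (batch, channels, length)
            b, c, l = x.shape
            z = x.reshape(b, c, l // self.p, self.p)
            z = z + self.intra(z)              # intra-patch mixing
            z = z.transpose(-1, -2)
            z = z + self.inter(z)              # inter-patch mixing
            comp = z.transpose(-1, -2).reshape(b, c, l)
            return comp, x - comp              # this scale's component + residual

    class MSDMixerSketch(nn.Module):
        def __init__(self, length, patch_sizes=(24, 12, 4)):
            super().__init__()
            self.blocks = nn.ModuleList(ScaleBlock(p, length)
                                        for p in patch_sizes)

        def forward(self, x):
            comps, res = [], x
            for blk in self.blocks:            # coarse-to-fine decomposition
                comp, res = blk(res)
                comps.append(comp)
            return torch.stack(comps).sum(0)   # recombine the components

    out = MSDMixerSketch(length=96)(torch.randn(8, 7, 96))   # (8, 7, 96)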

For IMTS classification, the key issue is channel-wise asynchrony, which
hampers channel-wise modeling. Our proposed model, MTM, a multi-scale
token-mixing Transformer, addresses this with a dual strategy: multi-scale
down-sampling combined with a proactive channel-wise token mixing mechanism,
which together enable effective channel-wise modeling despite the asynchrony.
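
To make the asynchrony problem and the token mixing idea concrete, here is a
minimal, hypothetical sketch (numpy only; names such as mix_channel_tokens
are illustrative and not the actual MTM code) in which each channel's tokens
proactively pull in the temporally nearest observation of every other
channel, at both the original and a down-sampled scale:

    import numpy as np

    def down_sample(times, values, factor):
        """Coarsen one channel by averaging every `factor` consecutive samples."""
        n = (len(times) // factor) * factor
        t = times[:n].reshape(-1, factor).mean(axis=1)
        v = values[:n].reshape(-1, factor).mean(axis=1)
        return t, v

    def mix_channel_tokens(channels, window):
        # channels: list of (timestamps, values) pairs, one per channel,
        # each sampled at its own, unaligned times
        tokens = []
        for ci, (t_ref, v_ref) in enumerate(channels):
            for t0, v0 in zip(t_ref, v_ref):
                token = [v0]
                for cj, (t_other, v_other) in enumerate(channels):
                    if cj == ci:
                        continue
                    k = np.argmin(np.abs(t_other - t0))   # nearest observation
                    token.append(v_other[k]
                                 if abs(t_other[k] - t0) <= window else np.nan)
                tokens.append((ci, t0, np.array(token)))
        return tokens

    # Two asynchronously sampled channels, mixed at two scales.
    ch_a = (np.array([0.0, 1.3, 2.9, 4.1]), np.array([1.0, 1.2, 0.9, 1.1]))
    ch_b = (np.array([0.5, 2.0, 3.6]), np.array([5.0, 5.5, 4.8]))
    fine = mix_channel_tokens([ch_a, ch_b], window=1.0)
    coarse = mix_channel_tokens([down_sample(*ch_a, 2), down_sample(*ch_b, 2)],
                                window=2.0)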

For IMTS forecasting, we observe that existing methods lack efficiency,
whereas the vanilla Mamba model, while highly efficient, fails to adapt to
the irregular sampling of IMTS. To bridge this gap, we propose AMI, an
adaptive Mamba model that enhances the vanilla Mamba with context-aware
selectivity and query prompting, achieving superior forecasting performance
while retaining high computational efficiency.
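
For illustration only, the following minimal sketch (PyTorch; a simplified
selective state-space recurrence, not the actual AMI architecture) shows one
plausible way such a recurrence can be made aware of irregular sampling: the
input-dependent discretization step is additionally scaled by the time gap
between consecutive observations, so the hidden state decays more over
longer gaps:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TimeAwareSSM(nn.Module):
        def __init__(self, d_model, d_state=16):
            super().__init__()
            self.A = nn.Parameter(-torch.rand(d_model, d_state))  # stable dynamics
            self.B = nn.Linear(d_model, d_state)
            self.C = nn.Linear(d_model, d_state)
            self.delta_proj = nn.Linear(d_model, d_model)          # input-dependent step

        def forward(self, x, timestamps):
            # x: (batch, length, d_model); timestamps: (batch, length)
            b, l, d = x.shape
            gaps = torch.diff(timestamps, prepend=timestamps[:, :1], dim=1)
            h = torch.zeros(b, d, self.A.shape[1], device=x.device)
            outs = []
            for t in range(l):                                     # sequential scan
                u = x[:, t]                                        # (b, d)
                delta = F.softplus(self.delta_proj(u))
                delta = delta * (1.0 + gaps[:, t:t + 1])           # widen step over long gaps
                dA = torch.exp(delta.unsqueeze(-1) * self.A)       # (b, d, d_state)
                dB = delta.unsqueeze(-1) * self.B(u).unsqueeze(1)  # (b, d, d_state)
                h = dA * h + dB * u.unsqueeze(-1)                  # state update
                y = (h * self.C(u).unsqueeze(1)).sum(-1)           # readout, (b, d)
                outs.append(y)
            return torch.stack(outs, dim=1)                        # (b, l, d_model)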


Date:                   Thursday, 27 November 2025

Time:                   3:00pm - 5:00pm

Venue:                  Room 3494
                        Lift 25/26

Committee Members:      Prof. Gary Chan (Supervisor)
                        Prof. Xiaofang Zhou (Chairperson)
                        Prof. Andrew Horner