Learning in Flux: Continual and Efficient Representation Learning on Dynamic Graphs

The Hong Kong University of Science and Technology
Department of Computer Science and Engineering


PhD Thesis Defence


Title: "Learning in Flux: Continual and Efficient Representation Learning on 
Dynamic Graphs"

By

Mr. Peiyan ZHANG


Abstract:

The ability to learn and adapt within continuous data streams is a hallmark 
of true intelligence, yet it remains a grand challenge for modern artificial 
intelligence. Dynamic graphs offer a natural representation for such evolving 
systems, but prevailing graph learning models often fail to capture the 
temporal fidelity required for genuine real-time adaptation. This creates a 
fundamental gap between existing analytical capabilities and the fluid 
reality of the environments we aim to model. This thesis confronts the 
challenge by proposing and validating a new methodology centered on the 
Dual-Fidelity framework. We posit that building genuinely 'living' models 
requires the simultaneous pursuit of fidelity along two orthogonal dimensions. 
The first is Data-Level Representation, where time should be treated as an 
intrinsic driver of state evolution rather than a mere external feature. The 
second is the Process-Level Paradigm, which requires proactive, continual 
learning mechanisms that explicitly combat catastrophic forgetting.

To materialize this framework, this thesis presents a sequence of technical 
contributions addressing each dimension. For the data-level dimension, we 
develop Graph Nested GRU-ODE (GNG-ODE), an architecture integrating Neural 
Ordinary Differential Equations into dynamic graph-based recommendation. It 
models user and item states as continuous trajectories governed by the 
precise time elapsed between events. For the process-level dimension, we 
design the Parameter-Isolation Graph Neural Network (PI-GNN). It introduces a 
mechanism that isolates stable parameters, which preserve previously acquired 
knowledge, from active parameters that adapt incrementally to new data, 
thereby mitigating catastrophic forgetting.
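
To make the data-level idea concrete, the following minimal sketch is an 
illustration only, not the thesis implementation: the GRUODECell and 
evolve_state names are hypothetical, and a fixed-step Euler loop stands in 
for a proper ODE solver. It shows how a node state can be evolved as a 
continuous trajectory over the exact time elapsed between two interaction 
events, in the spirit of GNG-ODE.

    import torch
    import torch.nn as nn

    class GRUODECell(nn.Module):
        """Continuous-time GRU-style cell: dh/dt = (1 - z) * (g - h)."""
        def __init__(self, dim):
            super().__init__()
            self.lin_z = nn.Linear(dim, dim)
            self.lin_g = nn.Linear(dim, dim)

        def forward(self, h):
            z = torch.sigmoid(self.lin_z(h))   # update gate
            g = torch.tanh(self.lin_g(h))      # candidate state
            return (1.0 - z) * (g - h)         # time derivative of the hidden state

    def evolve_state(cell, h, delta_t, n_steps=10):
        """Integrate the state ODE over the real elapsed time with Euler steps."""
        dt = delta_t / n_steps
        for _ in range(n_steps):
            h = h + dt * cell(h)
        return h

    # Hypothetical usage: a user's state drifts for 3.5 hours between two events.
    cell = GRUODECell(dim=32)
    h = torch.zeros(1, 32)
    h = evolve_state(cell, h, delta_t=3.5)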
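
The process-level idea of parameter isolation can be sketched in a similarly 
simplified way. This is an assumption-laden illustration rather than the 
actual PI-GNN procedure; IsolatedLinear and consolidate are hypothetical 
names. The stable part is frozen to retain past knowledge while the active 
part is trained on newly arriving graph data.

    import torch
    import torch.nn as nn

    class IsolatedLinear(nn.Module):
        """A layer split into a frozen 'stable' part and a trainable 'active' part."""
        def __init__(self, dim):
            super().__init__()
            self.stable = nn.Linear(dim, dim)   # consolidated knowledge, never updated
            self.active = nn.Linear(dim, dim)   # adapts to newly arriving graph data
            for p in self.stable.parameters():
                p.requires_grad = False         # isolate the stable parameters

        def forward(self, x):
            return self.stable(x) + self.active(x)

    def consolidate(layer):
        """After a learning stage, fold the active part into the stable part."""
        with torch.no_grad():
            layer.stable.weight += layer.active.weight
            layer.stable.bias += layer.active.bias
            nn.init.zeros_(layer.active.weight)
            nn.init.zeros_(layer.active.bias)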

Finally, to ensure this high-fidelity methodology is not merely theoretical 
but practical for large-scale industrial settings, the thesis culminates in 
Graph Prompt Tuning for Recommender Systems (GPT4Rec). This paradigm resolves 
the critical tension between adaptation and efficiency by decoupling a large, 
frozen knowledge base from a small set of trainable prompts. It demonstrates 
that pursuing high fidelity is not only effective but also computationally 
feasible in demanding real-world environments.
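
As a rough illustration of this decoupling (a minimal sketch assuming a 
PyTorch-style encoder; PromptedRecommender and its parameters are 
hypothetical names, not the GPT4Rec code), only a handful of prompt vectors 
receive gradients while the large backbone stays frozen:

    import torch
    import torch.nn as nn

    class PromptedRecommender(nn.Module):
        """A frozen backbone encoder steered by a small set of trainable prompts."""
        def __init__(self, backbone, n_prompts=4, dim=64):
            super().__init__()
            self.backbone = backbone
            for p in self.backbone.parameters():
                p.requires_grad = False          # the large knowledge base stays fixed
            self.prompts = nn.Parameter(0.02 * torch.randn(n_prompts, dim))

        def forward(self, node_feats):
            # Prepend the prompt vectors to the node features before encoding.
            x = torch.cat([self.prompts, node_feats], dim=0)
            return self.backbone(x)

    # Only the prompts receive gradients, so adapting to new data is cheap.
    backbone = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 64))
    model = PromptedRecommender(backbone, n_prompts=4, dim=64)
    trainable = [p for p in model.parameters() if p.requires_grad]
    optimizer = torch.optim.Adam(trainable, lr=1e-3)

Because gradients flow only through the prompts, adapting to a new graph 
snapshot touches a tiny fraction of the parameters, which is the source of 
the efficiency described above.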

Taken together, this work provides a new analytical framework for dynamic 
graph learning, a validated technical roadmap for achieving higher fidelity 
in both data representation and learning processes, and a scalable solution 
for practical deployment. The research contributes to the development of AI 
systems capable of real-time reasoning and adaptation in evolving worlds.


Date:                   Friday, 19 September 2025

Time:                   3:00pm - 5:00pm

Venue:                  Room 5501
                        Lifts 25/26

Chairman:               Prof. Daniel PEREZ PALOMAR (ECE)

Committee Members:      Dr. Yangqiu SONG (Supervisor)
                        Dr. Shuai WANG
                        Dr. Dan XU
                        Prof. Jun ZHANG (ECE)
                        Dr. Xiangyu ZHAO (CityU)