Safe, Scalable Transfer with Dynamic Experts: Mitigating Negative Transfer
PhD Thesis Proposal Defence
Title: "Safe, Scalable Transfer with Dynamic Experts: Mitigating Negative Transfer"
by
Mr. Zhili LIU
Abstract:
Large-scale pre-training has become the dominant paradigm in modern machine
learning, enabling foundation models to transfer knowledge across diverse
downstream tasks. However, as training data becomes increasingly heterogeneous
and deployment contexts more open-ended, ensuring reliable and safe knowledge
transfer remains a fundamental challenge. This proposal studies these issues
under a unified perspective of negative transfer, where transferred knowledge
may either degrade task-specific performance or activate harmful latent
concepts during generation. We conceptualize effectiveness failures and safety
risks as two manifestations of unintended knowledge activation within shared
representations. To address these challenges, we develop scalable mitigation
strategies across multiple levels, including post-hoc suppression of harmful
concepts in generative models, routing-based mechanisms that reduce semantic
interference in representation learning, and expert-based specialization
frameworks that enable more structured knowledge allocation. Building on these
foundations, ongoing work further explores integrating reasoning structures
with mixture-of-experts architectures to enhance self-alignment in large
language models. Together, these efforts aim to establish a principled and
scalable framework for safe knowledge transfer in large-scale pre-training.
Date: Thursday, 12 March 2026
Time: 3:00pm - 5:00pm
Venue: Room 3494 (Lifts 25/26)
Committee Members: Prof. James Kwok (Supervisor)
Dr. Dan Xu (Chairperson)
Dr. Long Chen