PhD Thesis Proposal Defence
Title: "Causal-Driven Interpretation and Inspection of Deep Learning"
by
Mr. Zhenlan JI
Abstract:
Deep learning systems have achieved remarkable success across various
domains, yet their black-box nature and inherent complexity present
significant challenges to interpretability, reliability, and software
quality assurance. To address these challenges, this thesis leverages
causality, a canonical analysis framework for understanding complex systems,
to enhance the interpretability and inspection of deep learning systems.
This thesis establishes causality-based methodologies that offer a
principled and scalable foundation for the interpretation and analysis of
deep learning systems across various aspects and granularity levels, from
neuron-level interactions to system-level evaluations.
Specifically, this thesis makes three key contributions. First, it
introduces Causal Coverage (CC), the first causality-aware test coverage
criterion for deep neural networks, which formalizes neuron interactions and
quantifies test adequacy based on uncovered causal dependencies. Second, it
proposes a causal analysis framework for fairness trade-offs in machine
learning pipelines, enabling systematic identification and quantification of
trade-offs among fairness, robustness, and performance metrics.
Finally, it develops a causal evaluation approach for large language model
(LLM)-based code generation, constructing causal graphs over prompt and code
features to explain and improve LLM outputs through counterfactual analysis.
Date: Friday, 4 July 2025
Time: 2:00pm - 4:00pm
Venue: Room 5501
Lifts 25/26
Committee Members: Dr. Shuai Wang (Supervisor)
Prof. Nevin Zhang (Chairperson)
Dr. Dan Xu