Discrete Diffusion Language Models: Standalone Alternatives and Complementary Drafts for Autoregressive Decoding

PhD Qualifying Examination


Title: "Discrete Diffusion Language Models: Standalone Alternatives and
Complementary Drafts for Autoregressive Decoding"

by

Mr. Guangxin HE


Abstract:

Discrete diffusion language models (DLLMs) have emerged as a compelling
alternative to the dominant autoregressive (AR) paradigm for text
generation. By iteratively denoising a fully masked sequence with
bidirectional attention, DLLMs enable parallel token generation, rich
contextual conditioning, and flexible forms of conditional generation and
editing. This survey reviews recent advances in DLLMs, with a particular
focus on the technical challenges that shape their scalability, efficiency,
and use alongside autoregressive models. We organize the discussion along
three complementary dimensions. First, we survey long-context scaling
techniques that extend DLLM context windows to 128K tokens and beyond,
enabling applications such as long-document understanding. Second, we examine
inference acceleration methods, including KV-cache management, sparse
attention, and quantization, which address the quadratic attention cost
that longer contexts exacerbate. Third, we discuss how DLLMs can collaborate with AR
models through speculative decoding, where DLLMs act as parallel draft models
to accelerate AR inference. Throughout, we focus exclusively on discrete
diffusion approaches, which have enabled many of the strongest recent
practical results in this area. We highlight key findings, analyze
trade-offs across different approaches, and identify open challenges and
future directions for making DLLMs more practical both as standalone models
and as complementary components alongside AR models.


Date:                   Tuesday, 28 April 2026

Time:                   3:00pm - 5:00pm

Venue:                  Room 2131B
                        Lift 22

Committee Members:      Dr. Binhang Yuan (Supervisor)
                        Dr. Chaojian Li (Chairperson)
                        Dr. Qifeng Chen