Towards Resource-Efficient Natural Language Processing

Speaker: Dr. Junxian HE
         Shanghai Jiao Tong University

Title:  "Towards Resource-Efficient Natural Language Processing"

Date:   Thursday, 2 March 2023

Time:   10:00am - 11:00am HKT

Zoom Link:
https://hkust.zoom.us/j/465698645?pwd=aVRaNWs2RHNFcXpnWGlkR05wTTk3UT09

Meeting ID: 465-698-645
Passcode: 20222023


Abstract:

As large-scale pretraining becomes the de facto standard in NLP, enormous
amounts of training data and model parameters consistently lead to
state-of-the-art performance on various NLP tasks. While quite successful,
current NLP approaches often consume large quantities of (scarce) resources
such as data labels, hardware, and time, which prohibits their use in
broader, practical settings. In this talk, I will cover our efforts towards
resource-efficient NLP. Specifically, I will discuss (1) a structured
latent-variable model for unsupervised language analysis; (2) a unified
framework for parameter-efficient tuning; and (3) an efficient variant of
nearest-neighbor language models. In the last part, I will briefly
introduce our recent work on adapting large language models and a vision
of several future directions.


******************
Biography:

Junxian He is a tenure-track assistant professor at the John Hopcroft
Center for Computer Science at Shanghai Jiao Tong University. He obtained
his PhD in natural language processing from the Language Technologies
Institute at Carnegie Mellon University in the summer of 2022. Before
that, he received his bachelor's degree from Shanghai Jiao Tong University
in 2017. His research focuses on deep generative models, resource-efficient
methods, and the adaptation of large language models. He has served as an
area chair for ACL and EMNLP. His work has been recognized with a Baidu
PhD Fellowship.