PhD Qualifying Examination
Title: "Grounding Foundation Models through Federated Transfer Learning: A
General Framework"
by
Mr. Tao FAN
Abstract:
Foundation Models (FMs) such as GPT-4, which encode vast knowledge and exhibit
powerful emergent abilities, have achieved remarkable success in a variety of
natural language processing and computer vision tasks. Grounding FMs, by
adapting them to domain-specific tasks or augmenting them with domain-specific
knowledge, enables us to exploit their full potential. However, grounding FMs
faces several challenges, stemming primarily from constrained computing
resources, data privacy, model heterogeneity, and model ownership. Federated
Transfer Learning (FTL), the combination of federated learning and transfer
learning, provides promising solutions to these challenges. Recently, the need
for grounding FMs through FTL, coined FTL-FM, has grown strongly in both
academia and industry.
Motivated by the strong growth of FTL-FM research and its potential impact on
industrial applications, we propose an FTL-FM framework that formulates the
problems of grounding FMs in the federated learning setting, construct a
detailed taxonomy based on this framework to categorize state-of-the-art
FTL-FM works, and comprehensively survey these works according to the proposed
taxonomy. We also establish a correspondence between FTL-FM and the
conventional phases of adapting FMs so that FM practitioners can align their
research with FTL-FM. In addition, we review advanced efficiency-improving and
privacy-preserving techniques, since efficiency and privacy are critical
concerns in FTL-FM. Finally, we discuss opportunities and future research
directions for FTL-FM.
Date: Wednesday, 23 October 2024
Time: 4:00pm - 6:00pm
Venue: Room 5501
Lifts 25/26
Committee Members: Prof. Qiang Yang (Supervisor)
Prof. Kai Chen (Co-supervisor)
Dr. Yangqiu Song (Chairperson)
Prof. Bo Li