PhD Qualifying Examination
Title: "A SURVEY ON VISUALIZATION FOR EXPLAINABLE CLASSIFIERS"
by
Mr. Yao MING
Abstract:
Classification is a fundamental problem in machine learning, data mining
and computer vision. In practice, interpretability is a desirable property
of classification models (classifiers) in critical areas, such as
security, medicine and finance. For instance, a quantitative trader may
prefer a more interpretable model with a lower expected return because of
its predictability and lower risk. Unfortunately, the best-performing
classifiers in many applications (e.g., deep neural networks) are complex
machines whose predictions are difficult to explain. Thus, there is a
growing interest in using visualization to understand, diagnose and
explain intelligent systems in both academia and industry. Many challenges
remain in formalizing explainability and in establishing design principles
and evaluation methods for explainable intelligent systems.
The survey starts with an introduction to the concept and background of
explainable classifiers. Efforts towards more explainable classifiers are
categorized into two groups: designing classifiers with simpler structures
that can be easily understood, and developing methods that generate
explanations for already complex classifiers. Based on the life cycle of a
classifier, we discuss pioneering work that uses visualization to improve
its explainability at different stages of the life cycle. The
survey ends with a discussion about the challenges and future research
opportunities of explainable classifiers.
Date: Monday, 30 October 2017
Time: 10:00am - 12:00noon
Venue: Room 4472
Lifts 25/26
Committee Members: Prof. Huamin Qu (Supervisor)
Prof. James Kwok (Chairperson)
Dr. Yangqiu Song
Prof. Nevin Zhang
**** ALL are Welcome ****