PhD Thesis Proposal Defence
Title: "Design and Analysis of Natural Hand-Gesture Interfaces for Extended
Realities"
by
Mr. Kirill SHATILOV
Abstract:
Augmented, Mixed, and Virtual Realities, which are collectively referred to
as Extended Realities (XR), present significant challenges to conventional
input and output methods. Traditional interfaces such as touch screens,
keyboards, pointers, and controllers often fall short of meeting the
requirements for interacting with both real and virtual objects. They also
struggle to provide the necessary mobility, efficiency, and social
acceptance that users demand. In this thesis, we aim to design and evaluate
innovative natural hand-gesture interfaces tailored for extended realities.
To begin with, we explore the use of surface electromyography (sEMG), a
non-intrusive wearable sensing technique, for recognizing gestures. sEMG
detects the electrical signals that muscles generate, measured at the skin's
surface; these signals can then be decoded to identify performed or intended
gestures.
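For intuition, a minimal sketch of such a decoding step is shown below. The
channel count, sampling window, gesture vocabulary, and network architecture
are illustrative assumptions, not the models developed in the thesis.

    import numpy as np
    import torch
    import torch.nn as nn

    # Hypothetical setup: an 8-channel sEMG armband, classified over
    # sliding windows of 50 samples per channel, into 6 gesture classes.
    N_CHANNELS, WINDOW, N_GESTURES = 8, 50, 6

    class GestureNet(nn.Module):
        """Small 1-D CNN over one sEMG window; one logit per gesture."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),   # pool over the time axis
                nn.Flatten(),
                nn.Linear(32, N_GESTURES),
            )

        def forward(self, x):              # x: (batch, channels, time)
            return self.net(x)

    def classify(window: np.ndarray, model: nn.Module) -> int:
        """Return the most likely gesture index for one sEMG window."""
        x = torch.from_numpy(window).float().unsqueeze(0)  # (1, C, T)
        with torch.no_grad():
            return int(model(x).argmax(dim=1))

    model = GestureNet().eval()
    fake_window = np.random.randn(N_CHANNELS, WINDOW)  # stand-in for real sEMG
    print(classify(fake_window, model))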
In our initial study, we develop and assess a gesture recognition system
for controlling a 3D-printed prosthetic hand. Our primary focus is on the
mobility of the solution: deep learning algorithms for gesture recognition
run on a mobile device, with the option to offload computation to a remote
server. We design the prosthesis, including the necessary electronics and a
modified chassis, as well as a mobile companion device and the communication
protocols that link the system's components. Our findings demonstrate that a
low-cost, mobile-centered system can achieve state-of-the-art gesture
recognition accuracy while maintaining reliable operation over extended
periods. We evaluate recognition accuracy across various gesture sets and
assess the system's response time and power consumption.
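The local-versus-remote split can be illustrated with a simple latency
heuristic, sketched below. The timing constants are assumptions for
illustration, not measurements from the study.

    from typing import Optional

    # Illustrative only: assumed per-window inference times (milliseconds).
    LOCAL_MS = 45.0    # hypothetical on-device inference time
    REMOTE_MS = 12.0   # hypothetical server-side inference time

    def choose_executor(network_rtt_ms: Optional[float]) -> str:
        """Decide where to classify the next sEMG window.

        Stay on-device when offline; offload only when the network round
        trip plus server compute beats local compute.
        """
        if network_rtt_ms is None:
            return "local"
        if network_rtt_ms + REMOTE_MS < LOCAL_MS:
            return "remote"
        return "local"

    print(choose_executor(None))    # no connectivity -> "local"
    print(choose_executor(10.0))    # fast network    -> "remote"
    print(choose_executor(60.0))    # slow network    -> "local"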
Building on the insights gained from our first study, we adapt our gesture
recognition system to function as a virtual keyboard for extended realities.
In this system, users select keys on a standard QWERTY keyboard by orienting
their forearm in space to choose a column (measured by an IMU sensor) and
then selecting a specific key within that column through a directional
gesture (utilizing the sEMG sensor). Notably, we consider three usage
scenarios for the proposed solution: one with an empty hand and two with a
busy hand, such as when holding an umbrella (cylindrical grasp) or a pen
(tripod grasp). The result is an input system for extended realities that
supports subtle, even discreet, text entry without interrupting the user's
ongoing activity. We conduct experiments with a dozen participants to
evaluate text input rates, identify common errors, and assess the overall
usability of the proposed text input system.
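To make the two-step selection concrete, the sketch below maps an assumed
forearm-yaw range to QWERTY columns and a three-way directional gesture to a
key within the column. The yaw range, column layout, and gesture labels are
illustrative assumptions rather than the calibrated design from the thesis.

    # Hypothetical layout: 10 QWERTY columns selected by forearm yaw,
    # then one of up to 3 keys per column chosen by an sEMG gesture.
    COLUMNS = ["qaz", "wsx", "edc", "rfv", "tgb",
               "yhn", "ujm", "ik", "ol", "p"]
    YAW_MIN, YAW_MAX = -50.0, 50.0   # assumed usable forearm range (degrees)

    def column_from_yaw(yaw_deg: float) -> int:
        """Map IMU forearm yaw to one of the keyboard columns."""
        yaw = min(max(yaw_deg, YAW_MIN), YAW_MAX)
        span = (YAW_MAX - YAW_MIN) / len(COLUMNS)
        return min(int((yaw - YAW_MIN) // span), len(COLUMNS) - 1)

    def key_from_gesture(column: int, gesture: str) -> str:
        """Pick a key in the column via a directional gesture."""
        row = {"up": 0, "mid": 1, "down": 2}[gesture]
        keys = COLUMNS[column]
        return keys[min(row, len(keys) - 1)]  # short columns clamp to last key

    # Example: forearm near centre plus a "mid" gesture -> middle-row key.
    col = column_from_yaw(5.0)
    print(key_from_gesture(col, "mid"))       # -> "h"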
Date: Thursday, 29 May 2025
Time: 2:00pm - 4:00pm
Venue: Room 2128C (Lift 19)
Committee Members: Prof. Pan Hui (Supervisor, EMIA)
Dr. Tristan Braud (Co-supervisor)
Prof. Pedro Sander (Chairperson)
Prof. Gary Chan