OMNIDIRECTIONAL PERCEPTION: ADVANCEMENTS IN ROBUST VISUAL LOCALIZATION AND MAPPING
PhD Thesis Proposal Defence

Title: "OMNIDIRECTIONAL PERCEPTION: ADVANCEMENTS IN ROBUST VISUAL LOCALIZATION AND MAPPING"

by

Mr. Huajian HUANG

Abstract:

Effective scene perception forms the bedrock of robots and intelligent systems, enabling them to construct accurate maps of their surroundings and localize within them. Despite substantial progress, achieving robust localization and generating environment representations suitable for high-level tasks remain challenging. In this thesis, we address these challenges through a series of contributions.

To counter the fragility of monocular SLAM systems, we introduce a recovery mechanism that treats pose estimator failures as stochastic processes. This mechanism allows rapid system reinitialization and map reintegration, enhancing robustness in navigation scenarios that do not revisit locations.

Recognizing the restricted field of view of traditional cameras, we harness the capabilities of 360-degree cameras to enable comprehensive perception. Our approach, 360VO, employs a single 360-degree camera for direct visual odometry, optimizing photometric residuals to achieve reliable localization and semi-dense point cloud mapping.

Expanding beyond geometric mapping, we propose a method to enhance path planning and navigation by abstracting topological graphs that capture scene relationships. This is achieved by exploiting stable landmark co-visibility in omnidirectional images and estimating semantic coefficients to discern topological relationships. The approach integrates seamlessly into omnidirectional visual SLAM and offers computational efficiency.

In pursuit of real-time immersive exploration of indoor environments, we delve into omnidirectional radiance fields.
By exploiting the capacity of positional encoding and neural networks in a geometry-aware fashion, we increase rendering speed and recover high-frequency details. With floorplan guidance, our system delivers an appealing and immersive indoor roaming experience.

Furthermore, since robotic systems generally operate in diverse and dynamic environments, detecting and discarding dynamic elements can enhance system robustness. We therefore introduce an efficient long-term visual tracker that leverages cross-level feature correlation and adaptive tracking. Importantly, we propose a novel omnidirectional tracking benchmark dataset, referred to as 360VOT. Assessing prevailing tracking algorithms originally designed for perspective tracking underscores the challenges and opportunities in omnidirectional object tracking.

In conclusion, this thesis advances visual localization and mapping methodologies, propelling the field toward omnidirectional perception. Our approaches, centered on omnidirectional images, contribute to enhanced system performance while acknowledging emerging challenges. The pursuit of omnidirectional perception holds significant promise in computer vision and robotics.

Date:               Monday, 28 August 2023

Time:               3:00pm - 5:00pm

Venue:              Room 3494 (Lifts 25/26)

Committee Members:  Dr. Sai-Kit Yeung (Supervisor)
                    Dr. Qifeng Chen (Chairperson)
                    Dr. Tristan Braud
                    Prof. Chi-Keung Tang

**** ALL are Welcome ****