
AstroSLAM: Robust Visual Localization in Orbit

Space robotics technology is maturing fast enough that it is time to start thinking about how to use it to support routine robotic operations in space in the not-so-distant future. Recent breakthroughs for ground robots, including perception and planning algorithms, machine-learning-based pattern recognition, autonomy, new computer hardware architectures (GPUs, ASICs, FPGAs), human-machine interfaces, and dexterous manipulation, among many others, pave the way for similar advances in robotic on-orbit operations that support failure mitigation, assembly of large flexible structures, on-orbit debris removal, inspection, hardware upgrades, and more. Owing to the harsh environmental conditions in space, only robots with increased robustness and high levels of autonomy can perform these missions.

Key Research Objectives

This research will develop novel visual perception, localization, mapping, and planning algorithms that enable new levels of situational awareness for space robots working alone or alongside astronauts in orbit as “co-robots”. The proposed research plan will develop automated feature extraction and matching algorithms adapted to the challenging imaging conditions and motion constraints encountered in space, so as to enable robust and reliable relative pose estimation, 3D shape reconstruction, and characterization of space objects. We will develop innovative optimal planning and prediction methods matched to these new perception capabilities that also account for fuel usage and orbital motion constraints. The final outcome will be the ability of astronauts and space robots to work together to enable:
– inspection, monitoring, and classification of resident space objects (RSOs);
– maneuvering, proximity operations, and docking, including salvage and retrieval of malfunctioning or tumbling spacecraft;
– servicing, construction, repair, upgrade, and refueling missions of space assets in orbit.

Deep NN Architectures for Automated Feature Detection and Matching in Space

This thrust investigates novel image features that match reliably across multiple camera views, demonstrating high repeatability, robustness to large changes in viewing angle with respect to the RSO, and adaptation to a highly collimated light source (e.g., the Sun) in the absence of typical atmospheric scattering. These features will be generated automatically using deep neural network (DNN) architectures. This task will also exploit surface reflection models to isolate the image information relevant for successful feature extraction.
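To make the idea concrete, below is a minimal sketch, assuming PyTorch, of a small convolutional network that jointly outputs a keypoint score map and dense descriptors, in the spirit of learned detect-and-describe features. The architecture, layer sizes, and the name SpaceFeatureNet are illustrative assumptions, not the project's actual model.

```python
# Minimal sketch (PyTorch assumed) of a joint keypoint-detector / descriptor
# network for harshly lit, low-texture spaceborne imagery. Layer sizes and
# names are illustrative placeholders, not the AstroSLAM architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpaceFeatureNet(nn.Module):
    def __init__(self, desc_dim: int = 128):
        super().__init__()
        # Shared encoder: grayscale image -> feature map at 1/4 resolution.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=1, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.detector = nn.Conv2d(128, 1, 1)           # per-pixel keypoint score
        self.descriptor = nn.Conv2d(128, desc_dim, 1)  # dense descriptors

    def forward(self, image: torch.Tensor):
        feats = self.encoder(image)
        scores = torch.sigmoid(self.detector(feats))        # (B, 1, H/4, W/4)
        desc = F.normalize(self.descriptor(feats), dim=1)   # (B, D, H/4, W/4)
        return scores, desc

if __name__ == "__main__":
    # Match features between two views by mutual nearest neighbors.
    net = SpaceFeatureNet()
    img_a = torch.rand(1, 1, 256, 256)   # stand-in for a rendered RSO view
    img_b = torch.rand(1, 1, 256, 256)
    (s_a, d_a), (s_b, d_b) = net(img_a), net(img_b)
    # A real pipeline would keep only the top-scoring locations from s_a, s_b;
    # here all grid cells are matched by cosine similarity for brevity.
    da = d_a.flatten(2).squeeze(0).T     # (N, D)
    db = d_b.flatten(2).squeeze(0).T
    sim = da @ db.T
    nn_ab = sim.argmax(dim=1)            # best match in B for each feature in A
    nn_ba = sim.argmax(dim=0)            # best match in A for each feature in B
    mutual = nn_ba[nn_ab] == torch.arange(da.shape[0])
    print("mutual matches:", int(mutual.sum()))
```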

Full 4D Situational Awareness (4DSA) for Robots and Humans in Space

This thrust provides space-time situational awareness for astronauts in orbit by combining multiple camera streams in a factor-graph optimization framework to generate pose estimates and predicted trajectories for all objects in the visual field. SLAM processes running “on the edge” will produce 3D representations used to match nearby objects. We will use strong motion priors to enhance robustness and reduce the computational burden.
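As an illustration of the factor-graph idea with motion priors, here is a minimal sketch using the GTSAM Python bindings (an assumption; the project backend may differ). Visual relative-pose measurements enter as between-factors on a chain of chaser poses, and a weak near-constant-motion prior regularizes the estimate; all numerical values are placeholders.

```python
# Minimal factor-graph sketch (GTSAM Python bindings assumed). Camera-derived
# relative poses enter as between-factors; a weak constant-motion prior between
# consecutive poses adds robustness, as described above.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Anchor the first chaser pose (sigmas: 3 rotation [rad], 3 translation [m]).
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.01] * 6))
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))
initial.insert(X(0), gtsam.Pose3())

meas_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02] * 3 + [0.05] * 3))
motion_prior = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1] * 3 + [0.2] * 3))

# Nominal per-frame motion used as a weak prior (illustrative values).
nominal_step = gtsam.Pose3(gtsam.Rot3.Rz(0.05), np.array([0.10, 0.0, 0.0]))
for k in range(1, 5):
    measured = nominal_step  # stand-in for a visual odometry estimate
    graph.add(gtsam.BetweenFactorPose3(X(k - 1), X(k), measured, meas_noise))
    # Motion prior encoding near-constant relative motion between frames.
    graph.add(gtsam.BetweenFactorPose3(X(k - 1), X(k), nominal_step, motion_prior))
    initial.insert(X(k), gtsam.Pose3())  # deliberately coarse initial guess

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
for k in range(5):
    print(k, result.atPose3(X(k)).translation())
```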

Multi-platform Kinodynamic Motion Planning for 4DSA

To provide continual 4D situational awareness to astronauts, we will develop methods to optimally move all agents involved in response to evolving task needs. As input we assume the information from the combined camera streams, together with additional task-specific visibility and fuel objectives to be satisfied. We will develop algorithms that produce finite-horizon plans for all platforms, satisfying the task objectives while avoiding collisions.
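The sketch below (numpy/scipy assumed) illustrates finite-horizon, fuel-aware planning under linearized Clohessy-Wiltshire relative orbital dynamics: it solves for a minimum-energy open-loop control sequence that reaches a desired inspection station. The orbit rate, horizon, start and goal states are assumed values, and the cost is a fuel proxy standing in for the full kinodynamic planner with visibility and collision constraints.

```python
# Minimal sketch of a finite-horizon plan under Clohessy-Wiltshire dynamics
# (numpy/scipy assumed). All parameters are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

n = 0.0011          # mean motion [rad/s], roughly a 400 km LEO (assumed)
dt, N = 30.0, 60    # 30 s steps over a 30 min horizon (assumed)

# Continuous-time CW dynamics: state [x, y, z, vx, vy, vz], input = acceleration.
A = np.zeros((6, 6))
A[:3, 3:] = np.eye(3)
A[3, 0], A[3, 4] = 3 * n**2, 2 * n
A[4, 3] = -2 * n
A[5, 2] = -n**2
B = np.vstack([np.zeros((3, 3)), np.eye(3)])

# Exact zero-order-hold discretization via the augmented matrix exponential.
M = expm(np.block([[A, B], [np.zeros((3, 9))]]) * dt)
Ad, Bd = M[:6, :6], M[:6, 6:]

x0 = np.array([200.0, 0.0, 50.0, 0.0, 0.0, 0.0])   # initial relative state (assumed)
xT = np.array([20.0, 0.0, 0.0, 0.0, 0.0, 0.0])     # desired inspection station (assumed)

# Reachability map: x_N = Ad^N x0 + G @ u, with u = [u_0; ...; u_{N-1}].
G = np.hstack([np.linalg.matrix_power(Ad, N - 1 - k) @ Bd for k in range(N)])
residual = xT - np.linalg.matrix_power(Ad, N) @ x0

# Minimum-energy (fuel proxy) open-loop plan: least-norm solution of G u = residual.
u, *_ = np.linalg.lstsq(G, residual, rcond=None)
u = u.reshape(N, 3)
print("approximate delta-v [m/s]:", dt * np.sum(np.linalg.norm(u, axis=1)))
```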

Experimental Validation

The theoretical results will be validated using the state-of-the-art ASTROS (Autonomous Spacecraft Testing of Robotic Operations in Space) and COSMOS (COntrol and Simulation of Multi-Spacecraft Operations in Space) experimental platforms at Georgia Tech, which allow the simulation of realistic translational and rotational spacecraft dynamics in a 1-g environment.

Sponsor

This research has been sponsored by NSF.

Contact

For more information, contact Mehregan Dor.


 