Lockheed Martin Space, headquartered in Littleton, Colorado, employs over 16,000 people who develop satellite systems, spacecraft, and space probes.
In a recent partnership with General Motors, Lockheed Martin Space is developing the Lunar Mobility Vehicle (LMV), a transportation solution for NASA's upcoming Artemis missions.
For the LMV to traverse the lunar environment safely, an accurate sensor system is essential. Such a system has different requirements than a typical autonomous vehicle's: it must be redundant, performant, and capable of operating in the lunar domain.
Our LiDAR and Image Fusion Solution for Autonomous Navigation enables the LMV to perceive the lunar environment, providing a lightweight, redundant software solution that runs on the vehicle's onboard systems. By fusing LiDAR data with stereo-camera data, our solution accurately detects obstacles in its surroundings, and its redundant imaging setup allows it to maintain effective operation when a sensor fails.
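To illustrate the redundancy idea, the following is a minimal sketch (not our actual flight code) of graceful degradation: the fusion step merges whichever Nx3 point clouds are currently available, so losing either the LiDAR or the stereo pipeline still leaves a usable cloud. The function name fuse_point_clouds and the array shapes are hypothetical.

    from typing import Optional
    import numpy as np

    def fuse_point_clouds(lidar: Optional[np.ndarray],
                          stereo: Optional[np.ndarray]) -> np.ndarray:
        """Merge Nx3 XYZ clouds from LiDAR and stereo depth into one cloud.

        Either input may be None when that sensor has failed; the system
        keeps operating on whichever source remains.
        """
        clouds = [c for c in (lidar, stereo) if c is not None and len(c) > 0]
        if not clouds:
            raise RuntimeError("no sensor data available")
        return np.vstack(clouds)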
Given the nature of the LMV, our system must operate with limited computing hardware and resources. Optimizing our fusion solution and the underlying detection software ensures that it remains functional in this restrictive environment.
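As one example of the kind of optimization this involves, ONNX Runtime can be directed to hardware-accelerated execution providers on Jetson-class devices. The sketch below is an assumption about how such a configuration might look, not a record of our exact settings, and the model filename is hypothetical.

    import onnxruntime as ort

    # Prefer TensorRT, then CUDA, then CPU; ONNX Runtime skips any
    # provider that is not available in the current build.
    session = ort.InferenceSession(
        "yolov4_3d.onnx",  # hypothetical model file
        providers=[
            "TensorrtExecutionProvider",
            "CUDAExecutionProvider",
            "CPUExecutionProvider",
        ],
    )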
In practice, our system is a network of ROS nodes, each node acting as a discrete computation module. LiDAR and stereo-image data are fused into a single point cloud and then fed through a YOLOv4-based 3D object detection model for inference using the ONNX Runtime. The entire system runs on an NVIDIA Jetson TX2, with The Imaging Source cameras and Intel RealSense LiDAR sensors providing data.
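A minimal sketch of one such node follows, assuming ROS 1 with rospy; the /fused_cloud topic name is hypothetical, and the preprocess stub stands in for the model's real input encoding (for example, voxelization), which is not shown here.

    #!/usr/bin/env python
    import numpy as np
    import onnxruntime as ort
    import rospy
    import sensor_msgs.point_cloud2 as pc2
    from sensor_msgs.msg import PointCloud2

    session = ort.InferenceSession("yolov4_3d.onnx")  # hypothetical model file

    def preprocess(points: np.ndarray) -> np.ndarray:
        # Stand-in for the model's real input encoding: pad or crop to a
        # fixed point count so the input tensor shape is static.
        fixed = np.zeros((1, 20000, 3), dtype=np.float32)
        n = min(len(points), 20000)
        fixed[0, :n] = points[:n]
        return fixed

    def on_cloud(msg: PointCloud2) -> None:
        # Flatten the incoming cloud into an Nx3 array of XYZ points.
        points = np.array(
            list(pc2.read_points(msg, field_names=("x", "y", "z"))),
            dtype=np.float32,
        )
        input_name = session.get_inputs()[0].name
        detections = session.run(None, {input_name: preprocess(points)})
        rospy.loginfo("detection outputs: %s", [d.shape for d in detections])

    if __name__ == "__main__":
        rospy.init_node("object_detection")
        rospy.Subscriber("/fused_cloud", PointCloud2, on_cloud, queue_size=1)
        rospy.spin()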