Press "Enter" to skip to content

Automated Technology Allows Unparalleled Space Exploration from Moon, to Asteroids, and Beyond

Overhead view of OSIRIS-REx at sample site Nightingale, with a parking lot for comparison. Credit: NASA/Goddard/CI Lab/University of Arizona

When Apollo 11 landed in 1969, the astronauts looked out the window for distinguishing features they recognized from maps of the Moon and steered the lander to avoid a disastrous touchdown in a rocky area. Now, 50 years later, that process can be automated. Distinguishing features, such as known craters, boulders, or other unique surface characteristics, reveal surface hazards and help a lander avoid them during descent.

NASA scientists and engineers are maturing technology for navigating and landing on planetary bodies by analyzing images during descent – a process called terrain relative navigation (TRN). This optical navigation technology is included on NASA’s newest Mars rover, Perseverance, which will test TRN when it lands on the Red Planet in 2021, paving the way for future crewed missions to the Moon and beyond. TRN was also used during NASA’s recent Origins, Spectral Interpretation, Resources Identification, Security, Regolith Explorer (OSIRIS-REx) mission Touch-and-Go (TAG) event to collect samples of the asteroid Bennu, helping scientists better understand the characteristics and movement of asteroids.

Since reaching Bennu in 2018, the OSIRIS-REx spacecraft has mapped and studied its surface, including its topography and lighting conditions, in preparation for TAG. The Nightingale site was chosen from four candidates based on its abundance of sampleable material and its accessibility for the spacecraft.

On October 20, the OSIRIS-REx spacecraft successfully navigated to asteroid Bennu’s surface and collected a sample. Credit: NASA’s Goddard Space Flight Center/Scientific Visualization Studio

Engineers routinely use ground-based optical navigation methods to navigate the OSIRIS-REx spacecraft close to Bennu, where new images taken by the spacecraft are compared to three-dimensional topographic maps. During TAG, OSIRIS-REx performed a similar optical navigation process onboard in real-time, using a TRN system called Natural Feature Tracking. Images were taken of the sample site during TAG descent, compared with onboard topographic maps, and the spacecraft trajectory was readjusted to target the landing site. Optical navigation could also be used in the future to minimize the risks associated with landing in other unfamiliar environments in our solar system.
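
In outline, an onboard loop like Natural Feature Tracking recognizes catalogued landmarks in each descent image, estimates how far the spacecraft has drifted from its planned path, and commands a correction. The Python sketch below is a deliberately simplified illustration, not OSIRIS-REx flight code; the feature coordinates, the mean-offset position estimate, and the proportional correction are all invented for the example.

```python
import numpy as np

def estimate_offset(observed, predicted):
    """Estimate the spacecraft's lateral drift from its planned path.

    observed:  (N, 2) image coordinates of landmarks recognized in a
               descent photo (hypothetical values).
    predicted: (N, 2) where those landmarks should appear if the
               spacecraft were exactly on its planned trajectory.
    """
    # The mean displacement of the matched landmarks approximates the
    # drift. (A simplification: flight software solves a full
    # six-degree-of-freedom pose problem against a 3-D landmark map.)
    return (observed - predicted).mean(axis=0)

def correction(offset, gain=0.5):
    """A proportional trajectory correction, purely for illustration."""
    return -gain * offset

# Landmarks appear shifted roughly 3 pixels from where the onboard map
# predicts, so the guidance loop commands a small opposing correction.
observed = np.array([[120.0, 85.0], [240.5, 300.2], [410.1, 95.7]])
predicted = np.array([[117.0, 85.1], [237.4, 300.0], [407.2, 95.5]])
drift = estimate_offset(observed, predicted)
print("drift (px):", drift, "-> correction:", correction(drift))
```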

NASA’s Lunar Reconnaissance Orbiter (LRO) has acquired images from orbit since 2009. LRO Project Scientist Noah Petro said one challenge to preparing for landed missions is the lack of high-resolution, narrow-angle camera images at every lighting condition for any specific landing site. These images would be useful for automated landing systems, which need the illumination data for a specific time of lunar day. However, NASA has been able to collect high-resolution topographic data using LRO’s Lunar Orbiter Laser Altimeter (LOLA).

“LOLA data, and other topographic data, let us take the shape of the Moon and shine a light on it for any time in the future or past, and with that we can predict what the surface will look like,” Petro said.


Artist’s concept of an Artemis astronaut stepping onto the Moon. Credit: NASA

Using LOLA data, sun angles are overlaid on a three-dimensional elevation map to model the shadows of surface features at specific dates and times. NASA scientists know the position and orientation of the Moon and LRO in space, having taken billions of lunar laser measurements. Over time, these measurements are compiled into a grid map of the lunar surface. Images taken during landing can be compared to this master map, giving landers that may fly as part of the Artemis program another tool for navigating the lunar terrain safely.
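
The shadow modeling Petro describes is, at its core, a sweep across the elevation model: terrain between a given spot and the Sun either blocks the incoming light or it does not. Below is a minimal, hypothetical Python sketch of that idea for a one-dimensional elevation profile; it is not the LOLA processing pipeline, and the obstacle height, grid spacing, and Sun angle are invented for illustration.

```python
import numpy as np

def shadowed(elevation, cell_size, sun_elev_deg):
    """Flag cells of a 1-D elevation profile that lie in shadow.

    elevation:    terrain heights (m) sampled along a line away from
                  the Sun; cell_size is the sample spacing (m).
    sun_elev_deg: Sun elevation above the horizon (degrees).
    """
    # The shadow line cast by upstream terrain falls by this much per cell.
    drop = np.tan(np.radians(sun_elev_deg)) * cell_size
    shadow = np.zeros(len(elevation), dtype=bool)
    horizon = -np.inf  # height of the current shadow line
    for i, h in enumerate(elevation):
        horizon -= drop
        if h < horizon:
            shadow[i] = True   # terrain sits below the shadow line
        else:
            horizon = h        # this cell becomes the new occluder
    return shadow

# A 10 m obstacle under a 2-degree polar Sun casts a very long shadow.
profile = np.zeros(100)
profile[10] = 10.0
mask = shadowed(profile, cell_size=5.0, sun_elev_deg=2.0)
print(mask.sum(), "cells in shadow")  # 57 cells, about 285 m
```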

The lunar surface is like a fingerprint, Petro said: no two landscapes are identical. Topography can therefore be used to determine a spacecraft’s exact location above the Moon by comparing descent images to the map, much as a forensic scientist matches fingerprints from a crime scene to a known person.
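
One simple way to make that fingerprint comparison concrete is normalized cross-correlation: slide the descent image across the reference map and score how well each alignment agrees. The toy Python sketch below recovers the location of a patch cut from a synthetic map; real TRN systems match rendered three-dimensional landmark templates and are far more efficient, so treat this only as an illustration of the matching idea.

```python
import numpy as np

def locate(patch, reference_map):
    """Find where a small descent-camera patch best matches a larger map.

    Returns the (row, col) of the highest normalized cross-correlation
    score. Brute force for clarity; flight code is far more efficient.
    """
    ph, pw = patch.shape
    p = (patch - patch.mean()) / patch.std()
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(reference_map.shape[0] - ph + 1):
        for c in range(reference_map.shape[1] - pw + 1):
            window = reference_map[r:r + ph, c:c + pw]
            w = (window - window.mean()) / (window.std() + 1e-12)
            score = (p * w).mean()  # correlation in [-1, 1]
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc, best_score

# Cut a patch out of a synthetic "map" and recover its location.
rng = np.random.default_rng(0)
terrain = rng.normal(size=(60, 60))
patch = terrain[22:32, 40:50]
print(locate(patch, terrain))  # -> ((22, 40), ~1.0)
```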

After landing, TRN can be used on the ground to help astronauts navigate crewed rovers. As part of NASA’s lunar surface sustainability concept, the agency is considering using a habitable mobility platform like an RV as well as a lunar terrain vehicle (LTV) to help crew travel on the lunar surface.

Astronauts can typically travel short distances of a few miles in an unpressurized rover like the LTV, so long as they have landmarks to guide them. Traveling greater distances is much more challenging, and the Sun at the lunar South Pole sits perpetually low on the horizon, adding to the visibility problems. Driving across the South Pole would be like driving a car due east first thing in the morning: the light can be blinding, and landmarks can appear distorted. With TRN, astronauts may be better able to navigate the South Pole despite the lighting conditions, because the computer may detect hazards that the eye misses.
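
Because the hazard information lives in the elevation model rather than in the glare-prone camera view, a rover computer can flag steep terrain regardless of lighting. The following is a minimal Python sketch of slope-based hazard flagging over a hypothetical elevation grid; the 15-degree slope limit is an invented placeholder, not an LTV specification.

```python
import numpy as np

def hazard_mask(elevation, cell_size, max_slope_deg=15.0):
    """Flag grid cells steeper than a rover's slope limit.

    elevation: 2-D height grid (m); cell_size: grid spacing (m).
    The 15-degree limit is an invented placeholder, not an LTV spec.
    """
    dzdy, dzdx = np.gradient(elevation, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    return slope_deg > max_slope_deg

# A 3 m mound on a 5 m grid: its flanks exceed the slope limit even
# though a camera image of the scene might be washed out by glare.
grid = np.zeros((20, 20))
grid[8:12, 8:12] = 3.0
print(hazard_mask(grid, cell_size=5.0).sum(), "hazardous cells flagged")
```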

Speed is the key difference between using TRN to land a spacecraft and using it to navigate a crewed rover. Landing requires capturing and processing images quickly, with intervals between images as short as one second. To bridge the gaps between images, onboard processors keep the spacecraft on track to land safely.
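
Between one-second image fixes, the navigation computer still has to track the spacecraft's state, so it propagates the last estimate forward using inertial measurements. The Python snippet below sketches one second of that dead reckoning with invented numbers (a lunar-like gravity vector and a 50 Hz inertial rate); a real system would fuse the next image fix with this prediction in a filter rather than rely on either source alone.

```python
import numpy as np

def propagate(pos, vel, accel, dt):
    """Dead-reckon one inertial step between camera fixes."""
    return pos + vel * dt + 0.5 * accel * dt**2, vel + accel * dt

# One second of descent: start from the last image-based fix, then
# coast on 50 Hz inertial data until the next image arrives.
pos = np.array([0.0, 0.0, 500.0])     # position (m), invented
vel = np.array([0.0, 0.0, -5.0])      # velocity (m/s), invented
gravity = np.array([0.0, 0.0, -1.6])  # lunar-like acceleration (m/s^2)
for _ in range(50):                   # 50 steps of 0.02 s = 1 s
    pos, vel = propagate(pos, vel, gravity, 0.02)
print("predicted altitude at the next fix:", round(pos[2], 2), "m")
# A real system fuses the next image fix with this prediction in a
# filter rather than trusting either source alone.
```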

“When you move slower – such as with rovers or OSIRIS-REx orbiting around the asteroid – you have more time to process the images,” said Carolina Restrepo, an aerospace engineer at NASA Goddard in Maryland working to improve current data products for the lunar surface. “When you are moving very fast – descent and landing – there is no time for this. You need to be taking images and processing them as fast as possible aboard the spacecraft and it needs to be all autonomous.”

Automated TRN solutions can address the needs of human and robotic explorers as they navigate unique locations in our solar system, such as the optical navigation challenges faced by OSIRIS-REx for TAG on Bennu’s rocky surface. Because of missions like LRO, Artemis astronauts can use TRN algorithms and lunar topography data to supplement images of the surface in order to land and safely explore the Moon’s South Pole.

“What we’re trying to do is anticipate the needs of future terrain relative navigation systems by combining existing data types to make sure we can build the highest-resolution maps for key locations along future trajectories and landing sites,” Restrepo said. “In other words, we need high-resolution maps both for scientific purposes as well as for navigation.”

Source: SciTechDaily