NASA Improves GIANT Optical Navigation Technology for Future Missions


As NASA scientists study the returned fragments of asteroid Bennu, the team that helped navigate the mission on its journey is refining its technology for potential use in future robotic and crewed missions.

The optical navigation team at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, served as a backup navigation resource for the OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, and Security – Regolith Explorer) mission to near-Earth asteroid Bennu. They double-checked the primary navigation team’s work and proved the viability of navigation by visual cues.

The sample return capsule from NASA’s OSIRIS-REx mission is seen shortly after touching down in the desert, Sunday, Sept. 24, 2023, at the Department of Defense’s Utah Test and Training Range. The sample was collected from the asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft.
NASA/Keegan Barber

Optical navigation uses observations from cameras, lidar, or other sensors to navigate the way humans do. This works by taking pictures of a target, such as Bennu, and identifying landmarks on the surface. GIANT software – that’s short for the Goddard Image Analysis and Navigation Tool – analyzes those images to provide information, such as precise distance to the target, and to develop three-dimensional maps of potential landing zones and hazards. It can also analyze a spinning object to help calculate the target’s mass and determine its center – critical details to know for a mission trying to enter an orbit.
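To make the geometry concrete, here is a minimal sketch, in Python, of how range can fall out of a single image: under a pinhole-camera model, a landmark of known physical size subtends an angle proportional to its size in pixels. This is an illustration of the principle, not GIANT’s actual code, and every camera value below is an invented assumption.

```python
def range_from_angular_size(true_diameter_m, apparent_diameter_px,
                            focal_length_px):
    """Range to a target of known size from its apparent size in an
    image, using a pinhole-camera, small-angle approximation."""
    angular_size_rad = apparent_diameter_px / focal_length_px
    return true_diameter_m / angular_size_rad

# Bennu is roughly 490 m across; the camera numbers are made up.
print(range_from_angular_size(490.0, 100.0, 10000.0) / 1000)  # 49.0 (km)
```

In practice a tool like GIANT fuses many such landmark observations at once, but each one reduces to this kind of angle-to-distance geometry.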

“Onboard autonomous optical navigation is an enabling technology for current and future mission ideas and proposals,” said Andrew Liounis, lead developer for GIANT at Goddard. “It reduces the amount of data that needs to be downlinked to Earth, reducing the cost of communications for smaller missions, and allowing for more science data to be downlinked for larger missions. It also reduces the number of people required to perform orbit determination and navigation on the ground.”

Asteroid Bennu ejecting particles from its surface on Jan. 19, created by combining two images from NASA’s OSIRIS-REx spacecraft. GIANT optical navigation technology used to process images like these helped establish the size and velocity of the particles.
NASA / Goddard / University of Arizona

During OSIRIS-REx’s orbit of Bennu, GIANT identified particles flung from the asteroid’s surface. The optical navigation team used images to calculate the particles’ movement and mass, ultimately helping determine they did not pose a significant threat to the spacecraft.
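The particle analysis boils down to tracking features between frames: a particle’s pixel displacement between two time-stamped images, combined with the image scale, yields a velocity estimate. The sketch below illustrates that idea only; it is not the team’s actual pipeline, and all of the numbers are invented.

```python
import numpy as np

def particle_velocity(px_t0, px_t1, dt_s, m_per_px):
    """Velocity of a tracked particle from its pixel positions in two
    images taken dt_s seconds apart, given the image scale in m/pixel."""
    displacement_px = np.asarray(px_t1, float) - np.asarray(px_t0, float)
    velocity = displacement_px * m_per_px / dt_s  # m/s along each image axis
    return velocity, float(np.linalg.norm(velocity))

# Illustrative values only, not actual OSIRIS-REx measurements.
v, speed = particle_velocity(px_t0=(120.0, 85.0), px_t1=(126.0, 93.0),
                             dt_s=2.0, m_per_px=0.05)
print(speed)  # 0.25 m/s
```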

Since then, Liounis said, the team has refined and expanded GIANT’s backbone collection of software utilities and scripts.

New GIANT developments include an open-source version of their software released to the public, and celestial navigation for deep space travel by observing stars, the Sun, and solar system objects. They are now working on a slimmed-down package to aid in autonomous operations throughout a mission’s life cycle.

“We’re also looking to use GIANT to process some Cassini data with partners at the University of Maryland in order to study Saturn’s interactions with its moons,” Liounis said.

Other innovators like Goddard engineer Alvin Yew are adapting the software to potentially aid rovers and human explorers on the surface of the Moon or other planets.

Adaptation, Improvement

Shortly after OSIRIS-REx left Bennu, Liounis’ team released a refined, open-source version for public use. “We considered a lot of changes to make it easier for the user and a few changes to make it run more efficiently,” he said.

An intern modified their code to make use of a graphics processor for ground-based operations, boosting the image processing at the heart of GIANT’s navigation.
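Image correlation of this kind is a natural fit for a graphics processor because the same arithmetic runs independently over many pixels. The sketch below shows plain normalized cross-correlation, the standard way to score how well a template matches part of an image; it is a CPU illustration of the operation, not GIANT’s implementation or the GPU port described above.

```python
import numpy as np

def normalized_cross_correlation(image_patch, template):
    """Score how well a template matches an image patch: +1.0 is a
    perfect match, -1.0 a perfect inversion. Evaluating this score at
    many candidate locations is the kind of per-pixel work that
    parallelizes well on a GPU."""
    a = image_patch - image_patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

t = np.array([[0.0, 1.0], [1.0, 0.0]])
print(normalized_cross_correlation(t, t))  # 1.0
```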

A simplified version called cGIANT works with Goddard’s autonomous Navigation, Guidance, and Control software package, or autoNGC, in ways that can be crucial to both small and large missions, Liounis said.

Liounis and colleague Chris Gnam developed a celestial navigation capability that uses GIANT to steer a spacecraft by processing images of stars, planets, asteroids, and even the Sun. Traditional deep space navigation uses the mission’s radio signals to determine location, velocity, and distance from Earth. Reducing a mission’s reliance on NASA’s Deep Space Network frees up a valuable resource shared by many ongoing missions, Gnam said.
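The principle behind such a celestial fix can be sketched simply: each imaged body of known position constrains the spacecraft to a line along the measured viewing direction, and two or more lines pin down a position. The least-squares toy below illustrates that geometry under made-up numbers; it is not GIANT’s algorithm.

```python
import numpy as np

def fix_position(body_positions, directions):
    """Least-squares spacecraft position from unit line-of-sight
    directions to solar system bodies with known positions. Each
    observation constrains the spacecraft to the line through the
    body along the measured direction."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(body_positions, directions):
        u = np.asarray(u, float) / np.linalg.norm(u)
        P = np.eye(3) - np.outer(u, u)  # projector perpendicular to the ray
        A += P
        b += P @ np.asarray(p, float)
    return np.linalg.solve(A, b)

# Toy example: two bodies seen from the (unknown) point (1, 2, 3).
truth = np.array([1.0, 2.0, 3.0])
bodies = [np.array([10.0, 2.0, 3.0]), np.array([1.0, 12.0, 3.0])]
dirs = [p - truth for p in bodies]  # measured lines of sight
print(fix_position(bodies, dirs))   # ≈ [1. 2. 3.]
```

A real system would weight each sighting by its measurement uncertainty and fold the result into an orbit-determination filter, but the core triangulation looks like this.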

Next on their agenda, the team hopes to develop planning capabilities so mission controllers can develop flight trajectories and orbits within GIANT – streamlining mission design.

“On OSIRIS-REx, it would take up to three months to plan our next trajectory or orbit,” Liounis said. “Now we can reduce that to a week or so of computer processing time.”

Their innovations have earned the team continuous support from Goddard’s Internal Research and Development program, individual missions, and NASA’s Space Communications and Navigation program.

“As mission concepts become more advanced,” Liounis said, “optical navigation will continue to become a necessary component of the navigation toolbox.”

By Karl B. Hille

NASA’s Goddard Space Flight Center in Greenbelt, Md. 
