By NASA
5 min read
Cloud cover can keep optical instruments on satellites from clearly capturing Earth’s surface. Still in testing, JPL’s Dynamic Targeting uses AI to avoid imaging clouds, yielding a higher proportion of usable data, and to focus on phenomena like this 2015 volcanic eruption in Indonesia, captured by Landsat 8. Credit: NASA/USGS
A technology called Dynamic Targeting could enable spacecraft to decide, autonomously and within seconds, where best to make science observations from orbit.
In a recent test, NASA showed how artificial intelligence-based technology could help orbiting spacecraft provide more targeted and valuable science data. The technology enabled an Earth-observing satellite for the first time to look ahead along its orbital path, rapidly process and analyze imagery with onboard AI, and determine where to point an instrument. The whole process took less than 90 seconds, without any human involvement.
Called Dynamic Targeting, the concept has been in development for more than a decade at NASA’s Jet Propulsion Laboratory in Southern California. The first of a series of flight tests occurred aboard a commercial satellite in mid-July. The goal: to show the potential of Dynamic Targeting to enable orbiters to improve ground imaging by avoiding clouds and also to autonomously hunt for specific, short-lived phenomena like wildfires, volcanic eruptions, and rare storms.
This graphic shows how JPL’s Dynamic Targeting uses a lookahead sensor to see what’s on a satellite’s upcoming path. Onboard algorithms process the sensor’s data, identifying clouds to avoid and targets of interest for closer observation as the satellite passes overhead. Credit: NASA/JPL-Caltech
“The idea is to make the spacecraft act more like a human: Instead of just seeing data, it’s thinking about what the data shows and how to respond,” said Steve Chien, a technical fellow in AI at JPL and principal investigator for the Dynamic Targeting project. “When a human sees a picture of trees burning, they understand it may indicate a forest fire, not just a collection of red and orange pixels. We’re trying to make the spacecraft have the ability to say, ‘That’s a fire,’ and then focus its sensors on the fire.”
Avoiding Clouds for Better Science
This first flight test for Dynamic Targeting wasn’t hunting specific phenomena like fires — that will come later. Instead, the point was avoiding an omnipresent phenomenon: clouds.
Most science instruments on orbiting spacecraft look down at whatever is beneath them. However, for Earth-observing satellites with optical sensors, clouds can get in the way as much as two-thirds of the time, blocking views of the surface. To overcome this, Dynamic Targeting looks 300 miles (500 kilometers) ahead and can distinguish between clouds and clear sky. If the scene is clear, the spacecraft images the surface when passing overhead. If it’s cloudy, the spacecraft cancels the imaging activity to save data storage for another target.
“If you can be smart about what you’re taking pictures of, then you only image the ground and skip the clouds. That way, you’re not storing, processing, and downloading all this imagery researchers really can’t use,” said Ben Smith of JPL, an associate with NASA’s Earth Science Technology Office, which funds the Dynamic Targeting work. “This technology will help scientists get a much higher proportion of usable data.”
How Dynamic Targeting Works
The testing is taking place on CogniSAT-6, a briefcase-size CubeSat that launched in March 2024. The satellite — designed, built, and operated by Open Cosmos — hosts a payload designed and developed by Ubotica featuring a commercially available AI processor. While working with Ubotica in 2022, Chien’s team conducted tests aboard the International Space Station running algorithms similar to those in Dynamic Targeting on the same type of processor. The results showed the combination could work for space-based remote sensing.
Since CogniSAT-6 lacks an imager dedicated to looking ahead, the spacecraft tilts forward 40 to 50 degrees to point its optical sensor, a camera that sees both visible and near-infrared light. Once look-ahead imagery has been acquired, Dynamic Targeting’s advanced algorithm, trained to identify clouds, analyzes it. Based on that analysis, the Dynamic Targeting planning software determines where to point the sensor for cloud-free views. Meanwhile, the satellite tilts back toward nadir (looking directly below the spacecraft) and snaps the planned imagery, capturing only the ground.
This all takes place in 60 to 90 seconds, depending on the original look-ahead angle, as the spacecraft speeds in low Earth orbit at nearly 17,000 mph (7.5 kilometers per second).
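The timing follows from the orbit itself: at roughly 7.5 kilometers per second, a point 500 kilometers down-track is about 67 seconds away, which is why the whole decide-and-point cycle has to fit in that 60-to-90-second window. The sketch below illustrates the decision loop described above; all names and the cloud-fraction threshold are invented for illustration, and none of this is NASA flight software.

```python
# Sketch of the Dynamic Targeting decision loop described in the article.
# All names and thresholds are illustrative, not flight code.

ORBITAL_SPEED_KM_S = 7.5   # low-Earth-orbit speed (~17,000 mph)
LOOKAHEAD_KM = 500         # how far ahead the sensor is pointed

def time_until_overhead(lookahead_km=LOOKAHEAD_KM,
                        speed_km_s=ORBITAL_SPEED_KM_S):
    """Seconds available to analyze imagery and re-point the sensor."""
    return lookahead_km / speed_km_s

def plan_observation(cloud_fraction, threshold=0.3):
    """Decide whether to image the ground or skip to save storage.

    cloud_fraction: output of an onboard cloud-detection model (0..1).
    threshold: a made-up cutoff for "mostly clear".
    """
    if cloud_fraction < threshold:
        return "image"   # scene is mostly clear: capture at nadir
    return "skip"        # cloudy: cancel imaging, save data storage

budget = time_until_overhead()              # ~67 s, inside the 60-90 s window
decision = plan_observation(cloud_fraction=0.1)
```

The point of the sketch is the time budget: everything after the look-ahead image is acquired, including the cloud analysis and the re-pointing, must complete before the spacecraft arrives over the scene.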
What’s Next
With the cloud-avoidance capability now proven, the next test will be hunting for storms and severe weather — essentially targeting clouds instead of avoiding them. Another test will be to search for thermal anomalies like wildfires and volcanic eruptions. The JPL team developed unique algorithms for each application.
“This initial deployment of Dynamic Targeting is a hugely important step,” Chien said. “The end goal is operational use on a science mission, making for a very agile instrument taking novel measurements.”
There are multiple visions for how that could happen — possibly even on spacecraft exploring the solar system. In fact, Chien and his JPL colleagues drew some inspiration for their Dynamic Targeting work from another project they had also worked on: using data from ESA’s (the European Space Agency’s) Rosetta orbiter to demonstrate the feasibility of autonomously detecting and imaging plumes emitted by comet 67P/Churyumov-Gerasimenko.
On Earth, adapting Dynamic Targeting for use with radar could allow scientists to study dangerous extreme winter weather events called deep convective ice storms, which are too rare and short-lived to closely observe with existing technologies. Specialized algorithms would identify these dense storm formations with a satellite’s look-ahead instrument. Then a powerful, focused radar would pivot to keep the ice clouds in view, “staring” at them as the spacecraft speeds by overhead and gathers a bounty of data over six to eight minutes.
Some ideas involve using Dynamic Targeting on multiple spacecraft: The results of onboard image analysis from a leading satellite could be rapidly communicated to a trailing satellite, which could be tasked with targeting specific phenomena. The data could even be fed to a constellation of dozens of orbiting spacecraft. Chien is leading a test of that concept, called Federated Autonomous MEasurement, beginning later this year.
How AI supports Mars rover science
Autonomous robot fleet could measure ice shelf melt
Ocean world robot swarm prototype gets a swim test
News Media Contact
Melissa Pamer
Jet Propulsion Laboratory, Pasadena, Calif.
626-314-4928
melissa.pamer@jpl.nasa.gov
2025-094
Details
Last Updated: Jul 24, 2025
Related Terms: Earth Science, Earth Science Technology Office, Jet Propulsion Laboratory
Explore More
5 min read: NASA Shares How to Save Camera 370-Million-Miles Away Near Jupiter (Article, 3 days ago)
2 min read: GLOBE-Trotting Science Lands in Chesapeake with NASA eClips (Article, 3 days ago)
On June 16-17, 2025, 50 students at Camp Young in Chesapeake, Virginia traded their usual…
6 min read: 5 Things to Know About Powerful New U.S.-India Satellite, NISAR (Article, 3 days ago)
Keep Exploring: Discover Related Topics
Missions
Humans in Space
Climate Change
Solar System
-
By NASA
5 min read
How NASA’s SPHEREx Mission Will Share Its All-Sky Map With the World
NASA’s SPHEREx mission will map the entire sky in 102 different wavelengths, or colors, of infrared light. This image of the Vela Molecular Ridge was captured by SPHEREx and is part of the mission’s first ever public data release. The yellow patch on the right side of the image is a cloud of interstellar gas and dust that glows in some infrared colors due to radiation from nearby stars. NASA/JPL-Caltech NASA’s newest astrophysics space telescope launched in March on a mission to create an all-sky map of the universe. Now settled into low-Earth orbit, SPHEREx (Spectro-Photometer for the History of the Universe, Epoch of Reionization, and Ices Explorer) has begun delivering its sky survey data to a public archive on a weekly basis, allowing anyone to use the data to probe the secrets of the cosmos.
“Because we’re looking at everything in the whole sky, almost every area of astronomy can be addressed by SPHEREx data,” said Rachel Akeson, the lead for the SPHEREx Science Data Center at IPAC. IPAC is a science and data center for astrophysics and planetary science at Caltech in Pasadena, California.
Other missions, like NASA’s now-retired WISE (Wide-field Infrared Survey Explorer), have also mapped the entire sky. SPHEREx builds on this legacy by observing in 102 infrared wavelengths, compared to WISE’s four wavelength bands.
By putting the many wavelength bands of SPHEREx data together, scientists can identify the signatures of specific molecules with a technique known as spectroscopy. The mission’s science team will use this method to study the distribution of frozen water and organic molecules — the “building blocks of life” — in the Milky Way.
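The idea behind that technique can be shown with a toy calculation: a molecule’s signature is a characteristic pattern of bright and dim bands, and an observed spectrum can be scored against a template of that pattern. The sketch below uses a made-up six-band template and a plain Pearson correlation; a real SPHEREx analysis across 102 bands is far more involved.

```python
# Toy illustration of spectral-template matching across wavelength bands.
# The template, band values, and threshold are all invented for this example.
import math

def correlate(observed, template):
    """Pearson correlation between an observed spectrum and a template,
    each a list of brightness values sampled in the same wavelength bands."""
    n = len(observed)
    mo = sum(observed) / n
    mt = sum(template) / n
    num = sum((o - mo) * (t - mt) for o, t in zip(observed, template))
    den = math.sqrt(sum((o - mo) ** 2 for o in observed) *
                    sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0

# Hypothetical 6-band example: an absorption dip in bands 2-3,
# the kind of feature a frozen-water template might encode.
ice_template = [1.0, 1.0, 0.6, 0.5, 1.0, 1.0]
observed     = [0.9, 1.0, 0.55, 0.5, 0.95, 1.0]
score = correlate(observed, ice_template)   # near 1.0 suggests a match
```

With many bands instead of six, the same scoring idea sharpens: more bands mean a molecule’s dip-and-peak pattern is harder to confuse with another molecule’s.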
This animation shows how NASA’s SPHEREx observatory will map the entire sky — a process it will complete four times over its two-year mission. The telescope will observe every point in the sky in 102 different infrared wavelengths, more than any other all-sky survey. SPHEREx’s openly available data will enable a wide variety of astronomical studies. Credit: NASA/JPL-Caltech
The SPHEREx science team will also use the mission’s data to study the physics that drove the universe’s expansion following the big bang, and to measure the amount of light emitted by all the galaxies in the universe over time. Releasing SPHEREx data in a public archive encourages far more astronomical studies than the team could do on its own.
“By making the data public, we enable the whole astronomy community to use SPHEREx data to work on all these other areas of science,” Akeson said.
NASA is committed to the sharing of scientific data, promoting transparency and efficiency in scientific research. In line with this commitment, data from SPHEREx appears in the public archive within 60 days after the telescope collects each observation. The short delay allows the SPHEREx team to process the raw data to remove or flag artifacts, account for detector effects, and align the images to the correct astronomical coordinates.
The team publishes the procedures they used to process the data alongside the actual data products. “We want enough information in those files that people can do their own research,” Akeson said.
One of the early test images captured by NASA’s SPHEREx mission in April 2025. This image shows a section of sky in one infrared wavelength, or color, that is invisible to the human eye but is represented here in a visible color. This particular wavelength (3.29 microns) reveals a cloud of dust made of a molecule similar to soot or smoke. Credit: NASA/JPL-Caltech
This image from NASA’s SPHEREx shows the same region of space in a different infrared wavelength (0.98 microns), once again represented by a color that is visible to the human eye. The dust cloud has vanished because the molecules that make up the dust — polycyclic aromatic hydrocarbons — do not radiate light in this color. Credit: NASA/JPL-Caltech
During its two-year prime mission, SPHEREx will survey the entire sky twice a year, creating four all-sky maps. After the mission reaches the one-year mark, the team plans to release a map of the whole sky at all 102 wavelengths.
In addition to the science enabled by SPHEREx itself, the telescope unlocks an even greater range of astronomical studies when paired with other missions. Data from SPHEREx can be used to identify interesting targets for further study by NASA’s James Webb Space Telescope, refine exoplanet parameters collected from NASA’s TESS (Transiting Exoplanet Survey Satellite), and study the properties of dark matter and dark energy along with ESA’s (European Space Agency’s) Euclid mission and NASA’s upcoming Nancy Grace Roman Space Telescope.
The SPHEREx mission’s all-sky survey will complement data from other NASA space telescopes. SPHEREx is illustrated second from the right. The other telescope illustrations are, from left to right: the Hubble Space Telescope, the retired Spitzer Space Telescope, the retired WISE/NEOWISE mission, the James Webb Space Telescope, and the upcoming Nancy Grace Roman Space Telescope. Credit: NASA/JPL-Caltech
The IPAC archive that hosts SPHEREx data, IRSA (NASA/IPAC Infrared Science Archive), also hosts pointed observations and all-sky maps at a variety of wavelengths from previous missions. The large amount of data available through IRSA gives users a comprehensive view of the astronomical objects they want to study.
“SPHEREx is part of the entire legacy of NASA space surveys,” said IRSA Science Lead Vandana Desai. “People are going to use the data in all kinds of ways that we can’t imagine.”
NASA’s Office of the Chief Science Data Officer leads open science efforts for the agency. Public sharing of scientific data, tools, research, and software maximizes the impact of NASA’s science missions. To learn more about NASA’s commitment to transparency and reproducibility of scientific research, visit science.nasa.gov/open-science. To get more stories about the impact of NASA’s science data delivered directly to your inbox, sign up for the NASA Open Science newsletter.
By Lauren Leese
Web Content Strategist for the Office of the Chief Science Data Officer
More About SPHEREx
The SPHEREx mission is managed by NASA’s Jet Propulsion Laboratory for the agency’s Astrophysics Division within the Science Mission Directorate at NASA Headquarters. BAE Systems in Boulder, Colorado, built the telescope and the spacecraft bus. The science analysis of the SPHEREx data will be conducted by a team of scientists located at 10 institutions in the U.S., two in South Korea, and one in Taiwan. Caltech in Pasadena managed and integrated the instrument. The mission’s principal investigator is based at Caltech with a joint JPL appointment. Data will be processed and archived at IPAC at Caltech. The SPHEREx dataset will be publicly available at the NASA-IPAC Infrared Science Archive. Caltech manages JPL for NASA.
To learn more about SPHEREx, visit:
https://nasa.gov/SPHEREx
Media Contacts
Calla Cofield
Jet Propulsion Laboratory, Pasadena, Calif.
626-808-2469
calla.e.cofield@jpl.nasa.gov
Amanda Adams
Office of the Chief Science Data Officer
256-683-6661
amanda.m.adams@nasa.gov
Details
Last Updated: Jul 02, 2025
Related Terms: Open Science, Astrophysics, Galaxies, Jet Propulsion Laboratory, SPHEREx (Spectro-Photometer for the History of the Universe and Ices Explorer), The Search for Life, The Universe
Explore More
3 min read Discovery Alert: Flaring Star, Toasted Planet
Article
4 hours ago
11 min read 3 Years of Science: 10 Cosmic Surprises from NASA’s Webb Telescope
Article
5 hours ago
7 min read A New Alloy is Enabling Ultra-Stable Structures Needed for Exoplanet Discovery
Article
1 day ago
Keep Exploring: Discover More Topics From NASA
Missions
Humans in Space
Climate Change
Solar System
-
By NASA
An artist’s concept of NASA’s Orion spacecraft orbiting the Moon while using laser communications technology through the Orion Artemis II Optical Communications System. Credit: NASA/Dave Ryan
As NASA prepares for its Artemis II mission, researchers at the agency’s Glenn Research Center in Cleveland are collaborating with The Australian National University (ANU) to prove inventive, cost-saving laser communications technologies in the lunar environment.
Communicating in space usually relies on radio waves, but NASA is exploring laser, or optical, communications, which can send data to the ground 10 to 100 times faster than radio. Instead of radio signals, these systems use infrared light to transmit high-definition video, pictures, voice, and science data across vast distances in less time. NASA has proven laser communications during previous technology demonstrations, but Artemis II will be the first crewed mission to attempt using lasers to transmit data from deep space.
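To make the quoted speedup concrete, here is a back-of-envelope comparison using purely hypothetical link rates (the article does not state the actual Artemis II data rates), applying the "10 to 100 times faster" claim at its upper end.

```python
# Back-of-envelope downlink comparison. Both rates below are invented
# for illustration; only the 100x speedup factor comes from the article.

RADIO_MBPS = 2.0                  # hypothetical deep-space radio rate
OPTICAL_MBPS = RADIO_MBPS * 100   # upper end of the quoted 10-100x speedup

def download_seconds(size_megabits, rate_mbps):
    """Time to downlink a payload of the given size at the given rate."""
    return size_megabits / rate_mbps

video_megabits = 8000.0           # a hypothetical ~1 GB 4K video clip
radio_time = download_seconds(video_megabits, RADIO_MBPS)      # 4000 s
optical_time = download_seconds(video_megabits, OPTICAL_MBPS)  # 40 s
```

Under these assumed numbers, a clip that ties up a radio link for over an hour comes down over the optical link in under a minute, which is why optical links matter for high-definition video from deep space.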
To support this effort, researchers working on the agency’s Real Time Optical Receiver (RealTOR) project have developed a cost-effective laser transceiver using commercial-off-the-shelf parts. Earlier this year, NASA Glenn engineers built and tested a replica of the system at the center’s Aerospace Communications Facility, and they are now working with ANU to build a system with the same hardware models to prepare for the university’s Artemis II laser communications demo.
“Australia’s upcoming lunar experiment could showcase the capability, affordability, and reproducibility of the deep space receiver engineered by Glenn,” said Jennifer Downey, co-principal investigator for the RealTOR project at NASA Glenn. “It’s an important step in proving the feasibility of using commercial parts to develop accessible technologies for sustainable exploration beyond Earth.”
During Artemis II, which is scheduled for early 2026, NASA will fly an optical communications system aboard the Orion spacecraft, which will test using lasers to send data across the cosmos. During the mission, NASA will attempt to transmit recorded 4K ultra-high-definition video, flight procedures, pictures, science data, and voice communications from the Moon to Earth.
An artist’s concept of the optical communications ground station at Mount Stromlo Observatory in Canberra, Australia, using laser communications technology. Credit: The Australian National University
Nearly 10,000 miles from Cleveland, ANU researchers working at the Mount Stromlo Observatory ground station hope to receive data during Orion’s journey around the Moon using the Glenn-developed transceiver model. This ground station will serve as a test location for the new transceiver design and will not be one of the mission’s primary ground stations. If the test is successful, it will prove that commercial parts can be used to build affordable, scalable space communication systems for future missions to the Moon, Mars, and beyond.
“Engaging with The Australian National University to expand commercial laser communications offerings across the world will further demonstrate how this advanced satellite communications capability is ready to support the agency’s networks and missions as we set our sights on deep space exploration,” said Marie Piasecki, technology portfolio manager for NASA’s Space Communications and Navigation (SCaN) Program.
As NASA continues to investigate the feasibility of using commercial parts to engineer ground stations, Glenn researchers will continue to provide critical support in preparation for Australia’s demonstration.
Strong global partnerships advance technology breakthroughs and are instrumental as NASA expands humanity’s reach from the Moon to Mars, while fueling innovations that improve life on Earth. Through Artemis, NASA will send astronauts to explore the Moon for scientific discovery and economic benefits, and to build the foundation for the first crewed missions to Mars.
The Real Time Optical Receiver (RealTOR) team poses for a group photo in the Aerospace Communications Facility at NASA’s Glenn Research Center in Cleveland on Friday, Dec. 13, 2024. From left to right: Peter Simon, Sarah Tedder, John Clapham, Elisa Jager, Yousef Chahine, Michael Marsden, Brian Vyhnalek, and Nathan Wilson. Credit: NASA
The RealTOR project is one aspect of the optical communications portfolio within NASA’s SCaN Program, which includes demonstrations and in-space experiment platforms to test the viability of infrared light for sending data to and from space. These include the LCOT (Low-Cost Optical Terminal) project, the Laser Communications Relay Demonstration, and more. NASA Glenn manages the project under the direction of the agency’s SCaN Program at NASA Headquarters in Washington.
The Australian National University’s demonstration is supported by the Australian Space Agency Moon to Mars Demonstrator Mission Grant program, which has facilitated operational capability for the Australian Deep Space Optical Ground Station Network.
To learn how space communications and navigation capabilities support every agency mission, visit:
https://www.nasa.gov/communicating-with-missions
Explore More
3 min read: NASA Engineers Simulate Lunar Lighting for Artemis III Moon Landing (Article, 1 week ago)
2 min read: NASA Seeks Commercial Feedback on Space Communication Solutions (Article, 1 week ago)
4 min read: NASA, DoD Practice Abort Scenarios Ahead of Artemis II Moon Mission (Article, 2 weeks ago)
-
By NASA
Heading into a recent staff meeting for Johnson Space Center’s Business Development & Technology Integration Office, Jason Foster anticipated a typical agenda of team updates and discussion. He did not expect an announcement that he had been named a 2025 Rookie of the Year – Honorable Mention through the Federal Laboratory Consortium’s annual awards program.
Foster was one of only three technology transfer professionals across the federal government to be recognized in the Rookie of the Year category, which is open to early-career individuals with less than three years of experience. “It was definitely a surprise,” he said. “It was quite an honor, because it’s not only representing Johnson Space Center but also NASA.”
Jason Foster recognized at the Federal Laboratory Consortium Award Ceremony as a Rookie of the Year – Honorable Mention. Image courtesy of Jason Foster
Foster is a licensing specialist and New Technology Report (NTR) specialist within Johnson’s Technology Transfer Office in Houston. That team works to ensure that innovations developed for aeronautics and space exploration are made broadly available to the public, maximizing their benefit to the nation. Foster’s role involves both capturing new technologies developed at Johnson and marketing and licensing those technologies to companies that would like to use and further develop them.
He describes much of his work as “technology hunting” – reaching out to branches, offices, and teams across Johnson to teach them about the Technology Transfer Office, NTRs, and the value of technology reporting for NASA and the public. “NTRs are the foundation that allows our office to do our job,” he said. “We need to know about a technology in order to transfer it.”
Jason Foster (left) visited NASA’s White Sands Test Facility in Las Cruces, New Mexico, with his colleague Edgar Castillo as part of the Technology Transfer Office’s work to capture new technology and innovations developed at Johnson and affiliated facilities. Image courtesy of Jason Foster
Foster’s efforts to streamline and strengthen the reporting and patenting of Johnson’s innovations led to his recognition by the consortium. His proactive outreach and relationship-building improved customer service and contributed to 158 NTRs in fiscal year 2024 – the highest number of NTRs disclosed by federal employees at any NASA center. Foster also proposed a three-month NTR sprint, during which he led a team of seven in an intensive exercise to identify and report new technologies. This initiative not only cleared a backlog of leads for the office but also resulted in more than 120 previously undisclosed NTRs. “We are still using that process now as we continue processing NTRs,” Foster said. On top of those achievements, he helped secure the highest recorded number of license agreements with commercial entities in the center’s history, with 41 licenses executed in fiscal year 2024.
“I am very proud of my accomplishments, none of it would be possible without the open-mindedness and continuous support of my incredible team,” Foster said. “They have always provided a space to grow, and actively welcome innovation in our processes and workflows.”
Jason Foster educated Johnson employees about the Technology Transfer Office and the importance of submitting New Technology Reports during the center’s annual Innovation Showcase. Image courtesy of Jason Foster
A self-described “space nerd,” Foster said he always envisioned working at NASA, but not until much later in his career – ideally as an astronaut. He initially planned to pursue an astrophysics degree but discovered a passion for engineering and fused that with his love of space by studying aerospace, aeronautical, and astronautical engineering instead. In his last semester of college at California Polytechnic State University, San Luis Obispo, he landed a Universities Space Research Association internship at Johnson, supporting flight software development for crew exercise systems on the International Space Station and future exploration missions. “I got really involved in the Johnson Space Center team and the work, and I thought, what if I joined NASA now?”
He was hired as a licensing specialist on the Technology Transfer team under the JETS II Contract as an Amentum employee shortly after graduating and continually seeks new opportunities to expand his role and skillsets. “The more I can learn about anything NASA’s doing is incredible,” he said. “I found myself in this perfect position where literally my job is to learn everything there is to learn.”
Jason Foster holding up Aerogel during his visit to the Hypervelocity Impact Testing Laboratory at NASA’s White Sands Test Facility in Las Cruces, New Mexico. The visit was part of the Technology Transfer Office’s work to capture new technology and innovations developed at Johnson and affiliated facilities. Image courtesy of Jason Foster
Foster celebrates three years with NASA this July. In his time at the agency, he has learned the value of getting to know and understand colleagues’ needs in order to help them. Before he meets with someone, he takes time to learn about the organization or team they are a part of, the work they are involved in, and what they might discuss. It is also important to determine how each person prefers to communicate and collaborate. “Doing your homework pays dividends,” Foster said. He has found that being as prepared as possible opens doors to more opportunities, and it helps to save valuable time for busy team members.
Jason Foster practices fire spinning on a California beach. Image courtesy of Jason Foster
When he is not technology hunting, you might find Foster practicing the art of fire spinning. He picked up the hobby in college, joining a club that met at local beaches to practice spinning and capturing different geometric patterns through long-exposure photos. “It was kind of a strange thing to get into, but it was really fun,” he said. His love of learning drives his interest in other activities as well. Gardening is a relatively new hobby inspired by a realization that he had never grown anything before.
“It’s a genuine joy, I think, coming across something with curiosity and wanting to learn from it,” he said. “I think it especially helps in my job, where your curiosity switch has to be on at least 90% of the time.”
Explore More
4 min read: Laser Focused: Keith Barr Leads Orion’s Lunar Docking Efforts (Article, 6 days ago)
4 min read: Johnson’s Paige Whittington Builds a Symphony of Simulations (Article, 3 weeks ago)
9 min read: Station Nation: Meet Megan Harvey, Utilization Flight Lead and Capsule Communicator (Article, 4 weeks ago)
-
By NASA
NASA named Stanford University of Stanford, California, the winner of the Lunar Autonomy Challenge, a six-month competition for U.S. college and university student teams to virtually map and explore a simulated lunar environment using a digital twin of NASA’s In-Situ Resource Utilization Pilot Excavator (IPEx).
The winning team successfully demonstrated the design and functionality of their autonomous agent, or software that performs specified actions without human intervention. Their agent autonomously navigated the IPEx digital twin in the virtual lunar environment, while accurately mapping the surface, correctly identifying obstacles, and effectively managing available power.
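As a rough illustration of what such an agent has to juggle, the sketch below explores a tiny grid map while avoiding marked obstacles and refusing moves that would drain its battery below a reserve. Every name, cost, and threshold here is invented for illustration; the actual challenge agents were far more sophisticated.

```python
# Minimal sketch of an explore/map/avoid/power-manage loop like the one
# the article describes. Entirely illustrative, not challenge code.

def run_agent(grid, start, battery, move_cost=1.0, reserve=2.0):
    """Explore a 2D grid (0 = clear, 1 = obstacle), building a visited map.

    Returns the mapped cells and remaining battery; the agent refuses
    any move that would dip below its power reserve.
    """
    rows, cols = len(grid), len(grid[0])
    visited = {start}
    frontier = [start]
    while frontier and battery - move_cost >= reserve:
        r, c = frontier.pop()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in visited
                    and grid[nr][nc] == 0              # obstacle avoidance
                    and battery - move_cost >= reserve):
                battery -= move_cost                   # power management
                visited.add((nr, nc))
                frontier.append((nr, nc))
    return visited, battery

# A 3x3 patch with two obstacle cells; the agent maps every clear cell.
lunar_patch = [[0, 0, 1],
               [0, 1, 0],
               [0, 0, 0]]
mapped, power_left = run_agent(lunar_patch, start=(0, 0), battery=10.0)
```

The two inline checks mirror the article's description: the obstacle test keeps the map accurate, and the reserve test ensures the agent stops exploring before it runs out of power.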
Lunar simulation developed by the Lunar Autonomy Challenge’s first-place team from Stanford University. Credit: Stanford University’s NAV Lab team
Team photo of the NAV Lab Lunar Autonomy Challenge team from Stanford University. Credit: Stanford University’s NAV Lab team
“The Lunar Autonomy Challenge has been a truly unique experience. The challenge provided the opportunity to develop and test methods in a highly realistic simulation environment.”
Adam Dai
Lunar Autonomy Challenge team lead, Stanford University
Dai added, “It pushed us to find solutions robust to the harsh conditions of the lunar surface. I learned so much through the challenge, both about new ideas and methods, as well as through deepening my understanding of core methods across the autonomy stack (perception, localization, mapping, planning). I also very much enjoyed working together with my team to brainstorm different approaches and strategies and solve tangible problems observed in the simulation.”
The challenge offered 31 teams a valuable opportunity to gain experience in software development, autonomy, and machine learning using cutting-edge NASA lunar technology. Participants also applied essential skills common to nearly every engineering discipline, including technical writing, collaborative teamwork, and project management.
The Lunar Autonomy Challenge supports NASA’s Lunar Surface Innovation Initiative (LSII), which is part of the Space Technology Mission Directorate. The LSII aims to accelerate technology development and pursue results that will provide essential infrastructure for lunar exploration by collaborating with industry, academia, and other government agencies.
“The work displayed by all of these teams has been impressive, and the solutions they have developed are beneficial to advancing lunar and Mars surface technologies as we prepare for increasingly complex missions farther from home.”
Niki Werkheiser
Director of Technology Maturation and LSII lead, NASA Headquarters
“To succeed, we need input from everyone — every idea counts to propel our goals forward. It is very rewarding to see these students and software developers contributing their skills to future lunar and Mars missions,” Werkheiser added.
Through the Lunar Autonomy Challenge, NASA collaborated with the Johns Hopkins Applied Physics Laboratory, Caterpillar Inc., and Embodied AI. Each team contributed unique expertise and tools necessary to make the challenge a success.
The Applied Physics Laboratory managed the challenge for NASA. As a systems integrator for LSII, they provided expertise to streamline rigor and engineering discipline across efforts, ensuring the development of successful, efficient, and cost-effective missions — backed by the world’s largest cohort of lunar scientists.
Caterpillar Inc. is known for its construction and excavation equipment and operates a large fleet of autonomous haul trucks. They also have worked with NASA for more than 20 years on a variety of technologies, including autonomy, 3D printing, robotics, and simulators as they continue to collaborate with NASA on technologies that support NASA’s mission objectives and provide value to the mining and construction industries.
Embodied AI collaborated with Caterpillar to integrate the simulation into the open-source driving environment used for the challenge. For the Lunar Autonomy Challenge, the normally available digital assets of the CARLA simulation platform, such as urban layouts, buildings, and vehicles, were replaced by an IPEx “Digital Twin” and lunar environmental models.
“This collaboration is a great example of how the government, large companies, small businesses, and research institutions can thoughtfully leverage each other’s different, but complementary, strengths,” Werkheiser added. “By substantially modernizing existing tools, we can turn today’s novel technologies into tomorrow’s institutional capabilities for more efficient and effective space exploration, while also stimulating innovation and economic growth on Earth.”
FINALIST TEAMS
First Place
NAV Lab team
Stanford University, Stanford, California
Second Place
MAPLE (MIT Autonomous Pathfinding for Lunar Exploration) team
Massachusetts Institute of Technology, Cambridge, Massachusetts
Third Place
Moonlight team
Carnegie Mellon University, Pittsburgh, Pennsylvania
OTHER COMPETING TEAMS
Lunar Explorers – Arizona State University, Tempe, Arizona
AIWVU – West Virginia University, Morgantown, West Virginia
Stellar Sparks – California Polytechnic Institute Pomona, Pomona, California
LunatiX – Johns Hopkins University Whiting School of Engineering, Baltimore
CARLA CSU – California State University, Stanislaus, Turlock, California
Rose-Hulman – Rose-Hulman Institute of Technology, Terre Haute, Indiana
Lunar Pathfinders – American Public University System, Charles Town, West Virginia
Lunar Autonomy Challenge digital simulation of lunar surface activity using a digital twin of NASA’s ISRU Pilot Excavator. Credit: Johns Hopkins Applied Physics Laboratory
Keep Exploring: Discover More Topics From NASA
Space Technology Mission Directorate
NASA’s Lunar Surface Innovation Initiative
Game Changing Development Projects
Game Changing Development projects aim to advance space technologies, focusing on capabilities for traveling to and living in space.
ISRU Pilot Excavator