  • Similar Topics

    • By European Space Agency
      Europe’s forests play a crucial role in removing carbon dioxide from the atmosphere, but research led by the European Commission’s Joint Research Centre has found that their capacity to absorb it has declined over the past decade.
      View the full article
    • By NASA
      Cloud cover can keep optical instruments on satellites from clearly capturing Earth’s surface. Still in testing, JPL’s Dynamic Targeting uses AI to avoid imaging clouds, yielding a higher proportion of usable data, and to focus on phenomena like this 2015 volcanic eruption in Indonesia captured by Landsat 8. Credit: NASA/USGS
      A technology called Dynamic Targeting could enable spacecraft to decide, autonomously and within seconds, where to best make science observations from orbit.
      In a recent test, NASA showed how artificial intelligence-based technology could help orbiting spacecraft provide more targeted and valuable science data. The technology enabled an Earth-observing satellite for the first time to look ahead along its orbital path, rapidly process and analyze imagery with onboard AI, and determine where to point an instrument. The whole process took less than 90 seconds, without any human involvement.
      Called Dynamic Targeting, the concept has been in development for more than a decade at NASA’s Jet Propulsion Laboratory in Southern California. The first of a series of flight tests occurred aboard a commercial satellite in mid-July. The goal: to show the potential of Dynamic Targeting to enable orbiters to improve ground imaging by avoiding clouds and also to autonomously hunt for specific, short-lived phenomena like wildfires, volcanic eruptions, and rare storms.
      This graphic shows how JPL’s Dynamic Targeting uses a look-ahead sensor to see what’s on a satellite’s upcoming path. Onboard algorithms process the sensor’s data, identifying clouds to avoid and targets of interest for closer observation as the satellite passes overhead. Credit: NASA/JPL-Caltech
      “The idea is to make the spacecraft act more like a human: Instead of just seeing data, it’s thinking about what the data shows and how to respond,” said Steve Chien, a technical fellow in AI at JPL and principal investigator for the Dynamic Targeting project. “When a human sees a picture of trees burning, they understand it may indicate a forest fire, not just a collection of red and orange pixels. We’re trying to make the spacecraft have the ability to say, ‘That’s a fire,’ and then focus its sensors on the fire.”
      Avoiding Clouds for Better Science
      This first flight test for Dynamic Targeting wasn’t hunting specific phenomena like fires — that will come later. Instead, the point was avoiding an omnipresent phenomenon: clouds.
      Most science instruments on orbiting spacecraft look down at whatever is beneath them. However, for Earth-observing satellites with optical sensors, clouds can get in the way as much as two-thirds of the time, blocking views of the surface. To overcome this, Dynamic Targeting looks 300 miles (500 kilometers) ahead and has the ability to distinguish between clouds and clear sky. If the scene is clear, the spacecraft images the surface when passing overhead. If it’s cloudy, the spacecraft cancels the imaging activity to save data storage for another target.
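      To make that decision logic concrete, here is a minimal Python sketch of the kind of image-or-skip choice described above. It is an illustration only, not the flight software: the cloud mask, the should_image function, and the 50 percent cloud-fraction cutoff are all assumptions made for the example.

        import numpy as np

        # Assumed cutoff for "too cloudy to image"; the real threshold is not published here.
        CLOUD_FRACTION_LIMIT = 0.5

        def should_image(cloud_mask: np.ndarray) -> bool:
            """Decide whether to image the upcoming ground scene.

            cloud_mask: 2-D array of 0/1 values, 1 where an onboard
            classifier flagged a cloudy pixel in the look-ahead frame.
            """
            cloud_fraction = float(cloud_mask.mean())
            return cloud_fraction < CLOUD_FRACTION_LIMIT

        # A mostly clear look-ahead frame -> image the scene; a mostly cloudy
        # one -> skip and save data storage for another target.
        clear_frame = np.zeros((64, 64), dtype=np.uint8)
        cloudy_frame = np.ones((64, 64), dtype=np.uint8)
        print(should_image(clear_frame))   # True
        print(should_image(cloudy_frame))  # False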
      “If you can be smart about what you’re taking pictures of, then you only image the ground and skip the clouds. That way, you’re not storing, processing, and downloading all this imagery researchers really can’t use,” said Ben Smith of JPL, an associate with NASA’s Earth Science Technology Office, which funds the Dynamic Targeting work. “This technology will help scientists get a much higher proportion of usable data.”
      How Dynamic Targeting Works
      The testing is taking place on CogniSAT-6, a briefcase-size CubeSat that launched in March 2024. The satellite — designed, built, and operated by Open Cosmos — hosts a payload designed and developed by Ubotica featuring a commercially available AI processor. While working with Ubotica in 2022, Chien’s team conducted tests aboard the International Space Station running algorithms similar to those in Dynamic Targeting on the same type of processor. The results showed the combination could work for space-based remote sensing.
      Since CogniSAT-6 lacks an imager dedicated to looking ahead, the spacecraft tilts forward 40 to 50 degrees to point its optical sensor, a camera that sees both visible and near-infrared light. Once look-ahead imagery has been acquired, Dynamic Targeting’s advanced algorithm, trained to identify clouds, analyzes it. Based on that analysis, the Dynamic Targeting planning software determines where to point the sensor for cloud-free views. Meanwhile, the satellite tilts back toward nadir (looking directly below the spacecraft) and snaps the planned imagery, capturing only the ground.
      This all takes place in 60 to 90 seconds, depending on the original look-ahead angle, as the spacecraft speeds in low Earth orbit at nearly 17,000 mph (7.5 kilometers per second).
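      Those two figures are consistent with each other, as a back-of-the-envelope check shows: dividing the quoted look-ahead distance by the quoted orbital speed gives roughly a minute of lead time. The sketch below treats the orbital speed as the ground speed, which is a simplification.

        # Rough timing check using only the numbers quoted in the article.
        look_ahead_km = 500.0      # how far ahead the look-ahead view reaches
        orbital_speed_km_s = 7.5   # quoted low-Earth-orbit speed

        lead_time_s = look_ahead_km / orbital_speed_km_s
        print(f"Look-ahead scene reaches nadir in ~{lead_time_s:.0f} s")  # ~67 s
        # That ~67 s is the budget the onboard analysis and pointing plan must
        # fit inside, matching the 60-to-90-second window described above.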
      What’s Next
      With the cloud-avoidance capability now proven, the next test will be hunting for storms and severe weather — essentially targeting clouds instead of avoiding them. Another test will be to search for thermal anomalies like wildfires and volcanic eruptions. The JPL team developed unique algorithms for each application.
      “This initial deployment of Dynamic Targeting is a hugely important step,” Chien said. “The end goal is operational use on a science mission, making for a very agile instrument taking novel measurements.”
      There are multiple visions for how that could happen — possibly even on spacecraft exploring the solar system. In fact, Chien and his JPL colleagues drew some inspiration for their Dynamic Targeting work from another project they had also worked on: using data from ESA’s (the European Space Agency’s) Rosetta orbiter to demonstrate the feasibility of autonomously detecting and imaging plumes emitted by comet 67P/Churyumov-Gerasimenko.
      On Earth, adapting Dynamic Targeting for use with radar could allow scientists to study dangerous extreme winter weather events called deep convective ice storms, which are too rare and short-lived to closely observe with existing technologies. Specialized algorithms would identify these dense storm formations with a satellite’s look-ahead instrument. Then a powerful, focused radar would pivot to keep the ice clouds in view, “staring” at them as the spacecraft speeds by overhead and gathers a bounty of data over six to eight minutes.
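      A quick, illustrative calculation shows why such a radar would have to pivot rather than stare straight down. It assumes the same low-Earth-orbit speed quoted earlier for CogniSAT-6, which may not match an eventual radar mission.

        # Along-track distance covered during a 6-to-8-minute stare (illustrative).
        orbital_speed_km_s = 7.5   # assumed, carried over from the earlier figure

        for stare_minutes in (6, 8):
            along_track_km = orbital_speed_km_s * stare_minutes * 60
            print(f"{stare_minutes} min stare -> ~{along_track_km:.0f} km of along-track motion")
        # Roughly 2,700 to 3,600 km: the spacecraft moves this far while keeping
        # the same storm in view, hence a radar that pivots to track it.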
      Some ideas involve using Dynamic Targeting on multiple spacecraft: The results of onboard image analysis from a leading satellite could be rapidly communicated to a trailing satellite, which could be tasked with targeting specific phenomena. The data could even be fed to a constellation of dozens of orbiting spacecraft. Chien is leading a test of that concept, called Federated Autonomous MEasurement, beginning later this year.
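      A hedged sketch of how such a leader-to-trailer hand-off might look in code follows. Every name and field here is invented for illustration; the article does not describe the message format or tasking logic.

        from dataclasses import dataclass

        @dataclass
        class DetectionTip:
            """Hypothetical compact detection produced by the leading satellite's onboard AI."""
            lat_deg: float     # where the leader saw something of interest
            lon_deg: float
            label: str         # e.g. "wildfire", "volcanic plume", "severe storm"
            confidence: float  # classifier confidence from the onboard analysis

        def task_trailing_satellite(tips: list[DetectionTip], min_confidence: float = 0.8):
            """Return the subset of tips worth targeting on the trailing satellite's pass."""
            return [t for t in tips if t.confidence >= min_confidence]

        tips = [DetectionTip(-8.1, 115.5, "volcanic plume", 0.93),
                DetectionTip(34.2, -118.2, "wildfire", 0.55)]
        print(task_trailing_satellite(tips))  # only the high-confidence detection is tasked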
      News Media Contact
      Melissa Pamer
      Jet Propulsion Laboratory, Pasadena, Calif.
      626-314-4928
      melissa.pamer@jpl.nasa.gov
      2025-094
      View the full article
    • By NASA
      A team works together on their project during the 2024 NASA Space Apps Challenge event in Arequipa, Peru. Teams have two days to respond to the challenges and submit their project for the chance to win one of 10 global awards.
      NASA invites innovators of all ages to register for the NASA Space Apps Challenge, held on Oct. 4-5. The 2025 theme is Learn, Launch, Lead, and participants will work alongside a vibrant community of scientists, technologists, and storytellers at more than 450 events worldwide. Participants can expect to learn skills to succeed in STEM fields, launch ideas that transform NASA’s open data into actionable tools, and lead their communities in driving technological innovation.
       
      During the NASA Space Apps Challenge, participants in the U.S. and around the world gather at hundreds of in-person and virtual events to address challenges authored by subject matter experts across NASA divisions. These challenges range in complexity and topic, tasking participants with everything from creating machine learning models and leveraging artificial intelligence to improving access to NASA research, designing sustainable recycling systems for Mars, and developing tools to evaluate local air quality here on Earth.
       
      Dr. Yoseline Angel Lopez, a former Space Apps Challenge winner and now an assistant research scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, can attest that the opportunity to Learn, Launch, Lead goes far beyond the hackathon.
       
      “The NASA Space Apps Challenge gave me and my team a meaningful opportunity to apply science to real-world problems and gain validation from NASA scientists and industry experts,” said Angel.
       
      In 2021, her team’s winning web-app prototype was adopted by Colombia’s Ministry of Agriculture, connecting smallholder farmers with local buyers. The platform also supported agricultural land-use monitoring using satellite imagery.
       
      After the hackathon, project submissions are judged by NASA and space agency experts. Winners are selected for one of 10 global awards.
       
      “Participating in the hackathon is exciting on its own. But when your project can lead to greater opportunities and make a difference in your community, that’s a dream come true,” said Angel. She will return to the 2025 hackathon as a NASA subject matter expert and challenge author, giving a Golden Age of innovators the opportunity to make a difference in their communities through the use of data from NASA and 14 space agency partners.
       
      This year’s partners include: Bahrain Space Agency; Brazilian Space Agency; CSA (Canadian Space Agency); ESA (European Space Agency); ISRO (Indian Space Research Organisation); Italian Space Agency; JAXA (Japan Aerospace Exploration Agency); Mohammed Bin Rashid Space Centre of the United Arab Emirates; National Space Activities Commission of Argentina;  Paraguayan Space Agency; South African National Space Agency; Spanish Space Agency; Turkish Space Agency; and the UK Space Agency.
       
      NASA Space Apps is funded by NASA’s Earth Science Division through a contract with Booz Allen Hamilton, Mindgrub, and SecondMuse.
       
      We invite you to register for the 2025 NASA Space Apps Challenge and choose a virtual or in-person event near you at:

      https://www.spaceappschallenge.org
      Find videos about Space Apps at:
      youtube.com/c/NASASpaceAppsChallenge
      Social Media
      Stay up to date with #SpaceApps by following these accounts:
      @spaceappschallenge (Facebook), @SpaceApps, and @nasa_spaceapps (Instagram)
      View the full article
    • By Space Force
      Registration is now open for the United States Space Force’s second annual Artificial Intelligence Challenge.
      View the full article
    • By NASA
      At COSI’s Big Science Celebration on Sunday, May 4, 2025, a young visitor uses one of NASA Glenn Research Center’s virtual reality headsets to immerse herself in a virtual environment. Credit: NASA/Lily Hammel
      NASA’s Glenn Research Center joined the Center of Science and Industry (COSI) Big Science Celebration on the museum’s front lawn in Columbus, Ohio, on May 4. This event centered on science activities by STEM professionals, researchers, and experts from Central Ohio — and despite chilly, damp weather, it drew more than 20,000 visitors.
      At COSI’s Big Science Celebration on Sunday, May 4, 2025, a young visitor steps out of the rain and into NASA Glenn Research Center’s booth to check out the Graphics and Visualization Lab’s augmented reality fluid flow table that allows users to virtually explore a model of the International Space Station. Credit: NASA/Lily Hammel
      NASA’s 10-by-80-foot tent housed a variety of information booths and hands-on demonstrations to introduce guests to the vital research being performed at the Cleveland center. Popular attractions included a mini wind tunnel and multiple augmented and virtual reality demonstrations. Visitors also engaged through tangram puzzles and a cosmic selfie station. NASA Glenn’s astronaut mascot made several appearances to the delight of young and old alike.
      View the full article