How do satellites monitor the ozone layer?
-
Similar Topics
-
By NASA
NASA science and American industry have worked hand-in-hand for more than 60 years, transforming novel technologies created with NASA research into commercial products like cochlear implants, memory-foam mattresses, and more. Now, a NASA-funded device for probing the interior of storm systems has been made a key component of commercial weather satellites.
The novel atmospheric sounder was originally developed for NASA’s TROPICS (short for Time-Resolved Observations of Precipitation structure and storm Intensity with a Constellation of SmallSats), which launched in 2023. Boston-based weather technology company Tomorrow.io integrated the same instrument design into some of its satellites.
NASA’s TROPICS instrument. TROPICS pioneered a novel, compact atmospheric sounder now flying aboard a fleet of commercial small satellites created by the weather technology company Tomorrow.io. Credit: Blue Canyon Technologies
Atmospheric sounders allow researchers to gather data describing humidity, temperature, and wind speed — important factors for weather forecasting and atmospheric analysis. From low-Earth orbit, these devices help make air travel safer, shipping more efficient, and severe weather warnings more reliable.
Novel Tools for Observing Storm Systems
In the early 2000s, meteorologists and atmospheric chemists were eager to find a new science tool that could peer deep inside storm systems and do so multiple times a day. At the same time, CubeSat constellations (groupings of satellites each no larger than a shoebox) were emerging as promising, low-cost platforms for increasing the frequency with which individual sensors could pass over fast-changing storms, which improves the accuracy of weather models.
The challenge was to create an instrument small enough to fit aboard a satellite the size of a toaster, yet powerful enough to observe the innermost mechanisms of storm development. Preparing these technologies required years of careful development that was primarily supported by NASA’s Earth Science Division.
William Blackwell and his team at MIT Lincoln Laboratory in Cambridge, Massachusetts, accepted this challenge and set out to miniaturize vital components of atmospheric sounders. “These were instruments the size of a washing machine, flying on platforms the size of a school bus,” said Blackwell, the principal investigator for TROPICS. “How in the world could we shrink them down to the size of a coffee mug?”
With a 2010 award from NASA’s Earth Science Technology Office (ESTO), Blackwell’s team created an ultra-compact microwave receiver, a component that can sense the microwave radiation within the interior of storms.
The Lincoln Lab receiver weighed about a pound and took up less space than a hockey puck. This innovation paved the way for a complete atmospheric sounder instrument small enough to fly aboard a CubeSat. “The hardest part was figuring out how to make a compact back-end to this radiometer,” Blackwell said. “So without ESTO, this would not have happened. That initial grant was critical.”
In 2023, that atmospheric sounder was sent into space aboard four TROPICS CubeSats, which have been collecting torrents of data on the interior of severe storms around the world.
Transition to Industry
By the time TROPICS launched, Tomorrow.io developers knew they wanted Blackwell’s microwave receiver technology aboard their own fleet of commercial weather satellites. “We looked at two or three different options, and TROPICS was the most capable instrument of those we looked at,” said Joe Munchak, a senior atmospheric data scientist at Tomorrow.io.
In 2022, the company worked with Blackwell to adapt his team’s design into a CubeSat platform about twice the size of the one used for TROPICS. A bigger platform, Blackwell explained, meant they could bolster the sensor’s capabilities.
“When we first started conceptualizing this, the 3-unit CubeSat was the only game in town. Now we’re using a 6-unit CubeSat, so we have room for onboard calibration,” which improves the accuracy and reliability of gathered data, Blackwell said.
Tomorrow.io’s first atmospheric sounders, Tomorrow-S1 and Tomorrow-S2, launched in 2024. By the end of 2025, the company plans to have a full constellation of atmospheric sounders in orbit. The company also has two radar instruments that were launched in 2023 and were influenced by NASA’s RainCube instrument — the first CubeSat equipped with an active precipitation radar.
More CubeSats lead to more accurate weather data because there are more opportunities each day — known as revisits — to collect data. “With a fleet size of 18, we can easily get our revisit rate down to under an hour, maybe even 40 to 45 minutes in most places. It has a huge impact on short-term forecasts,” Munchak said.
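As a rough check on how that scales, dividing the minutes in a day by the number of usable passes a constellation makes over a region gives an approximate mean revisit interval. The Python sketch below illustrates the arithmetic under assumed parameters (an evenly phased fleet and roughly two usable passes per satellite per day over a given region); it ignores swath geometry and latitude effects, so it shows the scaling only, not Tomorrow.io’s actual coverage model.

```python
# Rough, simplified estimate of how mean revisit time scales with fleet size.
# Assumes an evenly phased constellation in which each satellite makes about
# two usable passes over a given region per day (an assumption, not a figure
# from the article). Swath width and orbital-plane geometry are ignored.

MINUTES_PER_DAY = 24 * 60
PASSES_PER_SAT_PER_DAY = 2  # assumed

def mean_revisit_minutes(num_satellites: int) -> float:
    """Approximate average gap between observations of one region."""
    passes_per_day = num_satellites * PASSES_PER_SAT_PER_DAY
    return MINUTES_PER_DAY / passes_per_day

for fleet_size in (4, 18):
    print(f"{fleet_size} satellites -> ~{mean_revisit_minutes(fleet_size):.0f} min mean revisit")
# 4 satellites  -> ~180 min
# 18 satellites -> ~40 min, roughly matching the 40-45 minute figure quoted above
```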
Having access to an atmospheric sounder that had already flown in space and had more than 10 years of testing was extremely useful as Tomorrow.io planned its fleet. “It would not have been possible to do this nearly as quickly or nearly as affordably had NASA not paved the way,” said Jennifer Splaingard, Tomorrow.io’s senior vice president for space and sensors.
A Cycle of Innovation
The relationship between NASA and industry is symbiotic. NASA and its grantees can drive innovation and test new tools, equipping American businesses with novel technologies they may otherwise be unable to develop on their own. In exchange, NASA gains access to low-cost data sets that can supplement information gathered through its larger science missions.
Tomorrow.io was among eight companies selected by NASA’s Commercial SmallSat Data Acquisition (CSDA) program in September 2024 to equip NASA with data that will help improve weather forecasting models. “It really is a success story of technology transfer. It’s that sweet spot, where the government partners with tech companies to really take an idea, a proven concept, and run with it,” Splaingard said.
By Gage Taylor
NASA’s Goddard Space Flight Center, Greenbelt, Md.
-
By NASA
Cloud cover can keep optical instruments on satellites from clearly capturing Earth’s surface. Still in testing, JPL’s Dynamic Targeting uses AI to avoid imaging clouds, yielding a higher proportion of usable data, and to focus on phenomena like this 2015 volcanic eruption in Indonesia captured by Landsat 8. Credit: NASA/USGS
A technology called Dynamic Targeting could enable spacecraft to decide, autonomously and within seconds, where to best make science observations from orbit.
In a recent test, NASA showed how artificial intelligence-based technology could help orbiting spacecraft provide more targeted and valuable science data. The technology enabled an Earth-observing satellite for the first time to look ahead along its orbital path, rapidly process and analyze imagery with onboard AI, and determine where to point an instrument. The whole process took less than 90 seconds, without any human involvement.
Called Dynamic Targeting, the concept has been in development for more than a decade at NASA’s Jet Propulsion Laboratory in Southern California. The first of a series of flight tests occurred aboard a commercial satellite in mid-July. The goal: to show the potential of Dynamic Targeting to enable orbiters to improve ground imaging by avoiding clouds and also to autonomously hunt for specific, short-lived phenomena like wildfires, volcanic eruptions, and rare storms.
This graphic shows how JPL’s Dynamic Targeting uses a lookahead sensor to see what’s on a satellite’s upcoming path. Onboard algorithms process the sensor’s data, identifying clouds to avoid and targets of interest for closer observation as the satellite passes overhead. Credit: NASA/JPL-Caltech
“The idea is to make the spacecraft act more like a human: Instead of just seeing data, it’s thinking about what the data shows and how to respond,” said Steve Chien, a technical fellow in AI at JPL and principal investigator for the Dynamic Targeting project. “When a human sees a picture of trees burning, they understand it may indicate a forest fire, not just a collection of red and orange pixels. We’re trying to make the spacecraft have the ability to say, ‘That’s a fire,’ and then focus its sensors on the fire.”
Avoiding Clouds for Better Science
This first flight test for Dynamic Targeting wasn’t hunting specific phenomena like fires — that will come later. Instead, the point was avoiding an omnipresent phenomenon: clouds.
Most science instruments on orbiting spacecraft look down at whatever is beneath them. However, for Earth-observing satellites with optical sensors, clouds can get in the way as much as two-thirds of the time, blocking views of the surface. To overcome this, Dynamic Targeting looks 300 miles (500 kilometers) ahead and has the ability to distinguish between clouds and clear sky. If the scene is clear, the spacecraft images the surface when passing overhead. If it’s cloudy, the spacecraft cancels the imaging activity to save data storage for another target.
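In outline, the onboard decision reduces to a simple rule applied to each upcoming scene: image it if the look-ahead view is clear enough, skip it if it is not. The Python sketch below captures only that published behavior; the cloud_fraction input and the CLOUD_THRESHOLD cutoff are hypothetical placeholders standing in for the onboard AI cloud-detection step, whose details are not given here.

```python
# Simplified sketch of the Dynamic Targeting cloud-avoidance decision,
# based only on the behavior described in the article. The cloud-detection
# step is a placeholder; the real system uses an onboard AI model.
from dataclasses import dataclass

@dataclass
class Scene:
    name: str
    cloud_fraction: float  # 0.0 = clear, 1.0 = fully cloudy (assumed output of look-ahead analysis)

CLOUD_THRESHOLD = 0.3  # hypothetical cutoff for "too cloudy to be worth imaging"

def plan_observation(scene: Scene) -> str:
    """Decide whether to image the upcoming scene or skip it to save data storage."""
    if scene.cloud_fraction <= CLOUD_THRESHOLD:
        return f"IMAGE {scene.name}: mostly clear, capture at nadir"
    return f"SKIP {scene.name}: cloudy, save storage for another target"

for scene in (Scene("coastal_pass_01", 0.1), Scene("ocean_pass_02", 0.8)):
    print(plan_observation(scene))
```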
“If you can be smart about what you’re taking pictures of, then you only image the ground and skip the clouds. That way, you’re not storing, processing, and downloading all this imagery researchers really can’t use,” said Ben Smith of JPL, an associate with NASA’s Earth Science Technology Office, which funds the Dynamic Targeting work. “This technology will help scientists get a much higher proportion of usable data.”
How Dynamic Targeting Works
The testing is taking place on CogniSAT-6, a briefcase-size CubeSat that launched in March 2024. The satellite — designed, built, and operated by Open Cosmos — hosts a payload designed and developed by Ubotica featuring a commercially available AI processor. While working with Ubotica in 2022, Chien’s team conducted tests aboard the International Space Station running algorithms similar to those in Dynamic Targeting on the same type of processor. The results showed the combination could work for space-based remote sensing.
Since CogniSAT-6 lacks an imager dedicated to looking ahead, the spacecraft tilts forward 40 to 50 degrees to point its optical sensor, a camera that sees both visible and near-infrared light. Once look-ahead imagery has been acquired, Dynamic Targeting’s advanced algorithm, trained to identify clouds, analyzes it. Based on that analysis, the Dynamic Targeting planning software determines where to point the sensor for cloud-free views. Meanwhile, the satellite tilts back toward nadir (looking directly below the spacecraft) and snaps the planned imagery, capturing only the ground.
This all takes place in 60 to 90 seconds, depending on the original look-ahead angle, as the spacecraft speeds in low Earth orbit at nearly 17,000 mph (7.5 kilometers per second).
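The 60-to-90-second window follows from simple geometry: the ground distance to the look-ahead footprint divided by how fast the spacecraft covers ground. The sketch below works through that arithmetic under assumptions: a typical ~500 km CubeSat altitude (not stated in the article), flat-Earth geometry, and the quoted 7.5 km/s speed used as a proxy for ground-track speed.

```python
# Back-of-envelope timing for the look-ahead-then-image sequence described above.
# Altitude is an assumed, typical low-Earth-orbit value; the speed and tilt
# angles come from the article. Flat-Earth geometry is used for simplicity.
import math

ALTITUDE_KM = 500.0        # assumed CubeSat altitude
GROUND_SPEED_KM_S = 7.5    # quoted in the article (~17,000 mph)

def time_until_overflight(tilt_deg: float) -> float:
    """Seconds until the spacecraft passes over the point seen at the given forward tilt."""
    ground_distance_km = ALTITUDE_KM * math.tan(math.radians(tilt_deg))
    return ground_distance_km / GROUND_SPEED_KM_S

for tilt in (40, 50):
    print(f"tilt {tilt} deg -> ~{time_until_overflight(tilt):.0f} s to react")
# ~56 s at 40 degrees and ~79 s at 50 degrees, roughly in line with the 60-90 second figure
```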
What’s Next
With the cloud-avoidance capability now proven, the next test will be hunting for storms and severe weather — essentially targeting clouds instead of avoiding them. Another test will be to search for thermal anomalies like wildfires and volcanic eruptions. The JPL team developed unique algorithms for each application.
“This initial deployment of Dynamic Targeting is a hugely important step,” Chien said. “The end goal is operational use on a science mission, making for a very agile instrument taking novel measurements.”
There are multiple visions for how that could happen — possibly even on spacecraft exploring the solar system. In fact, Chien and his JPL colleagues drew some inspiration for their Dynamic Targeting work from another project they had also worked on: using data from ESA’s (the European Space Agency’s) Rosetta orbiter to demonstrate the feasibility of autonomously detecting and imaging plumes emitted by comet 67P/Churyumov-Gerasimenko.
On Earth, adapting Dynamic Targeting for use with radar could allow scientists to study dangerous extreme winter weather events called deep convective ice storms, which are too rare and short-lived to closely observe with existing technologies. Specialized algorithms would identify these dense storm formations with a satellite’s look-ahead instrument. Then a powerful, focused radar would pivot to keep the ice clouds in view, “staring” at them as the spacecraft speeds by overhead and gathers a bounty of data over six to eight minutes.
Some ideas involve using Dynamic Targeting on multiple spacecraft: The results of onboard image analysis from a leading satellite could be rapidly communicated to a trailing satellite, which could be tasked with targeting specific phenomena. The data could even be fed to a constellation of dozens of orbiting spacecraft. Chien is leading a test of that concept, called Federated Autonomous MEasurement, beginning later this year.
News Media Contact
Melissa Pamer
Jet Propulsion Laboratory, Pasadena, Calif.
626-314-4928
melissa.pamer@jpl.nasa.gov
2025-094
-
By NASA
Ozone high in the stratosphere protects us from the Sun’s ultraviolet light. But ozone near the ground is a pollutant that harms people and plants. The San Joaquin Valley has some of the most polluted air in the country, and NASA scientists with the new Ozone Where We Live (OWWL) project are working to measure ozone and other pollutants there. They need your help!
Do you live or work in Bakersfield, CA? Sign up to host an ozone sensor! It’s like a big lunch box that you place in your yard, but it’s not packed with tuna and crackers. It’s filled with sensors that measure temperature and humidity and sniff out dangerous gases like methane, carbon monoxide, carbon dioxide, and of course, ozone.
Can you fly a plane? Going to the San Joaquin Valley? Sign up to take an ozone sensor on your next flight! You can help measure ozone levels in layers of the atmosphere that are hard for satellites to investigate. Scientists will combine the data you collect with data from NASA’s TEMPO satellite to improve air quality models and measurements within the region. Find out more here or email: Emma.l.yates@nasa.gov
Join the Ozone Where We Live (OWWL) project and help NASA scientists protect the people of the San Joaquin Valley! Credit: Emma Yates
-
By NASA
Atomic Layer Processing Coating Techniques Enable Missions to See Further into the Ultraviolet
Astrophysics observations at ultraviolet (UV) wavelengths often probe the most dynamic aspects of the universe. However, the high energy of ultraviolet photons means that their interactions with the materials that make up an observing instrument are less efficient, resulting in low overall throughput. New approaches in the development of thin film coatings are addressing this shortcoming by engineering the coatings of instrument structures at the atomic scale.
Researchers at the NASA Jet Propulsion Laboratory (JPL) are employing atomic layer deposition (ALD) and atomic layer etching (ALE) to enable new coating technologies for instruments measuring ultraviolet light. Conventional optical coatings largely rely on physical vapor deposition (PVD) methods like evaporation, where the coating layer is formed by vaporizing the source material and then condensing it onto the intended substrate. In contrast, ALD and ALE rely on a cyclic series of self-limiting chemical reactions that deposit (or remove) material one atomic layer at a time. This self-limiting characteristic produces coatings or etches that are conformal over arbitrary shapes, with layer thickness precisely controlled by the number of ALD or ALE cycles performed.
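Because each cycle adds (or removes) a roughly fixed increment of material, film thickness scales directly with cycle count. The sketch below only illustrates that bookkeeping; the growth-per-cycle value and the ald_cycles_for_thickness helper are hypothetical, chosen to make the arithmetic concrete, and do not describe JPL’s actual process parameters.

```python
# Minimal illustration of ALD thickness control: thickness grows by a fixed
# amount per self-limiting cycle, so the cycle count sets the final thickness.
# The growth-per-cycle (GPC) value is an assumed, illustrative number.
import math

def ald_cycles_for_thickness(target_nm: float, growth_per_cycle_nm: float) -> int:
    """Number of ALD cycles needed to reach (at least) a target film thickness."""
    return math.ceil(target_nm / growth_per_cycle_nm)

# Example: a ~1.5 nm capping layer (like the MgF2 layer described below),
# assuming a hypothetical GPC of 0.05 nm per cycle.
cycles = ald_cycles_for_thickness(target_nm=1.5, growth_per_cycle_nm=0.05)
print(f"~{cycles} cycles for a 1.5 nm film")  # -> ~30 cycles
```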
The ALD and ALE techniques are common in the semiconductor industry where they are used to fabricate high-performance transistors. Their use as an optical coating method is less common, particularly at ultraviolet wavelengths where the choice of optical coating material is largely restricted to metal fluorides instead of more common metal oxides, due to the larger optical band energy of fluoride materials, which minimizes absorption losses in the coatings. Using an approach based on co-reaction with hydrogen fluoride, the team at JPL has developed a variety of fluoride-based ALD and ALE processes.
(left) The Supernova remnants and Proxies for ReIonization Testbed Experiment (SPRITE) CubeSat primary mirror inside the ALD coating facility at JPL; at 18 cm on its long side, the mirror is the largest optic coated in this chamber to date. (right) Flight optic coating inside the JPL ALD chamber for the Pioneers Aspera mission. Like SPRITE, the Aspera coating combines a lithium fluoride process developed at NASA GSFC with a thin ALD encapsulation of magnesium fluoride at JPL. Image Credit: NASA-JPL
In addition to these metal-fluoride materials, layers of aluminum are often used to construct structures like reflective mirrors and bandpass filters for instruments operating in the UV. Although aluminum has high intrinsic UV reflectance, it also readily forms a surface oxide that strongly absorbs UV light. The role of the metal fluoride coating is then to protect the aluminum surface from oxidation while maintaining enough transparency to create a mirror with high reflectance.
The use of ALD in this context was initially pursued in the development of telescope optics for two SmallSat astrophysics missions that will operate in the UV: the Supernova remnants and Proxies for ReIonization Testbed Experiment (SPRITE) CubeSat mission led by Brian Fleming at the University of Colorado Boulder, and the Aspera mission led by Carlos Vargas at the University of Arizona. The mirrors for SPRITE and Aspera have reflective coatings that utilize aluminum protected by lithium fluoride using a novel PVD process developed at NASA Goddard Space Flight Center, and an additional very thin top coating of magnesium fluoride deposited via ALD.
Team member John Hennessy prepares to load a sample wafer in the ALD coating chamber at JPL. Image Credit: NASA JPL
The use of lithium fluoride enables SPRITE and Aspera to “see” further into the UV than other missions like NASA’s Hubble Space Telescope, which uses only magnesium fluoride to protect its aluminum mirror surfaces. However, a drawback of lithium fluoride is its sensitivity to moisture, which in some cases can cause the performance of these mirror coatings to degrade on the ground prior to launch. To circumvent this issue, very thin layers (~1.5 nanometers) of magnesium fluoride were deposited by ALD on top of the lithium fluoride on the SPRITE and Aspera mirrors. The magnesium fluoride layers are thin enough to not strongly impact the performance of the mirror at the shortest wavelengths, but thick enough to enhance the stability against humidity during ground phases of the missions. Similar approaches are being considered for the mirror coatings of the future NASA flagship Habitable Worlds Observatory (HWO).
Multilayer structures of aluminum and metal fluorides can also function as bandpass filters (filters that allow only signals within a selected range of wavelengths to pass through to be recorded) in the UV. Here, ALD is an attractive option due to the inherent repeatability and precise thickness control of the process. There is currently no suitable ALD process to deposit aluminum, and so additional work by the JPL team has explored the development of a custom vacuum coating chamber that combines the PVD aluminum and ALD fluoride processes described above. This system has been used to develop UV bandpass filters that can be deposited directly onto imaging sensors like silicon (Si) CCDs. These coatings can enable such sensors to operate with high UV efficiency, but low sensitivity to longer wavelength visible photons that would otherwise add background noise to the UV observations.
Structures composed of multilayer aluminum and metal fluoride coatings have recently been delivered as part of a UV camera to the Star-Planet Activity Research CubeSat (SPARCS) mission led by Evgenya Shkolnik at Arizona State University. The JPL-developed camera incorporates a delta-doped Si CCD with the ALD/PVD filter coating on the far ultraviolet channel, yielding a sensor with high efficiency in a band centered near 160 nm with low response to out-of-band light.
A prototype of a back-illuminated CCD incorporating a multi-layer metal-dielectric bandpass filter coating deposited by a combination of thermal evaporation and ALD. This coating, combined with JPL back-surface passivation approaches, enables the Si CCD to operate with high UV efficiency while rejecting longer-wavelength light. Image credit: NASA JPL
Next, the JPL team that developed these coating processes plans to focus on implementing a similar bandpass filter on an array of larger-format Si Complementary Metal-Oxide-Semiconductor (CMOS) sensors for the recently selected NASA Medium-Class Explorer (MIDEX) UltraViolet EXplorer (UVEX) mission led by Fiona Harrison at the California Institute of Technology, which is targeted to launch in the early 2030s.
For additional details, see the entry for this project on NASA TechPort
Project Lead: Dr. John Hennessy, Jet Propulsion Laboratory (JPL)
-
By European Space Agency
Last night a crucial step in the European Space Agency’s eclipse-making Proba-3 mission was completed: the two spacecraft, flying jointly since launch, have successfully separated. This leaves them ready to begin their cosmic dance in the world’s first-ever precision formation-flying mission.