Generating 3D cloud maps

Launched in May 2024, ESA’s EarthCARE satellite is nearing the end of its commissioning phase with the release of its first data on clouds and aerosols expected early next year. In the meantime, an international team of scientists has found an innovative way of applying artificial intelligence to other satellite data to yield 3D profiles of clouds.

This is particularly welcome news for those eagerly awaiting data from EarthCARE in their quest to advance climate science.


  • Similar Topics

    • By NASA
      5 min read
      NASA Launching Rockets Into Radio-Disrupting Clouds
      NASA is launching rockets from a remote Pacific island to study mysterious, high-altitude cloud-like structures that can disrupt critical communication systems. The mission, called Sporadic-E ElectroDynamics, or SEED, opens its three-week launch window from Kwajalein Atoll in the Marshall Islands on Friday, June 13.
      The atmospheric features SEED is studying are known as Sporadic-E layers, and they create a host of problems for radio communications. When they are present, air traffic controllers and marine radio users may pick up signals from unusually distant regions, mistaking them for nearby sources. Military operators using radar to see beyond the horizon may detect false targets — nicknamed “ghosts” — or receive garbled signals that are tricky to decipher. Sporadic-E layers are constantly forming, moving, and dissipating, so these disruptions can be difficult to anticipate.
An animated illustration depicts Sporadic-E layers forming in the lower portions of the ionosphere, causing radio signals to reflect back to Earth before reaching higher layers of the ionosphere. Credit: NASA’s Goddard Space Flight Center/Conceptual Image Lab
Sporadic-E layers form in the ionosphere, a layer of Earth’s atmosphere that stretches from about 40 to 600 miles (60 to 1,000 kilometers) above sea level. Home to the International Space Station and most Earth-orbiting satellites, the ionosphere is also where we see the greatest impacts of space weather. Primarily driven by the Sun, space weather causes myriad problems for our communications with satellites and between ground systems. A better understanding of the ionosphere is key to keeping critical infrastructure running smoothly.
      The ionosphere is named for the charged particles, or ions, that reside there. Some of these ions come from meteors, which burn up in the atmosphere and leave traces of ionized iron, magnesium, calcium, sodium, and potassium suspended in the sky. These “heavy metals” are more massive than the ionosphere’s typical residents and tend to sink to lower altitudes, below 90 miles (140 kilometers). Occasionally, they clump together to create dense clusters known as Sporadic-E layers.
The Perseids meteor shower peaks in mid-August. Meteors like these can deposit metals into Earth’s ionosphere that can help create cloud-like structures called Sporadic-E layers. Credit: NASA/Preston Dyches
“These Sporadic-E layers are not visible to the naked eye and can only be seen by radar. In the radar plots, some layers appear like patchy and puffy clouds, while others spread out, similar to an overcast sky, which we call a blanketing Sporadic-E layer,” said Aroh Barjatya, the SEED mission’s principal investigator and a professor of engineering physics at Embry-Riddle Aeronautical University in Daytona Beach, Florida. The SEED team includes scientists from Embry-Riddle, Boston College in Massachusetts, and Clemson University in South Carolina.
      “There’s a lot of interest in predicting these layers and understanding their dynamics because of how they interfere with communications,” Barjatya said.
      A Mystery at the Equator
      Scientists can explain Sporadic-E layers when they form at midlatitudes but not when they appear close to Earth’s equator — such as near Kwajalein Atoll, where the SEED mission will launch.
      In the Northern and Southern Hemispheres, Sporadic-E layers can be thought of as particle traffic jams.
      Think of ions in the atmosphere as miniature cars traveling single file in lanes defined by Earth’s magnetic field lines. These lanes connect Earth end to end — emerging near the South Pole, bowing around the equator, and plunging back into the North Pole.
A conceptual animation shows Earth’s magnetic field. The blue lines radiating from Earth represent the magnetic field lines that charged particles travel along. Credit: NASA’s Goddard Space Flight Center/Conceptual Image Lab
At Earth’s midlatitudes, the field lines angle toward the ground, descending through atmospheric layers with varying wind speeds and directions. As the ions pass through these layers, they experience wind shear — turbulent gusts that cause their orderly line to clump together. These particle pileups form Sporadic-E layers.
      But near the magnetic equator, this explanation doesn’t work. There, Earth’s magnetic field lines run parallel to the surface and do not intersect atmospheric layers with differing winds, so Sporadic-E layers shouldn’t form. Yet, they do — though less frequently.
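The traffic-jam mechanism described above can be sketched numerically. This is a minimal illustration under invented conditions (a hypothetical wind profile, a 45° dip angle, and ion-neutral collision terms ignored), not the SEED team's model: the zonal wind drives a vertical ion drift proportional to U·sin(I)·cos(I), and a layer forms where that drift converges.

```python
import numpy as np

# Minimal, hypothetical sketch of wind-shear theory for Sporadic-E layers.
# The wind profile and dip angle are invented for illustration, and
# ion-neutral collision terms are ignored; only the mechanism follows the text.

z = np.linspace(90, 130, 400)                   # altitude grid, km
U = 40.0 * np.sin(2 * np.pi * (z - 90) / 40)    # assumed zonal wind, m/s

I = np.radians(45)                 # midlatitude magnetic dip (inclination) angle
v_z = U * np.sin(I) * np.cos(I)    # simplified shear-driven vertical ion drift

# Ions pile up where the drift converges: v_z crosses zero with negative slope.
dv_dz = np.gradient(v_z, z)
layer = (np.abs(v_z) < 1.0) & (dv_dz < 0)
print("Ion convergence near (km):", np.round(z[layer].mean(), 1))
```

Setting the dip angle I to 0 in this sketch zeroes out the drift everywhere, mirroring why the standard theory fails at the magnetic equator.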
      “We’re launching from the closest place NASA can to the magnetic equator,” Barjatya said, “to study the physics that existing theory doesn’t fully explain.”
      Taking to the Skies
      To investigate, Barjatya developed SEED to study low-latitude Sporadic-E layers from the inside. The mission relies on sounding rockets — uncrewed suborbital spacecraft carrying scientific instruments. Their flights last only a few minutes but can be launched precisely at fleeting targets.
      Beginning the night of June 13, Barjatya and his team will monitor ALTAIR (ARPA Long-Range Tracking and Instrumentation Radar), a high-powered, ground-based radar system at the launch site, for signs of developing Sporadic-E layers. When conditions are right, Barjatya will give the launch command. A few minutes later, the rocket will be in flight.
The SEED science team and mission management team in front of the ARPA Long-Range Tracking and Instrumentation Radar (ALTAIR). The SEED team will use ALTAIR to monitor the ionosphere for signs of Sporadic-E layers and time the launch. Credit: U.S. Army Space and Missile Defense Command
On ascent, the rocket will release colorful vapor tracers. Ground-based cameras will track the tracers to measure wind patterns in three dimensions. Once inside the Sporadic-E layer, the rocket will deploy four subpayloads — miniature detectors that will measure particle density and magnetic field strength at multiple points. The data will be transmitted back to the ground as the rocket descends.
      On another night during the launch window, the team will launch a second, nearly identical rocket to collect additional data under potentially different conditions.
      Barjatya and his team will use the data to improve computer models of the ionosphere, aiming to explain how Sporadic-E layers form so close to the equator.
      “Sporadic-E layers are part of a much larger, more complicated physical system that is home to space-based assets we rely on every day,” Barjatya said. “This launch gets us closer to understanding another key piece of Earth’s interface to space.”
      By Miles Hatfield
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
Details
Last Updated Jun 12, 2025
    • By NASA
      What does it take to gaze through time to our universe’s very first stars and galaxies?  
      NASA answers this question in its new documentary, “Cosmic Dawn: The Untold Story of the James Webb Space Telescope.” The agency’s original documentary, which chronicles the story of the most powerful telescope ever deployed in space, was released Wednesday, June 11.
      Cosmic Dawn offers an unprecedented glimpse into the delicate assembly, rigorous testing, and triumphant launch of NASA’s James Webb Space Telescope. The documentary showcases the complexity involved in creating a telescope capable of peering billions of years into the past.  
      Cosmic Dawn is now available for streaming on NASA’s YouTube, NASA+, and select local theaters. The trailer is available on NASA+ and YouTube.
Relive the pitfalls and the triumphs of the world’s most powerful space telescope — from developing the idea of an impossible machine to watching with bated breath as it unfolded, hurtling through space a million miles away from Earth.
The film features never-before-seen footage captured by the Webb film crew, offering intimate access to the challenges and triumphs faced by the team at NASA’s Goddard Space Flight Center in Greenbelt, Maryland — the birthplace of Webb.
      “At NASA, we’re thrilled to share the untold story of our James Webb Space Telescope in our new film ‘Cosmic Dawn,’ celebrating not just the discoveries, but the extraordinary people who made it all happen, for the benefit of humanity,” said Rebecca Sirmons, head of NASA+ at the agency’s headquarters in Washington.
From its vantage point more than a million miles from Earth, with a massive sunshield blocking the light of our star, Webb produced its First Deep Field: the deepest and sharpest infrared images of the universe that the world had seen.
      Webb’s images have dazzled people around the globe, capturing the very faint light of the first stars and galaxies that formed more than 13.5 billion years ago. These are baby pictures from an ancient past when the first objects were turning on and emitting light after the Big Bang. Webb has also given us new insights into black holes, planets both inside and outside of our own solar system, and many other cosmic phenomena.
“Webb was a mission that was going to be spectacular whether that was good or bad — if it failed or was successful. It was always going to make history.”
Sophia Roberts
NASA Video Producer
      NASA’s biggest and most powerful space telescope was also its most technically complicated to build. It was harder still to deploy, with more than 300 critical components that had to deploy perfectly. The risks were high in this complicated dance of engineering, but the rewards were so much higher.
      “Webb was a mission that was going to be spectacular whether that was good or bad — if it failed or was successful,” said video producer Sophia Roberts, who chronicled the five years preceding Webb’s launch. “It was always going to make history.”
NASA scientists like Nobel laureate Dr. John Mather conceived Webb to look farther and deeper into the origins of our universe, using cutting-edge infrared technology and massive mirrors to collect incredibly rich information about our universe, from the light of the first galaxies to detailed images of planets in our own solar system.
      To achieve this goal, NASA and its partners faced unprecedented hurdles.
      Webb’s development introduced questions that no one had asked before. How do you fit a telescope with the footprint of a tennis court into a rocket? How do you clean 18 sensitive mirrors when a single scratch could render them inoperable? How do you maintain critical testing while hurricane stormwater pours through ceilings?
A technician inspects the James Webb Space Telescope primary mirrors at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Credit: NASA/Sophia Roberts
Cosmic Dawn captures 25 years of formidable design constraints, high-stakes assessments, devastating natural disasters, a global pandemic, and determined individuals who would let none of that get in the way of getting this monumental observatory to its rightful place in the cosmos.
      “There was nothing easy about Webb at all,” said Webb project manager Bill Ochs. “I don’t care what aspect of the mission you looked at.”
      Viewers will experience a one-of-a-kind journey as NASA and its partners tackle these dilemmas — and more — through ingenuity, teamwork, and unbreakable determination.
      “The inspiration of trying to discover something — to build something that’s never been built before, to discover something that’s never been known before — it keeps us going,” Mather said. “We are pleased and privileged in our position here at NASA to be able to carry out this [purpose] on behalf of the country and the world.”
      Bound by NASA’s 66-year commitment to document and share its work with the public, Cosmic Dawn details every step toward Webb’s launch and science results.
Learn more at nasa.gov/cosmicdawn
By Laine Havens
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
      Media Contact:
      Katie Konans,
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
Details
Last Updated Jun 11, 2025
    • By NASA
8 min read
      ICESat-2 Applications Team Hosts Satellite Bathymetry Workshop
      Introduction
On September 15, 2018, the NASA Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) mission launched from Vandenberg Air Force Base and began its journey to provide spatially dense and finely precise global measurements of Earth’s surface elevation. Now in Phase E of NASA’s project life cycle (where the mission is carried out, data are collected and analyzed, and the spacecraft is maintained), and with almost six years of data collection, the focus shifts to new applications and synergies that may be developed using data from ICESat-2’s single instrument: the Advanced Topographic Laser Altimeter System (ATLAS) – see Figure 1.
Figure 1. The ATLAS instrument onboard the ICESat-2 platform obtains data using a green, photon-counting lidar that is split into six beams. Figure credit: ICESat-2 Mission Team
Satellite-derived bathymetry (SDB) is the process of mapping the seafloor using satellite imagery. The technique uses light penetration and reflection in the water to estimate variations in ocean floor depth. SDB provides several advantages over other techniques used to map the seafloor (e.g., cost-effectiveness, global coverage, and faster data acquisition). On the other hand, SDB can be limited by water clarity, the spatial resolution of the remote sensing measurement, and accuracy, depending on the method and satellite platform/instrument. These limitations notwithstanding, SDB can be used in a wide variety of applications, e.g., coastal zone management, navigation and safety, marine habitat monitoring, and disaster response. ICESat-2 has become a major contributor to SDB, with over 2,000 journal article references to this topic to date. Now is the time to think about the state of the art and additional capabilities of SDB for the future.
      To help stimulate such thinking, the NASA ICESat-2 applications team hosted a one-day workshop on March 17, 2025. The workshop focused on the principles and methods for SDB. Held in conjunction with the annual US-Hydro meeting on March 17–20, 2025 at the Wilmington Convention Center in Wilmington, NC, the meeting was hosted by the Hydrographic Society of America. During the workshop the applications team brought together SDB end-users, algorithm developers, operators, and decision makers to discuss the current state and future needs of satellite bathymetry for the community. The objective of this workshop was to provide a space to foster collaboration and conceptualization of SDB applications not yet exploited and to allow for networking to foster synergies and collaborations between different sectors.
      Meeting Overview
The workshop provided an opportunity for members of government, academia, and the private sector to share their SDB research, applications, and data-fusion activities supporting decision making and policy across a wide range of activities. Presenters highlighted SDB principles, methods, and tools, and introduced the new ICESat-2 bathymetric data product (ATL24), which is now available through the National Snow and Ice Data Center (NSIDC). During the workshop, the ICESat-2 team delivered a live demonstration of a web service for science data processing. Toward the end of the day, the applications team opened an opportunity for attendees to gather and discuss various topics related to SDB. This portion of the meeting was also open to online participation via Webex Webinars, which broadened the discussion.
      Meeting Goal
The workshop offered a set of plenary presentations and discussions. The plenary talks provided an overview of Earth observation and SDB principles, existing methods and tools, an introduction to the newest ICESat-2 bathymetry product (ATL24), and a demonstration of the SlideRule Earth web service, with opportunities for open discussion, questions, and developing collaborations.
      Meeting and Summary Format
      The agenda of the SDB workshop was intended to bring together SDB end-users, including ICESat-2 application developers, satellite operators, and decision makers from both government and non-governmental entities to discuss the current state and future needs of the community. The workshop consisted of six sessions that covered various topics of SDB. This report is organized according to the topical focus of the plenary presentations with a brief narrative summary of each presentation included. The discussions that followed were not recorded and are not included in the report. The last section of this report consists of conclusions and future steps. The online meeting agenda includes links to slide decks for many of the presentations.
      Welcoming Remarks
      Aimee Neeley [NASA’s Goddard Space Flight Center (GSFC)/Science Systems and Applications Inc. (SSAI)—ICESat-2 Mission Applications Lead] organized the workshop and served as the host for the event. She opened the day with a brief overview of workshop goals, logistics, and the agenda.
      Overview of Principles of SDB
Ross Smith [TCarta—Senior Geospatial Scientist] provided an overview of the principles of space-based bathymetry, including its concepts, capabilities, limitations, and methods. Smith began by relaying the history of satellite-derived bathymetry, which began with a collaboration between NASA and Jacques Cousteau in 1975, in which Cousteau used Landsat 1 data, as well as in situ data, to calculate bathymetry to a depth of 22 m (72 ft) in the Bahamas. Smith then described the five broad methodologies for deriving bathymetry from remote sensing and their basic concepts: radar altimetry, bottom reflectance, wave kinematics, laser altimetry, and space-based photogrammetry – see Figure 2. He went on to discuss the most commonly used satellite sensors, the capabilities and limitations of each, and the role of ICESat-2 in satellite bathymetry.
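As background, the bottom-reflectance family of methods can be illustrated with the Stumpf log-ratio algorithm, a widely used band-ratio approach. This is a minimal sketch, not any specific tool presented at the workshop; the reflectance values and the coefficients m1 and m0 are hypothetical (in practice they are regressed against reference depths, such as ICESat-2 bathymetry).

```python
import numpy as np

# Sketch of the Stumpf log-ratio method, a common bottom-reflectance SDB
# algorithm. The reflectances and the tuning coefficients m1/m0 below are
# hypothetical; operationally they are fit against reference depths.

def stumpf_depth(r_blue, r_green, m1=60.0, m0=50.0, n=1000.0):
    """Estimate depth (m) from blue/green water-leaving reflectance.

    Blue light penetrates deeper than green, so the ratio of the
    log-scaled reflectances grows with depth.
    """
    ratio = np.log(n * r_blue) / np.log(n * r_green)
    return m1 * ratio - m0

# Hypothetical reflectances for three pixels, shallow to deeper water:
r_blue = np.array([0.12, 0.09, 0.06])
r_green = np.array([0.11, 0.07, 0.04])
depths = stumpf_depth(r_blue, r_green)
print(depths.round(2))
```

The method's appeal is that the band ratio partially cancels variations in bottom brightness, which is one reason it remains a common baseline for optical SDB.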
Figure 2. Satellite platforms commonly used for SDB. Figure credit: Ross Smith
Review of SDB Methods and Tools
      In this grouping of plenary presentations, representatives from different organizations presented their methods and tools for creating satellite bathymetry products.
      Gretchen Imahori [National Oceanic and Atmospheric Administration’s (NOAA) National Geodetic Survey, Remote Sensing Division] presented the NOAA SatBathy (beta v2.2.3) Tool Update. During this presentation, Imahori provided an overview of the NOAA SatBathy desktop tool, example imagery, updates to the latest version, and the implementation plan for ATL24. The next session included more details about ATL24.
      Minsu Kim [United States Geological Survey (USGS), Earth Resource and Observation Center (EROS)/ Kellogg, Brown & Root (KBR)—Chief Scientist] presented the talk Satellite Derived Bathymetry (SDB) Using OLI/MSI Based-On Physics-Based Algorithm. He provided an overview of an SDB method based on atmospheric and oceanic optical properties. Kim also shared examples of imagery from the SDB product – see Figure 3.
Figure 3. Three-dimensional renderings of the ocean south of Key West, FL created by adding an SDB digital elevation model (physics-based) to a Landsat Operational Land Imager (OLI) scene [top] and a Sentinel-2 Multispectral Imager (MSI) scene [bottom]. Figure credit: Minsu Kim
Edward Albada [Earth Observation and Environmental Services GmbH (EOMAP)—Principal] presented the talk Satellite Lidar Bathymetry and Eoapp™ SLB-Online. The company EOMAP provides various services, including SDB and habitat mapping. For context, Albada provided an overview of Eoapp™ SLB-Online, a cloud-based software for creating SDB. (Eoapp™ SLB-Online is one of several Eoapp apps and is based on the ICESat-2 photon data product, ATL03.) Albada also provided example use cases from Eoapp – see Figure 4.
Figure 4. A display of the Marquesas Keys (part of the Florida Keys) using satellite lidar bathymetry data from the Eoapp SLB-Online tool from EOMAP. Figure credit: Edward Albada
Monica Palaseanu-Lovejoy [USGS GMEG—Research Geographer] presented Satellite Triangulated Sea Depth (SaTSeaD): Bathymetry Module for NASA Ames Stereo Pipeline (ASP). She provided an overview of the shallow-water bathymetry SaTSeaD module, a photogrammetric method for mapping bathymetry. Palaseanu-Lovejoy presented error statistics and validation procedures. She also shared case study results from Key West, FL; Cocos Lagoon, Guam; and Cabo Rojo, Puerto Rico – see Figure 5.
Figure 5. Photogrammetric bathymetry map of Cabo Rojo, Puerto Rico, displayed using the Satellite Triangulated Sea Depth (SaTSeaD) Bathymetry Module for the NASA Ames Stereo Pipeline (ASP). Figure credit: Monica Palaseanu-Lovejoy
Ross Smith also gave a presentation on TCarta’s Trident Tools: Approachable SDB | Familiar Environment. During this presentation, Smith provided an overview of the Trident Tools geoprocessing toolbox deployed in Esri’s ArcGIS Pro. Smith described several use cases for the toolbox in Abu Dhabi, United Arab Emirates; the Lucayan Archipelago, Bahamas; and the Red Sea.
Michael Jasinski [GSFC—Research Hydrologist] presented The ICESat-2 Inland Water Along Track Algorithm (ATL13). He provided an overview of the ICESat-2 data product ATL13, an inland water product that is distributed by NSIDC. Jasinski described the functionality of the ATL13 semi-empirical algorithm and provided examples of its application to lakes and shallow coastal waters – see Figure 6.
Figure 6. A graphic of the network of lakes and rivers in North America that are measured by ICESat-2. Figure credit: Michael Jasinski
ATL24 Data Product Update
      Christopher Parrish [Oregon State University, School of Civil and Construction Engineering—Professor] presented on ATL24: A New Global ICESat-2 Bathymetric Data Product. Parrish provided an overview of the recently released ATL24 product and described the ATL24 workflow, uncertainty analysis, and applications in shallow coastal waters. Parrish included a case study where ATL24 data were used for bathymetric mapping of Kiriwina Island, Papua New Guinea – see Figure 7.
Figure 7. ATL24 data observed for Kiriwina Island, Papua New Guinea. Figure credit: Christopher Parrish
SlideRule Demo
      J. P. Swinski [GSFC—Computer Engineer] presented SlideRule Earth: Enabling Rapid, Scalable, Open Science. Swinski explained that SlideRule Earth is a public web service that provides access to on-demand processing and visualization of ICESat-2 data. SlideRule can be used to process a subset of ICESat-2 data products, including ATL24 – see Figure 8.
Figure 8. ATL24 data observed for Sanibel, FL as viewed on the SlideRule Earth public web client. Figure credit: SlideRule Earth
SDB Accuracy
Kim Lowell [University of New Hampshire—Data Analytics Research Scientist and Affiliate Professor] presented SDB Accuracy Assessment and Improvement Talking Points. During this presentation, Lowell provided examples of accuracy assessments and uncertainty quantification, comparing ground measurements of coastal bathymetry to those modeled from satellite data.
      Conclusion
      The ICESat-2 Satellite Bathymetry workshop fostered discussion and collaboration around the topic of SDB methods. The plenary speakers presented the state-of-the-art methods used by different sectors and organizations, including government and private entities. With the release of ATL24, ICESat-2’s new bathymetry product, it was prudent to have a conversation about new and upcoming capabilities for all methods and measurements of satellite bathymetry. Both in-person and online participants were provided with the opportunity to learn, ask questions, and discuss potential applications in their own research. The ICESat-2 applications team hopes to host more events to ensure the growth of this field to maximize the capabilities of ICESat-2 and other Earth Observing systems.
Details
Last Updated Jun 05, 2025
    • By NASA
      4 min read
      A lot can change in a year for Earth’s forests and vegetation, as springtime and rainy seasons can bring new growth, while cooling temperatures and dry weather can bring a dieback of those green colors. And now, a novel type of NASA visualization illustrates those changes in a full complement of colors as seen from space.
Researchers have now gathered a complete year of PACE data to tell a story about the health of land vegetation by detecting slight variations in leaf colors. Previous missions allowed scientists to observe broad changes in chlorophyll, the pigment that gives plants their green color and also allows them to perform photosynthesis. But PACE now allows scientists to see three different pigments in vegetation: chlorophyll, anthocyanins, and carotenoids. The combination of these three pigments helps scientists pinpoint even more information about plant health. Credit: NASA’s Goddard Space Flight Center
NASA’s Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) satellite is designed to view Earth’s microscopic ocean plants through a new lens, but researchers have demonstrated its hyperspectral capabilities over land as well.
      Previous missions measured broad changes in chlorophyll, the pigment that gives plants their green color and also allows them to perform photosynthesis. Now, for the first time, PACE measurements have allowed NASA scientists and visualizers to show a complete year of global vegetation data using three pigments: chlorophyll, anthocyanins, and carotenoids. That multicolor imagery tells a clearer story about the health of land vegetation by detecting the smallest of variations in leaf colors.
      “Earth is amazing. It’s humbling, being able to see life pulsing in colors across the whole globe,” said Morgaine McKibben, PACE applications lead at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “It’s like the overview effect that astronauts describe when they look down at Earth, except we are looking through our technology and data.”
Anthocyanins, carotenoids, and chlorophyll data light up North America, highlighting vegetation and its health. Credit: NASA’s Scientific Visualization Studio
Anthocyanins are the red pigments in leaves, while carotenoids are the yellow pigments – both of which we see when autumn changes the colors of trees. Plants use these pigments to protect themselves from fluctuations in the weather, adapting to the environment through chemical changes in their leaves. For example, leaves can turn more yellow when they have too much sunlight but not enough of the other necessities, like water and nutrients. If they didn’t adjust their color, it would damage the mechanisms they have to perform photosynthesis.
In the visualization, the data is highlighted in bright colors: magenta represents anthocyanins, green represents chlorophyll, and cyan represents carotenoids. The brighter the colors are, the more leaves there are in that area. The movement of these colors across land areas shows the seasonal changes over time.
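That channel assignment can be sketched as a simple composite. The pigment fields below are synthetic stand-ins for the real PACE retrievals, and the mapping is an assumption based on the description above, not the Scientific Visualization Studio's actual pipeline.

```python
import numpy as np

# Sketch of the visualization's color mapping: magenta for anthocyanins,
# green for chlorophyll, cyan for carotenoids, with brightness scaling
# by abundance. The pigment fields are synthetic placeholders.

rng = np.random.default_rng(0)
anthocyanin = rng.random((4, 4))   # relative pigment abundances, 0..1
chlorophyll = rng.random((4, 4))
carotenoid = rng.random((4, 4))

# Build an RGB composite: magenta = red + blue, cyan = green + blue.
rgb = np.zeros((4, 4, 3))
rgb[..., 0] = anthocyanin                 # red channel: anthocyanins (magenta)
rgb[..., 1] = chlorophyll + carotenoid    # green channel: chlorophyll + cyan part
rgb[..., 2] = anthocyanin + carotenoid    # blue channel: magenta + cyan parts
rgb /= rgb.max()                          # normalize for display

print(rgb.shape)
```

Because magenta and cyan each share a channel with another pigment, a pixel's final hue reflects the mixture of pigments present, which is what lets the visualization show gradual seasonal shifts rather than hard category boundaries.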
      In areas like the evergreen forests of the Pacific Northwest, plants undergo less seasonal change. The data highlights this, showing comparatively steadier colors as the year progresses.
      The combination of these three pigments helps scientists pinpoint even more information about plant health.
      “Shifts in these pigments, as detected by PACE, give novel information that may better describe vegetation growth, or when vegetation changes from flourishing to stressed,” said McKibben. “It’s just one of many ways the mission will drive increased understanding of our home planet and enable innovative, practical solutions that serve society.”
The Ocean Color Instrument on PACE collects hyperspectral data, which means it observes the planet in 100 different wavelengths of visible and near-infrared light. It is the only instrument – in space or elsewhere – that provides hyperspectral coverage around the globe every one to two days. The PACE mission builds on the legacy of earlier missions, such as Landsat, which gathers higher-resolution data but observes a fraction of those wavelengths.
      In a paper recently published in Remote Sensing Letters, scientists introduced the mission’s first terrestrial data products.
      “This PACE data provides a new view of Earth that will improve our understanding of ecosystem dynamics and function,” said Fred Huemmrich, research professor at the University of Maryland, Baltimore County, member of the PACE science and applications team, and first author of the paper. “With the PACE data, it’s like we’re looking at a whole new world of color. It allows us to describe pigment characteristics at the leaf level that we weren’t able to do before.”
      As scientists continue to work with these new data, available on the PACE website, they’ll be able to incorporate it into future science applications, which may include forest monitoring or early detection of drought effects.
      By Erica McNamee
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
Details
Last Updated Jun 05, 2025
Editor: Kate D. Ramsayer
Contact: Kate D. Ramsayer (kate.d.ramsayer@nasa.gov)
    • By NASA
This NASA/ESA Hubble Space Telescope image features a sparkling cloudscape from one of the Milky Way’s galactic neighbors, a dwarf galaxy called the Large Magellanic Cloud. Located 160,000 light-years away in the constellations Dorado and Mensa, the Large Magellanic Cloud is the largest of the Milky Way’s many small satellite galaxies.
      This view of dusty gas clouds in the Large Magellanic Cloud is possible thanks to Hubble’s cameras, such as the Wide Field Camera 3 (WFC3) that collected the observations for this image. WFC3 holds a variety of filters, and each lets through specific wavelengths, or colors, of light. This image combines observations made with five different filters, including some that capture ultraviolet and infrared light that the human eye cannot see.
The wispy gas clouds in this image resemble brightly colored cotton candy. When viewing such a vividly colored cosmic scene, it is natural to wonder whether the colors are ‘real’. After all, Hubble, with its 7.8-foot-wide (2.4 m) mirror and advanced scientific instruments, bears little resemblance to a typical camera! When image-processing specialists combine raw filtered data into a multi-colored image like this one, they assign a color to each filter. Visible-light observations typically correspond to the color that the filter allows through. Shorter wavelengths of light, such as ultraviolet, are usually assigned blue or purple, while longer wavelengths, like infrared, are typically red.
      This color scheme closely represents reality while adding new information from the portions of the electromagnetic spectrum that humans cannot see. However, there are endless possible color combinations that can be employed to achieve an especially aesthetically pleasing or scientifically insightful image.
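The assignment rule can be sketched as follows. The filter names, wavelengths, and display colors here are illustrative assumptions, not the actual five-filter WFC3 set used for this image: each filter's grayscale exposure gets a color ordered by wavelength, and the colored layers are summed into one composite.

```python
import numpy as np

# Sketch of the chromatic-ordering rule: shorter wavelengths are assigned
# bluer display colors, longer wavelengths redder ones, then the colored
# layers are summed. Filter names, wavelengths, and data are illustrative.

filters = {                       # name: (central wavelength nm, assigned RGB)
    "F275W": (275, (0.4, 0.0, 1.0)),   # ultraviolet -> violet
    "F475W": (475, (0.0, 0.3, 1.0)),   # blue visible -> blue
    "F555W": (555, (0.0, 1.0, 0.2)),   # green visible -> green
    "F814W": (814, (1.0, 0.2, 0.0)),   # near-infrared -> red
}

h, w = 8, 8
rng = np.random.default_rng(1)
composite = np.zeros((h, w, 3))
for name, (wavelength, color) in filters.items():
    exposure = rng.random((h, w))              # stand-in filtered grayscale image
    composite += exposure[..., None] * np.array(color)

composite /= composite.max()                   # scale into displayable range
print(composite.shape)
```

Changing the assigned RGB triples, while keeping their wavelength ordering, is exactly the freedom the text describes: many color schemes are possible, and specialists choose one for aesthetic or scientific clarity.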
      Learn how Hubble images are taken and processed.
      Text credit: ESA/Hubble
      Image credit: ESA/Hubble & NASA, C. Murray