NASA’s Upgraded Hyperwall Offers Improved Data Visualization



Preparations for Next Moonwalk Simulations Underway (and Underwater)

NAS visualization & data sciences lead Chris Henze demonstrates the newly upgraded hyperwall visualization system to Ames center director Eugene Tu, deputy center director David Korsmeyer, and High-End Computing Capability manager William Thigpen. Credit: NASA/Brandon Torres Navarette

In May, the NASA Advanced Supercomputing (NAS) facility, located at NASA’s Ames Research Center in California’s Silicon Valley, celebrated the newest generation of its hyperwall system, a wall of LCD screens that display supercomputer-scale visualizations of the very large datasets produced by NASA supercomputers and instruments. 

The upgrade is the fourth generation of hyperwall clusters at NAS. The LCD panels provide four times the resolution of the previous system, spanning a 300-square-foot display with over a billion pixels. The hyperwall is one of the largest and most powerful visualization systems in the world.

Systems like the NAS hyperwall help researchers visualize their data at large scale, from different viewpoints, or with different parameters, opening new ways of analysis. The improved resolution of the new system will help researchers “zoom in” with greater detail.

The hyperwall is just one way researchers can utilize NASA’s high-end computing technology to better understand their data. The NAS facility offers world-class supercomputing resources and services customized to meet the needs of about 1,500 users from NASA centers, academia and industry. 

Details
Last Updated: Jul 01, 2024


  • Similar Topics

    • By NASA
      A lot can change in a year for Earth’s forests and vegetation, as springtime and rainy seasons can bring new growth, while cooling temperatures and dry weather can bring a dieback of those green colors. And now, a novel type of NASA visualization illustrates those changes in a full complement of colors as seen from space.
Researchers have now gathered a complete year of PACE data to tell a story about the health of land vegetation by detecting slight variations in leaf colors. Previous missions allowed scientists to observe broad changes in chlorophyll, the pigment that gives plants their green color and also allows them to perform photosynthesis. But PACE now allows scientists to see three different pigments in vegetation: chlorophyll, anthocyanins, and carotenoids. The combination of these three pigments helps scientists pinpoint even more information about plant health. Credit: NASA’s Goddard Space Flight Center
NASA’s Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) satellite is designed to view Earth’s microscopic ocean plants through a new lens, but researchers have proved its hyperspectral use over land, as well.
      Previous missions measured broad changes in chlorophyll, the pigment that gives plants their green color and also allows them to perform photosynthesis. Now, for the first time, PACE measurements have allowed NASA scientists and visualizers to show a complete year of global vegetation data using three pigments: chlorophyll, anthocyanins, and carotenoids. That multicolor imagery tells a clearer story about the health of land vegetation by detecting the smallest of variations in leaf colors.
      “Earth is amazing. It’s humbling, being able to see life pulsing in colors across the whole globe,” said Morgaine McKibben, PACE applications lead at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “It’s like the overview effect that astronauts describe when they look down at Earth, except we are looking through our technology and data.”
Anthocyanins, carotenoids, and chlorophyll data light up North America, highlighting vegetation and its health. Credit: NASA’s Scientific Visualization Studio
Anthocyanins are the red pigments in leaves, while carotenoids are the yellow pigments – both of which we see when autumn changes the colors of trees. Plants use these pigments to protect themselves from fluctuations in the weather, adapting to the environment through chemical changes in their leaves. For example, leaves can turn more yellow when they have too much sunlight but not enough of the other necessities, like water and nutrients. If they didn’t adjust their color, it would damage the mechanisms they have to perform photosynthesis.
In the visualization, the data is highlighted in bright colors: magenta represents anthocyanins, green represents chlorophyll, and cyan represents carotenoids. The brighter the colors are, the more leaves there are in that area. The movement of these colors across the land areas shows the seasonal changes over time.
      In areas like the evergreen forests of the Pacific Northwest, plants undergo less seasonal change. The data highlights this, showing comparatively steadier colors as the year progresses.
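The color mixing described above can be sketched in a few lines. This is a hypothetical helper, not the Scientific Visualization Studio's actual pipeline: it assumes each pigment index has already been normalized to the 0–1 range and simply blends the three display colors, so more pigment means a brighter pixel.

```python
# Hypothetical sketch of the pigment-to-color mapping described in the
# visualization: anthocyanins -> magenta, chlorophyll -> green,
# carotenoids -> cyan. Inputs are pigment indices normalized to 0..1.
def pigment_to_rgb(anth, chl, car):
    r = anth              # magenta contributes red
    g = chl + car         # green and cyan both contribute green
    b = anth + car        # magenta and cyan both contribute blue
    peak = max(r, g, b, 1.0)
    return (r / peak, g / peak, b / peak)

# A pixel dominated by chlorophyll renders pure green:
color = pigment_to_rgb(0.0, 1.0, 0.0)  # -> (0.0, 1.0, 0.0)
```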
      The combination of these three pigments helps scientists pinpoint even more information about plant health.
      “Shifts in these pigments, as detected by PACE, give novel information that may better describe vegetation growth, or when vegetation changes from flourishing to stressed,” said McKibben. “It’s just one of many ways the mission will drive increased understanding of our home planet and enable innovative, practical solutions that serve society.”
      The Ocean Color Instrument on PACE collects hyperspectral data, which means it observes the planet in 100 different wavelengths of visible and near infrared light. It is the only instrument – in space or elsewhere – that provides hyperspectral coverage around the globe every one to two days. The PACE mission builds on the legacy of earlier missions, such as Landsat, which gathers higher resolution data but observes a fraction of those wavelengths.
      In a paper recently published in Remote Sensing Letters, scientists introduced the mission’s first terrestrial data products.
      “This PACE data provides a new view of Earth that will improve our understanding of ecosystem dynamics and function,” said Fred Huemmrich, research professor at the University of Maryland, Baltimore County, member of the PACE science and applications team, and first author of the paper. “With the PACE data, it’s like we’re looking at a whole new world of color. It allows us to describe pigment characteristics at the leaf level that we weren’t able to do before.”
As scientists continue to work with these new data, available on the PACE website, they’ll be able to incorporate them into future science applications, which may include forest monitoring or early detection of drought effects.
      By Erica McNamee
      NASA’s Goddard Space Flight Center, Greenbelt, Md.
Details
Last Updated: Jun 05, 2025
Editor: Kate D. Ramsayer
Contact: Kate D. Ramsayer, kate.d.ramsayer@nasa.gov
Related Terms: Earth; Goddard Space Flight Center; PACE (Plankton, Aerosol, Cloud, Ocean Ecosystem)
    • By NASA
3 Black Holes Caught Eating Massive Stars in NASA Data
A disk of hot gas swirls around a black hole in this illustration. Some of the gas came from a star that was pulled apart by the black hole, forming the long stream of hot gas on the right, feeding into the disk. Credit: NASA/JPL-Caltech
Black holes are invisible to us unless they interact with something else. Some continuously eat gas and dust, and appear to glow brightly over time as matter falls in. But other black holes secretly lie in wait for years until a star comes close enough to snack on.
      Scientists have recently identified three supermassive black holes at the centers of distant galaxies, each of which suddenly brightened when it destroyed a star and then stayed bright for several months. A new study using space and ground-based data from NASA, ESA (European Space Agency), and other institutions presents these rare occurrences as a new category of cosmic events called “extreme nuclear transients.”
      Looking for more of these extreme nuclear transients could help unveil some of the most massive supermassive black holes in the universe that are usually quiet.
      “These events are the only way we can have a spotlight that we can shine on otherwise inactive massive black holes,” said Jason Hinkle, graduate student at the University of Hawaii and lead author of a new study in the journal Science Advances describing this phenomenon.
      The black holes in question seem to have eaten stars three to 10 times heavier than our Sun. Feasting on the stars resulted in some of the most energetic transient events ever recorded.
This illustration shows a glowing stream of material from a star as it is being devoured by a supermassive black hole. When a star passes within a certain distance of a black hole — close enough to be gravitationally disrupted — the stellar material gets stretched and compressed as it falls into the black hole. Credit: NASA/JPL-Caltech
These events unleash enormous amounts of high-energy radiation on the central regions of their host galaxies. “That has implications for the environments in which these events are occurring,” Hinkle said. “If galaxies have these events, they’re important for the galaxies themselves.”
      The stars’ destruction produces high-energy light that takes over 100 days to reach peak brightness, then more than 150 days to dim to half of its peak. The way the high-energy radiation affects the environment results in lower-energy emissions that telescopes can also detect.
      One of these star-destroying events, nicknamed “Barbie” because of its catalog identifier ZTF20abrbeie, was discovered in 2020 by the Zwicky Transient Facility at Caltech’s Palomar Observatory in California, and documented in two 2023 studies. The other two black holes were detected by ESA’s Gaia mission in 2016 and 2018 and are studied in detail in the new paper.
      NASA’s Neil Gehrels Swift Observatory was critical in confirming that these events must have been related to black holes, not stellar explosions or other phenomena.  The way that the X-ray, ultraviolet, and optical light brightened and dimmed over time was like a fingerprint matching that of a black hole ripping a star apart.
      Scientists also used data from NASA’s WISE spacecraft, which was operated from 2009 to 2011 and then was reactivated as NEOWISE and retired in 2024. Under the WISE mission the spacecraft mapped the sky at infrared wavelengths, finding many new distant objects and cosmic phenomena. In the new study, the spacecraft’s data helped researchers characterize dust in the environments of each black hole. Numerous ground-based observatories additionally contributed to this discovery, including the W. M. Keck Observatory telescopes through their NASA-funded archive and the NASA-supported Near-Earth Object surveys ATLAS, Pan-STARRS, and Catalina.
      “What I think is so exciting about this work is that we’re pushing the upper bounds of what we understand to be the most energetic environments of the universe,” said Anna Payne, a staff scientist at the Space Telescope Science Institute and study co-author, who helped look for the chemical fingerprints of these events with the University of Hawaii 2.2-meter Telescope.
      A Future Investigators in NASA Earth and Space Science and Technology (FINESST) grant from the agency helped enable Hinkle to search for these black hole events. “The FINESST grant gave Jason the freedom to track down and figure out what these events actually were,” said Ben Shappee, associate professor at the Institute for Astronomy at the University of Hawaii, a study coauthor and advisor to Hinkle.
      Hinkle is set to follow up on these results as a postdoctoral fellow at the University of Illinois Urbana-Champaign through the NASA Hubble Fellowship Program. “One of the biggest questions in astronomy is how black holes grow throughout the universe,” Hinkle said.
      The results complement recent observations from NASA’s James Webb Space Telescope showing how supermassive black holes feed and grow in the early universe. But since only 10% of early black holes are actively eating gas and dust, extreme nuclear transients — that is, catching a supermassive black hole in the act of eating a massive star — are a different way to find black holes in the early universe.
      Events like these are so bright that they may be visible even in the distant, early universe. Swift showed that extreme nuclear transients emit most of their light in the ultraviolet. But as the universe expands, that light is stretched to longer wavelengths and shifts into the infrared — exactly the kind of light NASA’s upcoming Nancy Grace Roman Space Telescope was designed to detect.
      With its powerful infrared sensitivity and wide field of view, Roman will be able to spot these rare explosions from more than 12 billion years ago, when the universe was just a tenth of its current age. Scheduled to launch by 2027, and potentially as early as fall 2026, Roman could uncover many more of these dramatic events and offer a new way to explore how stars, galaxies, and black holes formed and evolved over time.
      “We can take these three objects as a blueprint to know what to look for in the future,” Payne said.
    • By European Space Agency
      From its vantage point outside Earth’s atmosphere, more than 36 000 km above Earth’s surface, the Copernicus Sentinel-4 mission will detect major air pollutants over Europe in unprecedented detail. It will observe how they vary on an hourly basis – a real breakthrough for air quality forecasting.
    • By NASA
Sunlight reflects off the ocean surface near Norfolk, Virginia, in this 1991 space shuttle image, highlighting swirling patterns created by features such as internal waves, which are produced when the tide moves over underwater features. Data from the international SWOT mission is revealing the role of smaller-scale waves and eddies. Credit: NASA
The international mission collects two-dimensional views of smaller waves and currents that are bringing into focus the ocean’s role in supporting life on Earth.
      Small things matter, at least when it comes to ocean features like waves and eddies. A recent NASA-led analysis using data from the SWOT (Surface Water and Ocean Topography) satellite found that ocean features as small as a mile across potentially have a larger impact on the movement of nutrients and heat in marine ecosystems than previously thought.
      Too small to see well with previous satellites but too large to see in their entirety with ship-based instruments, these relatively small ocean features fall into a category known as the submesoscale. The SWOT satellite, a joint effort between NASA and the French space agency CNES (Centre National d’Études Spatiales), can observe these features and is demonstrating just how important they are, driving much of the vertical transport of things like nutrients, carbon, energy, and heat within the ocean. They also influence the exchange of gases and energy between the ocean and atmosphere.
      “The role that submesoscale features play in ocean dynamics is what makes them important,” said Matthew Archer, an oceanographer at NASA’s Jet Propulsion Laboratory in Southern California. Some of these features are called out in the animation below, which was created using SWOT sea surface height data.

      This animation shows small ocean features — including internal waves and eddies — derived from SWOT observations in the Indian, Atlantic, and Pacific oceans, as well as the Mediterranean Sea. White and lighter blue represent higher ocean surface heights compared to darker blue areas. The purple colors shown in one location represent ocean current speeds.
Credit: NASA’s Scientific Visualization Studio
“Vertical currents move heat between the atmosphere and ocean, and in submesoscale eddies, can actually bring up heat from the deep ocean to the surface, warming the atmosphere,” added Archer, who is a coauthor on the submesoscale analysis published in April in the journal Nature. Vertical circulation can also bring up nutrients from the deep sea, supplying marine food webs in surface waters like a steady stream of food trucks supplying festivalgoers.
      “Not only can we see the surface of the ocean at 10 times the resolution of before, we can also infer how water and materials are moving at depth,” said Nadya Vinogradova Shiffer, SWOT program scientist at NASA Headquarters in Washington.
      Fundamental Force
      Researchers have known about these smaller eddies, or circular currents, and waves for decades. From space, Apollo astronauts first spotted sunlight glinting off small-scale eddies about 50 years ago. And through the years, satellites have captured images of submesoscale ocean features, providing limited information such as their presence and size. Ship-based sensors or instruments dropped into the ocean have yielded a more detailed view of submesoscale features, but only for relatively small areas of the ocean and for short periods of time.
      The SWOT satellite measures the height of water on nearly all of Earth’s surface, including the ocean and freshwater bodies, at least once every 21 days. The satellite gives researchers a multidimensional view of water levels, which they can use to calculate, for instance, the slope of a wave or eddy. This in turn yields information on the amount of pressure, or force, being applied to the water in the feature. From there, researchers can figure out how fast a current is moving, what’s driving it and —combined with other types of information — how much energy, heat, or nutrients those currents are transporting.  
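The slope-to-current step described above can be illustrated with a minimal sketch assuming geostrophic balance, the standard approximation in which a sea surface slope balances the Coriolis force (v = (g/f)·Δη/Δx). This is textbook oceanography rather than SWOT project code, and the numbers are invented for the example:

```python
import math

# Sketch of inferring surface current speed from a SWOT-style sea surface
# height (SSH) slope, assuming geostrophic balance: v = (g / f) * d(eta)/dx.
# The helper and its inputs are illustrative, not mission software.
G = 9.81           # gravitational acceleration, m/s^2
OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s

def geostrophic_speed(d_eta_m, d_x_m, lat_deg):
    """Current speed (m/s) from an SSH change d_eta_m over distance d_x_m."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))  # Coriolis parameter
    return (G / abs(f)) * (d_eta_m / d_x_m)

# A 10 cm height change across 50 km at 35 degrees latitude gives a
# current on the order of a quarter of a meter per second.
speed = geostrophic_speed(0.10, 50_000.0, 35.0)
```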
      “Force is the fundamental quantity driving fluid motion,” said study coauthor Jinbo Wang, an oceanographer at Texas A&M University in College Station. Once that quantity is known, a researcher can better understand how the ocean interacts with the atmosphere, as well as how changes in one affect the other.
      Prime Numbers
      Not only was SWOT able to spot a submesoscale eddy in an offshoot of the Kuroshio Current — a major current in the western Pacific Ocean that flows past the southeast coast of Japan — but researchers were also able to estimate the speed of the vertical circulation within that eddy. When SWOT observed the feature, the vertical circulation was likely 20 to 45 feet (6 to 14 meters) per day.
      This is a comparatively small amount for vertical transport. However, the ability to make those calculations for eddies around the world, made possible by SWOT, will improve researchers’ understanding of how much energy, heat, and nutrients move between surface waters and the deep sea.
      Researchers can do similar calculations for such submesoscale features as an internal solitary wave — a wave driven by forces like the tide sloshing over an underwater plateau. The SWOT satellite spotted an internal wave in the Andaman Sea, located in the northeastern part of the Indian Ocean off Myanmar. Archer and colleagues calculated that the energy contained in that solitary wave was at least twice the amount of energy in a typical internal tide in that region.
      This kind of information from SWOT helps researchers refine their models of ocean circulation. A lot of ocean models were trained to show large features, like eddies hundreds of miles across, said Lee Fu, SWOT project scientist at JPL and a study coauthor. “Now they have to learn to model these smaller scale features. That’s what SWOT data is helping with.”
      Researchers have already started to incorporate SWOT ocean data into some models, including NASA’s ECCO (Estimating the Circulation and Climate of the Ocean). It may take some time until SWOT data is fully a part of models like ECCO. But once it is, the information will help researchers better understand how the ocean ecosystem will react to a changing world.
      More About SWOT
      The SWOT satellite was jointly developed by NASA and CNES, with contributions from the Canadian Space Agency (CSA) and the UK Space Agency. Managed for NASA by Caltech in Pasadena, California, JPL leads the U.S. component of the project. For the flight system payload, NASA provided the Ka-band radar interferometer (KaRIn) instrument, a GPS science receiver, a laser retroreflector, a two-beam microwave radiometer, and NASA instrument operations. The Doppler Orbitography and Radioposition Integrated by Satellite system, the dual frequency Poseidon altimeter (developed by Thales Alenia Space), the KaRIn radio-frequency subsystem (together with Thales Alenia Space and with support from the UK Space Agency), the satellite platform, and ground operations were provided by CNES. The KaRIn high-power transmitter assembly was provided by CSA.
      To learn more about SWOT, visit:
      https://swot.jpl.nasa.gov
      News Media Contacts
      Jane J. Lee / Andrew Wang
      Jet Propulsion Laboratory, Pasadena, Calif.
      626-491-1943 / 626-379-6874
      jane.j.lee@jpl.nasa.gov / andrew.wang@jpl.nasa.gov
      2025-070
Details
Last Updated: May 15, 2025
Related Terms: SWOT (Surface Water and Ocean Topography); Jet Propulsion Laboratory; Oceanography; Oceans
    • By NASA
      Editor’s Note: The following is one of three related articles about the NASA Data Acquisition System and related efforts. Please visit Stennis News – NASA to access accompanying articles.
A blended team of NASA personnel and contractors support ongoing development and operation of the NASA Data Acquisition System at NASA’s Stennis Space Center. Team members include, left to right: Andrew Graves (NASA), Shane Cravens (Syncom Space Services), Peggi Marshall (Syncom Space Services), Nicholas Payton Karno (Syncom Space Services), Alex Elliot (NASA), Kris Mobbs (NASA), Brandon Carver (NASA), Richard Smith (Syncom Space Services), and David Carver (NASA). Credit: NASA/Danny Nowlin
Members of the NASA Data Acquisition System team at NASA’s Stennis Space Center evaluate system hardware for use in monitoring and collecting propulsion test data at the site. Credit: NASA/Danny Nowlin
NASA software engineer Alex Elliot, right, and Syncom Space Services software engineer Peggi Marshall fine-tune data acquisition equipment at NASA’s Stennis Space Center by adjusting an oscilloscope to capture precise measurements. Credit: NASA/Danny Nowlin
Syncom Space Services software test engineer Nicholas Payton Karno monitors a lab console at NASA’s Stennis Space Center displaying video footage of an RS-25 engine gimbal test, alongside data acquisition screens showing lab measurements. Credit: NASA/Danny Nowlin
Just as a steady heartbeat is critical to staying alive, propulsion test data is vital to ensure engines and systems perform flawlessly.
      The accuracy of the data produced during hot fire tests at NASA’s Stennis Space Center near Bay St. Louis, Mississippi, tells the performance story.
      So, when NASA needed a standardized way to collect hot fire data across test facilities, an onsite team created an adaptable software tool to do it.
      “The NASA Data Acquisition System (NDAS) developed at NASA Stennis is a forward-thinking solution,” said David Carver, acting chief of the Office of Test Data and Information Management. “It has unified NASA’s rocket propulsion testing under an adaptable software suite to meet needs with room for future expansion, both within NASA and potentially beyond.”
      Before NDAS, contractors conducting test projects used various proprietary tools to gather performance data, which made cross-collaboration difficult. NDAS takes a one-size-fits-all approach, providing NASA with its own system to ensure consistency.
      “Test teams in the past had to develop their own software tools, but now, they can focus on propulsion testing while the NDAS team focuses on developing the software that collects data,” said Carver.
      A more efficient workflow has followed since the software system is designed to work with any test hardware. It allows engineers to seamlessly work between test areas, even when upgrades have been made and hardware has changed, to support hot fire requirements for the agency and commercial customers.
      With the backing and resources of the NASA Rocket Propulsion Test (RPT) Program Office, a blended team of NASA personnel and contractors began developing NDAS in 2011 as part of the agency’s move to resume control of test operations at NASA Stennis. Commercial entities had conducted the operations on NASA’s behalf for several decades.
      The NASA Stennis team wrote the NDAS software code with modular components that function independently and can be updated to meet the needs of each test facility. The team used LabVIEW, a graphical platform that allows developers to build software visually rather than using traditional text-based code.
Syncom Space Services software engineer Richard Smith, front, analyzes test results using the NASA Data Acquisition System Displays interface at NASA’s Stennis Space Center while NASA software engineer Brandon Carver actively tests and develops laboratory equipment. Credit: NASA/Danny Nowlin
NASA engineers, from left to right, Tristan Mooney, Steven Helmstetter, Chase Aubry, and Christoffer Barnett-Woods are shown in the E-1 Test Control Center where the NASA Data Acquisition System is utilized for propulsion test activities. Credit: NASA/Danny Nowlin
NASA engineers Steven Helmstetter, Christoffer Barnett-Woods, and Tristan Mooney perform checkouts on a large data acquisition system for the E-1 Test Stand at NASA’s Stennis Space Center. The data acquisition hardware, which supports testing for E Test Complex commercial customers, is controlled by NASA Data Acquisition System software that allows engineers to view real-time data while troubleshooting hardware configuration. Credit: NASA/Danny Nowlin
NASA engineers Steven Helmstetter, left, and Tristan Mooney work with the NASA Data Acquisition System in the E-1 Test Control Center, where the system is utilized for propulsion test activities. Credit: NASA/Danny Nowlin
“These were very good decisions by the original team looking toward the future,” said Joe Lacher, a previous NASA project manager. “LabVIEW was a new language and is now taught in colleges and widely used in industry. Making the program modular made it adaptable.”
      During propulsion tests, the NDAS system captures both high-speed and low-speed sensor data. The raw sensor data is converted into units for both real-time monitoring and post-test analysis.
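The counts-to-units conversion works like an ordinary per-channel calibration. Below is a minimal sketch of the idea, with channel names and coefficients invented for the example; NDAS itself is written in LabVIEW, so Python here only illustrates the concept:

```python
# Illustrative sketch of converting raw sensor counts into engineering
# units with a per-channel linear calibration (units = gain * counts + offset).
# Channel names and coefficients are hypothetical, not NDAS configuration.
CALIBRATIONS = {
    "chamber_pressure_psi": (0.25, -50.0),  # (gain, offset)
    "lox_temp_K": (0.01, 20.0),
}

def to_engineering_units(channel, raw_counts):
    gain, offset = CALIBRATIONS[channel]
    return gain * raw_counts + offset

reading = to_engineering_units("chamber_pressure_psi", 4000)  # -> 950.0 psi
```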
      During non-test operations, the system monitors the facility and test article systems to help ensure the general health and safety of the facility and personnel.
      “Having quality software for instrumentation and data recording systems is critical and, in recent years, has become increasingly important,” said Tristan Mooney, NASA instrumentation engineer. “Long ago, the systems used less software, or even none at all. Amplifiers were configured with physical knobs, and data was recorded on tape or paper charts. Today, we use computers to configure, display, and store data for nearly everything.”
      Developers demonstrated the new system on the A-2 Test Stand in 2014 for the J-2X engine test project.
      From there, the team rolled it out on the Fred Haise Test Stand (formerly A-1), where it has been used for RS-25 engine testing since 2015. A year later, teams used NDAS on the Thad Cochran Test Stand (formerly B-2) in 2016 to support SLS (Space Launch System) Green Run testing for future Artemis missions.
      One of the project goals for the system is to provide a common user experience to drive consistency across test complexes and centers.
      Kris Mobbs, current NASA project manager for NDAS, said the system “really shined” during the core stage testing. “We ran 24-hour shifts, so we had people from across the test complex working on Green Run,” Mobbs said. “When the different shifts came to work, there was not a big transition needed. Using the software for troubleshooting, getting access to views, and seeing the measurements were very common activities, so the various teams did not have a lot of build-up time to support that test.”
      Following success at the larger test stands, teams started using NDAS in the E Test Complex in 2017, first at the E-2 Test Stand, then on the E-1 and E-3 stands in 2020.
      Growth of the project was “a little overwhelming,” Lacher recalled. The team maintained the software on active stands supporting tests, while also continuing to develop the software for other areas and their many unique requirements.
      Each request for change had to be tracked, implemented into the code, tested in the lab, then deployed and validated on the test stands.
      “This confluence of requirements tested my knowledge of every stand and its uniqueness,” said Lacher. “I had to understand the need, the effort to meet it, and then had to make decisions as to the priorities the team would work on first.”
      Creation of the data system and its ongoing updates have transformed into opportunities for growth among the NASA Stennis teams working together.
      “From a mechanical test operations perspective, NDAS has been a pretty easy system to learn,” said Derek Zacher, NASA test operations engineer. “The developers are responsive to the team’s ideas for improvement, and our experience has consistently improved with the changes that enable us to view our data in new ways.”
      Originally designed to support the RPT office at NASA Stennis, the software is expanding beyond south Mississippi to other test centers, attracting interest from various NASA programs and projects, and garnering attention from government agencies that require reliable and scalable data acquisition. “It can be adopted nearly anywhere, such as aerospace and defense, research and development institutions and more places, where data acquisition systems are needed,” said Mobbs. “It is an ever-evolving solution.”
      Details
Last Updated: May 08, 2025
Editor: NASA Stennis Communications
Contact: C. Lacy Thompson, calvin.l.thompson@nasa.gov / (228) 688-3333
Location: Stennis Space Center
Related Terms: Stennis Space Center
