Understanding Risk, Artificial Intelligence, and Improving Software Quality


The software discipline has broad involvement across each of the NASA Mission Directorates. Some recent discipline focus and development areas are highlighted below, along with a look at the Software Technical Discipline Team’s (TDT) approach to evolving discipline best practices toward the future.

Understanding Automation Risk

Software creates automation, and reliance on that automation is increasing the amount of software in NASA programs. This year, the software team examined historical software incidents in aerospace to characterize how, why, and where software or automation is most likely to fail. The goal is to better engineer software to minimize the risk of errors, improve software processes, and architect software for resilience to errors (or improved fault tolerance should errors occur).

[Charts: categorization of historical aerospace software incidents]

Some key findings, shown in the charts above, indicate that software more often does the wrong thing than simply crashes. Rebooting was found to be ineffective when software behaves erroneously. Unexpected behavior was mostly attributed to the code or logic itself, and about half of those instances were the result of missing software, i.e., software not present because of unanticipated situations or missing requirements. This may indicate that even fully tested software is exposed to this significant class of error. Data misconfiguration was a sizeable factor that continues to grow with the advent of more modern data-driven systems. A final, subjective category assessed was “unknown unknowns”: things that could not reasonably have been anticipated. These accounted for 19% of the software incidents studied.

The software team is using and sharing these findings to improve best practices. More emphasis is being placed on complete requirements, off-nominal test campaigns, and “test as you fly” with real hardware in the loop. When designing systems for fault tolerance, more consideration should be given to detecting and correcting erroneous behavior rather than just checking for a crash, and less confidence should be placed in rebooting as a recovery strategy. Given the historical prevalence of absent software and unknown unknowns, backup strategies should be employed for critical automations. More information can be found in NASA/TP-20230012154, Software Error Incident Categorizations in Aerospace.
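The recommendation to detect erroneous behavior, not merely crashes, can be illustrated with a minimal sketch. Everything here (the controller functions, the plausibility bounds, the fallback) is a hypothetical illustration, not drawn from any NASA system:

```python
def primary_controller(reading: float) -> float:
    # Stand-in for primary automation; here a simple proportional command.
    return 0.5 * reading

def backup_controller(reading: float) -> float:
    # Simpler, independently written fallback automation: clamp to safe range.
    return max(min(reading, 1.0), -1.0)

def plausible(command: float) -> bool:
    # Behavioral check: is the command within physically sensible bounds?
    return -10.0 <= command <= 10.0

def command_with_fallback(reading: float) -> float:
    """Run the primary automation, but validate its output and fall back if it
    returns an implausible value -- not only if it crashes or hangs."""
    try:
        cmd = primary_controller(reading)
    except Exception:
        return backup_controller(reading)   # crash path
    if not plausible(cmd):
        return backup_controller(reading)   # erroneous-behavior path
    return cmd
```

The key design point, following the findings above, is that the fallback triggers on an output-validity check, so a primary that runs to completion while producing a wrong answer is still caught.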

Employing AI and Machine Learning Techniques

The rise of artificial intelligence (AI) and machine learning (ML) techniques has allowed NASA to examine data in ways that were not previously possible. While NASA has employed autonomy since its inception, AI/ML techniques let teams expand the use of autonomy beyond previous bounds. The Agency has been working on AI ethics frameworks and examining standards, procedures, and practices with security implications in mind. Although AI/ML generally relies on nondeterministic statistical algorithms that currently limit its use in safety-critical flight applications, NASA employs it in more than 400 projects aiding research and science. The Agency also runs AI/ML Communities of Practice for sharing knowledge across the centers. The TDT surveyed AI/ML work across the Agency and summarized it for trends and lessons.

Common uses of AI/ML include image recognition and identification. NASA Earth science missions use AI/ML to identify marine debris, measure cloud thickness, and identify wildfire smoke (examples are shown in the satellite images below), reducing the workload on personnel. AI/ML is also being applied to predict atmospheric physics. One example is hurricane track and intensity prediction. Another is predicting planetary boundary layer thickness and comparing it against measurements; those predictions are being fused with live data to improve performance over previous boundary layer models.

[Images: Examples of how NASA uses AI/ML. Satellite images of clouds with estimation of cloud thickness (left) and wildfire detection (right).]
[Image: NASA-HDBK-2203, NASA Software Engineering and Assurance Handbook (https://swehb.nasa.gov)]
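The fusion of model predictions with live measurements mentioned above can be sketched as a simple weighted blend. The function, gain, and numbers here are illustrative assumptions, not NASA's actual boundary-layer algorithm:

```python
def fuse(prediction: float, measurement: float, gain: float = 0.3) -> float:
    """Blend a model prediction with a live observation.

    gain = 0 trusts the model entirely; gain = 1 trusts the measurement.
    """
    return prediction + gain * (measurement - prediction)

# Example: a model predicts a 1200 m boundary layer; an instrument measures 1000 m.
fused = fuse(1200.0, 1000.0, gain=0.3)  # pulled 30% of the way toward the measurement
```

Real systems typically make the gain depend on the relative uncertainty of model and sensor (as in a Kalman filter), but the blending idea is the same.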

The Code Analysis Pipeline: Static Analysis Tool for IV&V and Software Quality Improvement

The Code Analysis Pipeline (CAP) is an open-source tool architecture that supports software development and assurance activities, improving overall software quality. The Independent Verification and Validation (IV&V) Program is using CAP to support software assurance on the Human Landing System, Gateway, Exploration Ground Systems, Orion, and Roman. CAP supports the configuration and automated execution of multiple static code analysis tools to identify potential code defects, generate code metrics that indicate potential areas of quality concern (e.g., cyclomatic complexity), and execute any other tool that analyzes or processes source code. The TDT is focused on integrating Modified Condition/Decision Coverage analysis support for coverage testing. Results from tools are consolidated into a central database and presented in context through a user interface that supports review, query, reporting, and analysis of results as the code matures.
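The consolidation pattern described above (run several analyzers over a source tree, then collect all findings in one central database for query and reporting) can be sketched in a few lines. The tool runners and table schema here are invented for illustration; CAP's actual implementation and tool set differ:

```python
import sqlite3

# Hypothetical per-tool runners; a real pipeline would invoke static analyzers
# and parse their reports. Each returns (file, line, message) findings.
def run_fake_linter(path):
    return [(path, 12, "unused-variable"), (path, 40, "shadowed-name")]

def run_fake_complexity(path):
    return [(path, 1, "cyclomatic-complexity:14")]

TOOLS = {"linter": run_fake_linter, "complexity": run_fake_complexity}

def analyze(path: str, db: sqlite3.Connection) -> None:
    # Consolidate every tool's findings into one central table.
    db.execute("CREATE TABLE IF NOT EXISTS findings"
               "(tool TEXT, file TEXT, line INTEGER, message TEXT)")
    for name, tool in TOOLS.items():
        db.executemany("INSERT INTO findings VALUES (?, ?, ?, ?)",
                       [(name, f, ln, msg) for f, ln, msg in tool(path)])
    db.commit()

db = sqlite3.connect(":memory:")
analyze("src/main.c", db)
# With findings in one database, reviewers can query across tools as code matures.
rows = db.execute("SELECT tool, COUNT(*) FROM findings GROUP BY tool").fetchall()
```

Storing results centrally, rather than leaving each tool's report in its own format, is what enables the cross-tool review, query, and trend analysis the article describes.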

The tool architecture is based on an industry standard DevOps approach for continuous building of source code and running of tools. CAP integrates with GitHub for source code control, uses Jenkins to support automation of analysis builds, and leverages Docker to create standard and custom build environments that support unique mission needs and use cases.

Improving Software Process & Sharing Best Practices

The TDT has captured best-practice knowledge from across the centers in NPR 7150.2, NASA Software Engineering Requirements, and NASA-HDBK-2203, NASA Software Engineering and Assurance Handbook (https://swehb.nasa.gov). Two APPEL training classes have been developed and shared with several organizations to provide foundations in the NPR and in software engineering management. The TDT established several subteams to help programs and projects as they tackle software architecture, project management, requirements, cybersecurity, testing and verification, and programmable logic controllers. Many of these teams have developed guidance and best practices, which are documented in NASA-HDBK-2203 and on the NASA Engineering Network.

NPR 7150.2 and the handbook outline best practices over the full lifecycle for all NASA software, including requirements development, architecture, design, implementation, and verification. Also covered, and equally important, are the supporting activities and functions that improve quality, including software assurance, safety, configuration management, reuse, and software acquisition. Rationale and guidance for the requirements are addressed in the handbook, which is accessible both internally and externally and is regularly updated as new information, tools, and techniques emerge.

The Software TDT deputies train software engineers, systems engineers, chief engineers, and project managers on the NPR requirements and their role in ensuring these requirements are implemented across NASA centers. Additionally, the TDT deputies train software technical leads on many of the advanced management aspects of a software engineering effort, including planning, cost estimating, negotiating, and handling change management.

