
Understanding Risk, Artificial Intelligence, and Improving Software Quality


NASA


The software discipline has broad involvement across each of the NASA Mission Directorates. Some recent discipline focus and development areas are highlighted below, along with a look at the Software Technical Discipline Team’s (TDT) approach to evolving discipline best practices toward the future.

Understanding Automation Risk

Software creates automation, and reliance on that automation is increasing the amount of software in NASA programs. This year, the software team examined historical software incidents in aerospace to characterize how, why, and where software or automation is most likely to fail. The goal is to better engineer software to minimize the risk of errors, improve software processes, and better architect software for resilience to errors (or to improve fault tolerance should errors occur).

[Figure: charts categorizing historical software incidents in aerospace]

Some key findings, shown in the charts above, indicate that software more often does the wrong thing rather than simply crashing. Rebooting was found to be ineffective when software behaves erroneously. Unexpected behavior was mostly attributed to the code or logic itself, and about half of those instances were the result of missing software, that is, software not present due to unanticipated situations or missing requirements. This may indicate that even fully tested software is exposed to this significant class of error. Data misconfiguration was a sizeable factor that continues to grow with the advent of more modern data-driven systems. A final subjective category assessed was “unknown unknowns”: things that could not have been reasonably anticipated. These accounted for 19% of the software incidents studied.

The software team is using and sharing these findings to improve best practices. More emphasis is being placed on the importance of complete requirements, off-nominal test campaigns, and “test as you fly” approaches using real hardware in the loop. When designing systems for fault tolerance, more consideration should be given to detecting and correcting erroneous behavior rather than just checking for a crash. Less confidence should be placed in rebooting as an effective recovery strategy. Backup strategies for automation should be employed for critical applications, given the historical prevalence of absent software and unknown unknowns. More information can be found in NASA/TP-20230012154, Software Error Incident Categorizations in Aerospace.
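To make this guidance concrete, the minimal Python sketch below (purely illustrative, not drawn from any NASA system) shows one common pattern: a reasonableness check on an automation's output with a fallback to a simpler backup strategy. The controller functions, the rate limit, and the backup behavior are all hypothetical.

# Illustrative sketch only: a reasonableness check with a backup strategy,
# assuming a hypothetical attitude-control automation. Names and limits are invented.

MAX_RATE_DEG_S = 5.0  # hypothetical plausibility limit on a commanded slew rate

def primary_controller(sensor_data):
    """Hypothetical primary automation; may return an erroneous command."""
    return sensor_data["error_deg"] * 0.8  # placeholder control law

def backup_controller(sensor_data):
    """Simpler, independently developed fallback (e.g., hold attitude)."""
    return 0.0

def command_is_reasonable(cmd_rate):
    """Detect wrong-but-not-crashed behavior, not just exceptions."""
    return abs(cmd_rate) <= MAX_RATE_DEG_S

def select_command(sensor_data):
    try:
        cmd = primary_controller(sensor_data)
    except Exception:
        # A crash is the easy case; erroneous output is the harder one.
        return backup_controller(sensor_data)
    if not command_is_reasonable(cmd):
        # Erroneous behavior detected: fall back rather than reboot and retry.
        return backup_controller(sensor_data)
    return cmd

if __name__ == "__main__":
    print(select_command({"error_deg": 2.0}))    # nominal path
    print(select_command({"error_deg": 100.0}))  # reasonableness check trips, backup used

The point of the pattern is that the check runs on the automation's output, so wrong-but-running behavior is caught; a watchdog or reboot alone would only catch the crash case.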

Employing AI and Machine Learning Techniques

The rise of artificial intelligence (AI) and machine learning (ML) techniques has allowed NASA to examine data in new ways that were not previously possible. While NASA has employed autonomy since its inception, AI/ML techniques give teams the ability to expand the use of autonomy beyond previous bounds. The Agency has been working on AI ethics frameworks and examining standards, procedures, and practices, taking security implications into account. While AI/ML generally relies on nondeterministic statistical algorithms that currently limit its use in safety-critical flight applications, NASA uses it in more than 400 AI/ML projects aiding research and science. The Agency also uses AI/ML Communities of Practice to share knowledge across the centers. The TDT surveyed AI/ML work across the Agency and summarized trends and lessons learned.

Common uses of AI/ML include image recognition and identification. NASA Earth science missions use AI/ML to identify marine debris, measure cloud thickness, and identify wildfire smoke (examples are shown in the satellite images below), reducing the workload on personnel. AI/ML is also being applied to predict atmospheric physics. One example is hurricane track and intensity prediction. Another is predicting planetary boundary layer thickness and comparing it against measurements; those predictions are being fused with live data to improve performance over previous boundary layer models.

[Figure: Examples of how NASA uses AI/ML. Satellite images of clouds with estimation of cloud thickness (left) and wildfire detection (right).]
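As a purely illustrative sketch of the image-recognition use case described above, the following minimal PyTorch snippet defines a small convolutional classifier for a two-class task such as wildfire smoke versus no smoke in satellite image tiles. The architecture, tile size, and class labels are assumptions for illustration and do not represent any actual NASA model.

# Minimal, illustrative sketch of a binary image classifier (e.g., smoke vs. no smoke).
# Architecture and tile size are assumptions; this is not an actual NASA model.
import torch
import torch.nn as nn

class TileClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

if __name__ == "__main__":
    model = TileClassifier()
    tiles = torch.randn(4, 3, 64, 64)   # batch of hypothetical 64x64 RGB image tiles
    logits = model(tiles)
    print(logits.argmax(dim=1))          # predicted class per tile

In practice, such a model would be trained on labeled imagery and validated against independent observations before its outputs were used to reduce analyst workload.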
[Figure: NASA-HDBK-2203, NASA Software Engineering and Assurance Handbook (https://swehb.nasa.gov)]

The Code Analysis Pipeline: Static Analysis Tool for IV&V and Software Quality Improvement

The Code Analysis Pipeline (CAP) is an open-source tool architecture that supports software development and assurance activities, improving overall software quality. The Independent Verification and Validation (IV&V) Program is using CAP to support software assurance on the Human Landing System, Gateway, Exploration Ground Systems, Orion, and Roman. CAP supports the configuration and automated execution of multiple static code analysis tools to identify potential code defects, generate code metrics that indicate potential areas of quality concern (e.g., cyclomatic complexity), and execute any other tool that analyzes or processes source code. The TDT is focused on integrating Modified Condition/Decision Coverage analysis support for coverage testing. Results from tools are consolidated into a central database and presented in context through a user interface that supports review, query, reporting, and analysis of results as the code matures.
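As a hedged sketch of the consolidation step (not CAP's actual implementation), the snippet below shows how findings from multiple static analysis tools could be normalized and stored in a central database for later review, query, and reporting. The tool names, JSON output fields, and database schema are hypothetical.

# Hedged sketch of consolidating static-analysis findings into a central database.
# This is not CAP's actual code; tool names, output fields, and schema are hypothetical.
import json
import sqlite3
import subprocess

def run_tool(command: list) -> list:
    """Run a hypothetical analyzer that prints JSON findings to stdout."""
    out = subprocess.run(command, capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

def store_findings(db_path: str, tool: str, findings: list) -> None:
    """Normalize findings from one tool into a shared results table."""
    con = sqlite3.connect(db_path)
    con.execute(
        """CREATE TABLE IF NOT EXISTS findings (
               tool TEXT, file TEXT, line INTEGER, rule TEXT, severity TEXT)"""
    )
    con.executemany(
        "INSERT INTO findings VALUES (?, ?, ?, ?, ?)",
        [(tool, f.get("file"), f.get("line"), f.get("rule"), f.get("severity"))
         for f in findings],
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    # Hypothetical example finding; a real run would call run_tool() per analyzer.
    sample = [{"file": "src/gnc.c", "line": 42, "rule": "MISRA-10.3", "severity": "warning"}]
    store_findings("cap_results.db", "analyzer_a", sample)

A real pipeline would also record the commit or build identifier alongside each finding so that results can be tracked as the code matures.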

The tool architecture is based on an industry-standard DevOps approach for continuously building source code and running analysis tools. CAP integrates with GitHub for source code control, uses Jenkins to automate analysis builds, and leverages Docker to create standard and custom build environments that support unique mission needs and use cases.

Improving Software Process & Sharing Best Practices

The TDT has captured best-practice knowledge from across the centers in NPR 7150.2, NASA Software Engineering Requirements, and NASA-HDBK-2203, NASA Software Engineering and Assurance Handbook (https://swehb.nasa.gov). Two APPEL training classes have been developed and shared with several organizations to provide a foundation in the NPR and in software engineering management. The TDT established several subteams to help programs and projects as they tackle software architecture, project management, requirements, cybersecurity, testing and verification, and programmable logic controllers. Many of these teams have developed guidance and best practices, which are documented in NASA-HDBK-2203 and on the NASA Engineering Network.

NPR 7150.2 and the handbook outline best practices over the full lifecycle for all NASA software. This includes requirements development, architecture, design, implementation, and verification. Also covered, and equally important, are the supporting activities and functions that improve quality, including software assurance, safety, configuration management, reuse, and software acquisition. Rationale and guidance for the requirements are provided in the handbook, which is accessible both internally and externally and is regularly updated as new information, tools, and techniques emerge.

The Software TDT deputies train software engineers, systems engineers, chief engineers, and project managers on the NPR requirements and their role in ensuring these requirements are implemented across NASA centers. Additionally, the TDT deputies train software technical leads on many of the advanced management aspects of a software engineering effort, including planning, cost estimating, negotiation, and change management.

