
  • Similar Topics

    • By NASA
      4 min read
      Entrepreneurs Challenge Winner PRISM is Using AI to Enable Insights from Geospatial Data
PRISM’s platform uses AI segmentation to identify and highlight residential structures in a neighborhood.
NASA sponsored Entrepreneurs Challenge events in 2020, 2021, and 2023 to invite small business start-ups to showcase innovative ideas and technologies with the potential to advance the agency’s science goals. To potentially leverage external funding sources for the development of innovative technologies of interest to NASA, the agency’s Science Mission Directorate (SMD) involved the venture capital community in Entrepreneurs Challenge events. Challenge winners were awarded prize money, and in 2023 the total Entrepreneurs Challenge prize value was $1M. Numerous challenge winners have subsequently refined their products and/or received funding from NASA and external sources (e.g., other government agencies or the venture capital community) to further develop their technologies.
      One 2023 Entrepreneurs Challenge winner, PRISM Intelligence (formerly known as Pegasus Intelligence and Space), is using artificial intelligence (AI) and other advances in computer vision to create a new platform that could provide geospatial insights to a broad community.
      Every day, vast amounts of remote sensing data are collected through satellites, drones, and aerial imagery, but for most businesses and individuals, accessing and extracting meaningful insights from this data is nearly impossible.  
      The company’s product—Personal Real-time Insight from Spatial Maps, a.k.a. PRISM—is transforming geospatial data into an easy-to-navigate, queryable world. By leveraging 3D computer vision, geospatial analytics, and AI-driven insights, PRISM creates photorealistic, up-to-date digital environments that anyone can interact with. Users can simply log in and ask natural-language questions to instantly retrieve insights—no advanced Geographic Information System (GIS) expertise is required.
      For example, a pool cleaner looking for business could use PRISM to search for all residential pools in a five-mile radius. A gardener could identify overgrown trees in a community. City officials could search for potholes in their jurisdiction to prioritize repairs, enhance public safety, and mitigate liability risks. This broad level of accessibility brings geospatial intelligence out of the hands of a few and into everyday decision making.
      The core of PRISM’s platform uses radiance fields to convert raw 2D imagery into high-fidelity, dynamic 3D visualizations. These models are then enhanced with AI-powered segmentation, which autonomously identifies and labels objects in the environment—such as roads, vehicles, buildings, and natural features—allowing for seamless search and analysis. The integration of machine learning enables PRISM to refine its reconstructions continuously, improving precision with each dataset. This advanced processing ensures that the platform remains scalable, efficient, and adaptable to various data sources, making it possible to produce large-scale, real-time digital twins of the physical world.
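The kind of search described above — e.g., “all residential pools in a five-mile radius” — reduces to a radius query over AI-labeled detections once segmentation has tagged each object with a class and a location. The sketch below is purely illustrative and is not PRISM’s implementation; the `LabeledObject` class, the labels, and the coordinates are hypothetical stand-ins for segmentation output.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class LabeledObject:
    label: str   # class assigned by AI segmentation, e.g. "pool", "tree", "pothole"
    lat: float
    lon: float

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/lon points, in miles.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3958.8 * 2 * asin(sqrt(a))  # 3958.8 mi ≈ Earth's mean radius

def query(objects, label, center, radius_miles):
    # Return every labeled object of the requested class within the radius.
    lat, lon = center
    return [o for o in objects
            if o.label == label
            and haversine_miles(lat, lon, o.lat, o.lon) <= radius_miles]

# Toy scene: two pools and a tree near Pomona, CA (hypothetical coordinates).
scene = [
    LabeledObject("pool", 34.06, -117.82),   # ~0.7 miles from center
    LabeledObject("pool", 34.30, -117.82),   # ~17 miles from center
    LabeledObject("tree", 34.06, -117.82),
]
hits = query(scene, "pool", (34.05, -117.82), 5.0)
print(len(hits))  # → 1: the distant pool and the tree are filtered out
```

A production system would use a spatial index (e.g., an R-tree) rather than a linear scan, and a natural-language front end would translate the user’s question into the `label`/`center`/`radius` parameters.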
The PRISM platform’s interface showcasing a 3D digital twin of California State Polytechnic University, Pomona, with AI-powered search and insights.
“It’s great being able to push the state of the art in this relatively new domain of radiance fields, evolving it from research to applications that can impact common tasks. From large sets of images, PRISM creates detailed 3D captures that embed more information than the source pictures.” — Maximum Wilder-Smith, Chief Technology Officer, PRISM Intelligence
      Currently the PRISM platform uses proprietary data gathered from aerial imagery over selected areas. PRISM then generates high-resolution digital twins of cities in select regions. The team is aiming to eventually expand the platform to use NASA Earth science data and commercial data, which will enable high-resolution data capture over larger areas, significantly increasing efficiency, coverage, and update frequency. PRISM aims to use the detailed multiband imagery that NASA provides and the high-frequency data that commercial companies provide to make geospatial intelligence more accessible by providing fast, reliable, and up-to-date insights that can be used across multiple industries.
      What sets PRISM apart is its focus on usability. While traditional GIS platforms require specialized training to use, PRISM eliminates these barriers by allowing users to interact with geospatial data through a frictionless, conversational interface.
The impact of this technology could extend across multiple industries. Professionals in the insurance and appraisal industries have told the company that the ability to generate precise, 3D assessments of properties could streamline risk evaluations, reduce costs, and improve accuracy—replacing outdated or manual site visits. Similarly, local governments have indicated they could potentially use PRISM to better manage infrastructure, track zoning compliance, and allocate resources based on real-time, high-resolution urban insights. Additionally, scientists could use the consistent updates and layers of three-dimensional data that PRISM can provide to better understand changes to ecosystems and vegetation.
      As PRISM moves forward, the team’s focus remains on scaling its capabilities and expanding its applications. Currently, the team is working to enhance the technical performance of the platform while also adding data sources to enable coverage of more regions. Future iterations will further improve automation of data processing, increasing the speed and efficiency of real-time 3D reconstructions. The team’s goal is to expand access to geospatial insights, ensuring that anyone—from city planners to business owners—can make informed decisions using the best possible data.
PRISM Intelligence founders Zachary Gaines, Hugo Delgado, and Maximum Wilder-Smith in their California State Polytechnic University, Pomona lab, where the company was first formed.








Details
Last Updated: Apr 21, 2025
Related Terms: Earth Science Division, Earth Science, Science-enabling Technology, Technology Highlights
    • By NASA
Tess Caswell, a stand-in crew member for the Artemis III Virtual Reality Mini-Simulation, executes a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. The simulation was a test of using VR as a training method for flight controllers and science teams’ collaboration on science-focused traverses on the lunar surface. Credit: NASA/Robert Markowitz
When astronauts walk on the Moon, they’ll serve as the eyes, hands, and boots-on-the-ground interpreters supporting the broader teams of scientists on Earth. NASA is leveraging virtual reality to provide high-fidelity, cost-effective support to prepare crew members, flight control teams, and science teams for a return to the Moon through its Artemis campaign.
The Artemis III Geology Team, led by principal investigator Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, participated in an Artemis III Surface Extra-Vehicular VR Mini-Simulation, or “sim,” at NASA’s Johnson Space Center in Houston in the fall of 2024. The sim brought together science teams and flight directors and controllers from Mission Control to carry out science-focused moonwalks and test the way the teams communicate with each other and the astronauts.
      “There are two worlds colliding,” said Dr. Matthew Miller, co-lead for the simulation and exploration engineer, Amentum/JETSII contract with NASA. “There is the operational world and the scientific world, and they are becoming one.”
      NASA mission training can include field tests covering areas from navigation and communication to astronaut physical and psychological workloads. Many of these tests take place in remote locations and can require up to a year to plan and large teams to execute. VR may provide an additional option for training that can be planned and executed more quickly to keep up with the demands of preparing to land on the Moon in an environment where time, budgets, and travel resources are limited.
“VR helps us break down some of those limitations and allows us to do more immersive, high-fidelity training without having to go into the field. It provides us with a lot of different, and significantly more, training opportunities,” said Bri Sparks, NASA co-lead for the simulation and Extra Vehicular Activity Extended Reality team at Johnson.
Field testing won’t be going away. Nothing can fully replace the experience crew members gain by being in an environment that puts literal rocks in their hands and includes the physical challenges that come with moonwalks, but VR has competitive advantages.
      The virtual environment used in the Artemis III VR Mini-Sim was built using actual lunar surface data from one of the Artemis III candidate regions. This allowed the science team to focus on Artemis III science objectives and traverse planning directly applicable to the Moon. Eddie Paddock, engineering VR technical discipline lead at NASA Johnson, and his team used data from NASA’s Lunar Reconnaissance Orbiter and planet position and velocity over time to develop a virtual software representation of a site within the Nobile Rim 1 region near the south pole of the Moon. Two stand-in crew members performed moonwalk traverses in virtual reality in the Prototype Immersive Technology lab at Johnson, and streamed suit-mounted virtual video camera views, hand-held virtual camera imagery, and audio to another location where flight controllers and science support teams simulated ground communications.
A screen capture of a virtual reality view during the Artemis III VR Mini-Simulation. The lunar surface virtual environment was built using actual lunar surface data from one of the Artemis III candidate regions. Credit: Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston.
The crew stand-ins were immersed in the lunar environment and could then share the experience with the science and flight control teams. That quick and direct feedback could prove critical to the science and flight control teams as they work to build cohesive teams despite very different approaches to their work.
      The flight operations team and the science team are learning how to work together and speak a shared language. Both teams are pivotal parts of the overall mission operations. The flight control team focuses on maintaining crew and vehicle safety and minimizing risk as much as possible. The science team, as Miller explains, is “relentlessly thirsty” for as much science as possible. Training sessions like this simulation allow the teams to hone their relationships and processes.
Members of the Artemis III Geology Team and science support team work in a mock Science Evaluation Room during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Video feeds from the stand-in crew members’ VR headsets allow the science team to follow, assess, and direct moonwalks and science activities. Credit: NASA/Robert Markowitz
Denevi described the flight control team as a “well-oiled machine” and praised their dedication to getting it right for the science team. Many members of the flight control team have participated in field and classroom training to learn more about geology and better understand the science objectives for Artemis.
      “They have invested a lot of their own effort into understanding the science background and science objectives, and the science team really appreciates that and wants to make sure they are also learning to operate in the best way we can to support the flight control team, because there’s a lot for us to learn as well,” Denevi said. “It’s a joy to get to share the science with them and have them be excited to help us implement it all.”
Artemis III Geology Team lead Dr. Brett Denevi of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, left, Artemis III Geology Team member, Dr. Jose Hurtado, University of Texas at El Paso, and simulation co-lead, Bri Sparks, work together during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
This simulation, Sparks said, was just the beginning for how virtual reality could supplement training opportunities for Artemis science. In the future, using mixed reality could help take the experience to the next level, allowing crew members to be fully immersed in the virtual environment while interacting with real objects they can hold in their hands. Now that the Nobile Rim 1 landing site is built in VR, it can continue to be improved and used for crew training, something that Sparks said can’t be done with field training on Earth.
      While “virtual” was part of the title for this exercise, its applications are very real.
      “We are uncovering a lot of things that people probably had in the back of their head as something we’d need to deal with in the future,” Miller said. “But guess what? The future is now. This is now.”
Grier Wilt, left, and Tess Caswell, crew stand-ins for the Artemis III Virtual Reality Mini-Simulation, execute a moonwalk in the Prototype Immersive Technology (PIT) lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
Engineering VR technical discipline lead Eddie Paddock works with team members to facilitate the virtual reality components of the Artemis III Virtual Reality Mini-Simulation in the Prototype Immersive Technology lab at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz
Flight director Paul Konyha follows moonwalk activities during the Artemis III Virtual Reality Mini-Simulation at NASA’s Johnson Space Center in Houston. Credit: NASA/Robert Markowitz




      Rachel Barry
      NASA’s Johnson Space Center
    • By European Space Agency
      The methane emitted in 2022 by the damaged Nord Stream gas pipelines was more than double the volume estimated at the time, according to a study published in Nature.
    • By European Space Agency
      A list of the top 10 global regions where natural or anthropogenic sources emit methane on a continuous, ‘persistent’ basis was recently published in a scientific journal.
    • By USH
      The American Meteor Society website shared a video on their channel showing a fireball streaking across the skies of Michigan and Ohio on Sunday, January 19, 2025, around 01:31 UT. 

However, the Meteor Society noted that the video might not actually depict a fireball event, leaving some viewers curious about the meaning behind this statement.
      At the moment the fireball appears on camera, a strange object seems to materialize above it, expanding in size and partially obscuring the fireball before gradually fading out as the fireball continues its path through the sky. 
      This phenomenon has sparked varied interpretations. Some suggest it might indicate alien intervention, while others offer a more plausible explanation: the "object" is likely a water droplet on the camera lens, creating the illusion of interaction with the fireball. 
Still, since the Meteor Society suggested the video might not depict a fireball event at all, one might question whether the object was truly a fireball, some other phenomenon, a water droplet on the lens, or something entirely different.