
5 min read

Preparations for Next Moonwalk Simulations Underway (and Underwater)

This artist’s concept shows astronauts working on the Moon alongside different technology systems. The Data & Reasoning Fabric technology could help these systems operate in harmony, supporting the astronauts and ground control on Earth.
Credit: NASA

Imagine your car is in conversation with other traffic and road signals as you travel. Those conversations help your car anticipate actions you can’t see: the sudden slowing of a truck as it begins to turn ahead of you, or an obscured traffic signal turning red. Meanwhile, this system has plotted a course that will drive you toward a station to recharge or refuel, while a conversation with a weather service prepares your windshield wipers and brakes for the rain ahead.

This trip requires a lot of communication among systems from companies, government agencies, and organizations. How might these different entities – each with their own proprietary technology – share data securely in real time to make your trip safe, efficient, and enjoyable?

Technologists at NASA’s Ames Research Center in California’s Silicon Valley created a framework called Data & Reasoning Fabric (DRF), a set of software infrastructure, tools, protocols, governance, and policies that allow safe, secure data sharing and logical prediction-making across different operators and machines. Originally developed with a focus on providing autonomous aviation drones with decision-making capabilities, DRF is now being explored for other applications.

This means that one day, DRF-informed technology could allow your car to receive traffic data safely and securely from nearby stoplights and share data with other vehicles on the road. In this scenario, DRF is the choreographer of a complex dance of moving objects, ensuring each moves seamlessly in relation to the others toward a shared goal. The system is designed to create an integrated environment, combining data from systems that would otherwise be unable to interact with each other.
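The article doesn’t publish DRF’s actual interfaces, but the “integrated environment” idea can be pictured as a small publish/subscribe fabric: each operator translates its proprietary data into a shared schema and posts it to the fabric, and any authorized consumer reacts to it without knowing the producer’s internals. The sketch below is a toy illustration of that pattern in Python; the names (DataFabric, Message, the traffic topic) are assumptions for this example, not NASA’s API.

```python
# Toy publish/subscribe "fabric" illustrating the data-sharing pattern described
# above. Hypothetical names throughout; this is not DRF's real interface.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Message:
    topic: str      # e.g. "traffic/signal/5th_and_main"
    payload: dict   # provider data already translated into a shared schema


@dataclass
class DataFabric:
    subscribers: Dict[str, List[Callable[[Message], None]]] = field(default_factory=dict)

    def subscribe(self, topic: str, handler: Callable[[Message], None]) -> None:
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, message: Message) -> None:
        # Every consumer sees the same message, regardless of which
        # proprietary system produced the underlying data.
        for handler in self.subscribers.get(message.topic, []):
            handler(message)


fabric = DataFabric()

# A vehicle from one vendor reacts to a stoplight operated by another entity.
def on_signal(msg: Message) -> None:
    if msg.payload["state"] == "red":
        print("Vehicle: braking for an obscured red light ahead")

fabric.subscribe("traffic/signal/5th_and_main", on_signal)
fabric.publish(Message("traffic/signal/5th_and_main", {"state": "red", "seconds_remaining": 12}))
```

In a real deployment, the governance and policy layers the article mentions would decide who may publish or subscribe to each topic; the toy broker above skips that entirely.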

“DRF is built to be used behind the scenes,” said David Alfano, chief of the Intelligent Systems Division at Ames. “Companies are developing autonomous technology, but their systems aren’t designed to work with technology from competitors. The DRF technology bridges that gap, organizing these systems to work together in harmony.”

Traffic enhancements are just one use case for this innovative system. The technology could enhance how we use autonomy to support human needs on Earth, in the air, and even on the Moon.

Supporting Complex Logistics

To illustrate the technology’s impact, the DRF team worked with the city of Phoenix on an aviation solution to improve transportation of critical medical supplies from urban areas out to rural communities with limited access to these resources. An autonomous system identified where supplies were needed and directed a drone to pick up and transport supplies quickly and safely.

“All the pieces need to come together, which takes a lot of effort. The DRF technology provides a framework where suppliers, medical centers, and drone operators can work together efficiently,” said Moustafa Abdelbaky, senior computer scientist at Ames. “The goal isn’t to remove human involvement, but to help humans achieve more.”
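The Phoenix pilot’s dispatch logic isn’t described in detail here, so the following is only a minimal sketch of the decision the paragraph implies: given reported supply needs and a pool of drones, match each request to the nearest available aircraft. All class and field names are hypothetical.

```python
# Simplified stand-in for "identify where supplies are needed and direct a
# drone to transport them." Not the actual DRF or Phoenix pilot software.
import math
from dataclasses import dataclass


@dataclass
class Request:
    clinic: str
    lat: float
    lon: float
    item: str


@dataclass
class Drone:
    drone_id: str
    lat: float
    lon: float
    available: bool = True


def distance_km(a_lat, a_lon, b_lat, b_lon):
    # Rough equirectangular approximation; fine for a short-range example.
    dx = math.radians(b_lon - a_lon) * math.cos(math.radians((a_lat + b_lat) / 2))
    dy = math.radians(b_lat - a_lat)
    return 6371.0 * math.hypot(dx, dy)


def assign(requests, drones):
    """Greedy assignment: each request gets the nearest still-available drone."""
    plan = {}
    for req in requests:
        candidates = [d for d in drones if d.available]
        if not candidates:
            break
        best = min(candidates, key=lambda d: distance_km(d.lat, d.lon, req.lat, req.lon))
        best.available = False
        plan[req.clinic] = best.drone_id
    return plan


requests = [Request("rural_clinic_a", 33.81, -112.12, "antivenom")]
drones = [Drone("dr-1", 33.45, -112.07), Drone("dr-2", 33.61, -111.93)]
print(assign(requests, drones))   # {'rural_clinic_a': 'dr-2'}
```

A production system would also weigh payload capacity, battery range, weather, and airspace constraints; the point here is only that the matching step needs data from suppliers, medical centers, and drone operators in one place.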

The DRF technology is part of a larger effort at Ames to develop concepts that enable autonomous operations while integrating them into the public and commercial sectors to create safer, more efficient environments.

“At NASA, we’re always learning something. There’s a silver lining when one project ends: you can identify a new lesson learned, a new application, or a new economic opportunity to continue and scale that work,” said Supreet Kaur, lead systems engineer at Ames. “And because we leverage all of the knowledge we’ve gained through these experiments, we are able to make future research more robust.”

Choreographed Autonomy

Industries like modern mining rely on a variety of autonomous and advanced vehicles and machinery, but these systems face the challenge of communicating well enough to operate in the same area. The DRF technology’s “choreography” might help them work together, improving efficiency. Researchers met with a commercial mining company to learn what issues it struggles with when using autonomous equipment and to identify where DRF might provide future solutions.

“If an autonomous drill is developed by one company, but the haul trucks are developed by another, those two machines are dancing to two different sets of music. Right now, they need to be kept apart manually for safety,” said Johnathan Stock, chief scientist for innovation at the Ames Intelligent Systems Division. “The DRF technology can harmonize their autonomous work so these mining companies can use autonomy across the board to create a safer, more effective enterprise.”
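DRF’s deconfliction methods aren’t spelled out in the article, but the safety problem Stock describes, keeping a drill and haul trucks from different vendors apart, can be illustrated with a simple shared separation check. Assume each machine reports its planned position through a common fabric; the sketch below, with hypothetical names and a toy right-of-way rule, flags any pair closer than a safety radius.

```python
# Hypothetical illustration of a shared separation check between autonomous
# machines from different vendors. Not DRF's actual deconfliction logic.
import math
from dataclasses import dataclass
from itertools import combinations


@dataclass
class PlannedState:
    machine_id: str
    vendor: str
    x_m: float      # planned easting within the pit, meters
    y_m: float      # planned northing within the pit, meters
    priority: int   # higher value keeps right of way in this toy policy


SAFETY_RADIUS_M = 50.0


def check_separation(states):
    """Return (yielding_machine, other_machine) pairs closer than the safety radius."""
    conflicts = []
    for a, b in combinations(states, 2):
        if math.hypot(a.x_m - b.x_m, a.y_m - b.y_m) < SAFETY_RADIUS_M:
            yielding = min(a, b, key=lambda s: s.priority)
            other = b if yielding is a else a
            conflicts.append((yielding.machine_id, other.machine_id))
    return conflicts


states = [
    PlannedState("drill-07", "vendor_a", 120.0, 80.0, priority=2),
    PlannedState("haul-31", "vendor_b", 150.0, 95.0, priority=1),
]
print(check_separation(states))   # [('haul-31', 'drill-07')]: the truck holds short
```

The value of a fabric like DRF in this picture is simply that both machines’ plans are visible in one place, so the check can run at all.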

Further testing of DRF on equipment like that used in mines could take place at the NASA Ames Roverscape, a test surface that includes obstacles such as slopes and rocks, where DRF’s choreography could be put to the test.

Stock also envisions DRF improving operations on the Moon. Autonomous vehicles could transport materials, drill, and excavate, while launch vehicles come and go. These operations will likely include systems from different companies or industries and could be choreographed by DRF.

As autonomous systems and technologies increase across markets, on Earth, in orbit, and on the Moon, DRF researchers are ready to step on the dance floor to make sure everything runs smoothly.

“When everyone’s dancing to the same tune, things run seamlessly, and more is possible.”


Details

Last Updated
Mar 20, 2025


View the full article


  • Similar Topics

    • By USH
      The photograph was captured by the Mast Camera (Mastcam) aboard NASA’s Curiosity rover on Sol 3551 (August 2, 2022, at 20:43:28 UTC). 

What stands out in the image are two objects that appear strikingly out of place amid the natural Martian landscape of rocks and boulders. Their sharp edges, right angles, flat surfaces, and geometric symmetry suggest they may have been shaped by advanced cutting tools rather than natural erosion. 

      Could these ancient remnants be part of a destroyed structure or sculpture? If so, they may serve as yet another piece of evidence pointing to the possibility that Mars was once home to an intelligent civilization, perhaps even the advanced humanoid beings who, according to some theories, fled the catastrophic destruction of planet Maldek and sought refuge on the Red Planet. 
Objects discovered by Jean Ward.
      Watch Jean Ward's YouTube video on this topic: Here
      See original NASA source: Here
      View the full article
    • By European Space Agency
      ESA Impact: Pick of our spring space snaps

      View the full article
    • By NASA
NASA named Stanford University of Stanford, California, winner of the Lunar Autonomy Challenge, a six-month competition for U.S. college and university student teams to virtually map and explore the lunar surface using a digital twin of NASA’s In-Situ Resource Utilization Pilot Excavator (IPEx). 
      The winning team successfully demonstrated the design and functionality of their autonomous agent, or software that performs specified actions without human intervention. Their agent autonomously navigated the IPEx digital twin in the virtual lunar environment, while accurately mapping the surface, correctly identifying obstacles, and effectively managing available power.
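The winning agent’s software isn’t reproduced in this article, so the sketch below is only a stand-in for the behavior described: a sense-plan-act loop that explores, avoids obstacles, and protects battery margin. The LunarSimStub interface is invented for illustration and is not the challenge’s actual digital-twin API.

```python
# Toy sense-plan-act loop mirroring the behaviors described above
# (navigate, avoid obstacles, manage power). Hypothetical interface.
import random
from dataclasses import dataclass


@dataclass
class Observation:
    battery_fraction: float   # 0.0-1.0 remaining charge
    obstacle_ahead: bool      # from simulated perception
    pose: tuple               # (x, y, heading)


class LunarSimStub:
    """Stand-in for the digital-twin environment; not the real challenge API."""
    def __init__(self):
        self.battery = 1.0
        self.pose = (0.0, 0.0, 0.0)

    def observe(self) -> Observation:
        return Observation(self.battery, random.random() < 0.2, self.pose)

    def act(self, command: str) -> None:
        cost = {"drive": 0.01, "turn": 0.005, "charge": -0.05, "idle": 0.0}[command]
        self.battery = max(0.0, min(1.0, self.battery - cost))


def agent_step(obs: Observation) -> str:
    """Toy policy: protect power first, then avoid obstacles, then keep exploring."""
    if obs.battery_fraction < 0.2:
        return "charge"
    if obs.obstacle_ahead:
        return "turn"
    return "drive"


sim = LunarSimStub()
for _ in range(10):
    sim.act(agent_step(sim.observe()))
```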
Lunar simulation developed by the Lunar Autonomy Challenge’s first-place team from Stanford University. Credit: Stanford University’s NAV Lab team
      Team photo of the NAV Lab Lunar Autonomy Challenge team from Stanford University. Credit: Stanford University’s NAV Lab team
      “The Lunar Autonomy Challenge has been a truly unique experience. The challenge provided the opportunity to develop and test methods in a highly realistic simulation environment.”
Adam Dai
      Lunar Autonomy Challenge team lead, Stanford University

      Dai added, “It pushed us to find solutions robust to the harsh conditions of the lunar surface. I learned so much through the challenge, both about new ideas and methods, as well as through deepening my understanding of core methods across the autonomy stack (perception, localization, mapping, planning). I also very much enjoyed working together with my team to brainstorm different approaches and strategies and solve tangible problems observed in the simulation.” 
      The challenge offered 31 teams a valuable opportunity to gain experience in software development, autonomy, and machine learning using cutting-edge NASA lunar technology. Participants also applied essential skills common to nearly every engineering discipline, including technical writing, collaborative teamwork, and project management.
      The Lunar Autonomy Challenge supports NASA’s Lunar Surface Innovation Initiative (LSII), which is part of the Space Technology Mission Directorate. The LSII aims to accelerate technology development and pursue results that will provide essential infrastructure for lunar exploration by collaborating with industry, academia, and other government agencies.
“The work displayed by all of these teams has been impressive, and the solutions they have developed are beneficial to advancing lunar and Mars surface technologies as we prepare for increasingly complex missions farther from home.” 
      Niki Werkheiser
      Director of Technology Maturation and LSII lead, NASA Headquarters
      “To succeed, we need input from everyone — every idea counts to propel our goals forward. It is very rewarding to see these students and software developers contributing their skills to future lunar and Mars missions,” Werkheiser added.  
      Through the Lunar Autonomy Challenge, NASA collaborated with the Johns Hopkins Applied Physics Laboratory, Caterpillar Inc., and Embodied AI. Each team contributed unique expertise and tools necessary to make the challenge a success.
      The Applied Physics Laboratory managed the challenge for NASA. As a systems integrator for LSII, they provided expertise to streamline rigor and engineering discipline across efforts, ensuring the development of successful, efficient, and cost-effective missions — backed by the world’s largest cohort of lunar scientists. 
Caterpillar Inc. is known for its construction and excavation equipment and operates a large fleet of autonomous haul trucks. The company has also worked with NASA for more than 20 years on a variety of technologies, including autonomy, 3D printing, robotics, and simulators, and it continues to collaborate with NASA on technologies that support the agency’s mission objectives and provide value to the mining and construction industries. 
Embodied AI collaborated with Caterpillar to integrate the simulation into the open-source CARLA driving environment used for the challenge. For the Lunar Autonomy Challenge, the normally available digital assets of the CARLA simulation platform, such as urban layouts, buildings, and vehicles, were replaced by an IPEx “Digital Twin” and lunar environmental models.
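For readers unfamiliar with CARLA, the snippet below shows how its standard open-source Python client connects to a simulation server and spawns an asset from the blueprint library; in the challenge, those stock urban assets would have been swapped for the IPEx digital twin and lunar terrain. The host, port, and blueprint used here are CARLA defaults, not the challenge’s private configuration.

```python
# Basic use of the open-source CARLA Python client (requires a running CARLA
# server). Stock assets shown; the challenge replaced them with lunar models.
import carla

client = carla.Client("localhost", 2000)   # default CARLA server host and RPC port
client.set_timeout(10.0)
world = client.get_world()

# In stock CARLA the blueprint library holds urban assets such as vehicles;
# for the Lunar Autonomy Challenge these were replaced by the IPEx digital twin.
blueprint_library = world.get_blueprint_library()
vehicle_bp = blueprint_library.filter("vehicle.*")[0]

spawn_point = world.get_map().get_spawn_points()[0]
actor = world.spawn_actor(vehicle_bp, spawn_point)
print("Spawned:", actor.type_id)
actor.destroy()
```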
      “This collaboration is a great example of how the government, large companies, small businesses, and research institutions can thoughtfully leverage each other’s different, but complementary, strengths,” Werkheiser added. “By substantially modernizing existing tools, we can turn today’s novel technologies into tomorrow’s institutional capabilities for more efficient and effective space exploration, while also stimulating innovation and economic growth on Earth.”

      FINALIST TEAMS
      First Place
      NAV Lab team
      Stanford University, Stanford, California


      Second Place
      MAPLE (MIT Autonomous Pathfinding for Lunar Exploration) team
Massachusetts Institute of Technology, Cambridge, Massachusetts


      Third Place
      Moonlight team
Carnegie Mellon University, Pittsburgh, Pennsylvania
      OTHER COMPETING TEAMS
Lunar Explorers
      Arizona State University, Tempe, Arizona

      AIWVU
      West Virginia University, Morgantown, West Virginia

      Stellar Sparks
      California Polytechnic Institute Pomona, Pomona, California

      LunatiX
      Johns Hopkins University Whiting School of Engineering, Baltimore

      CARLA CSU
      California State University, Stanislaus, Turlock, California

      Rose-Hulman
      Rose-Hulman Institute of Technology, Terre Haute, Indiana

      Lunar Pathfinders
      American Public University System, Charles Town, West Virginia

      Lunar Autonomy Challenge digital simulation of lunar surface activity using a digital twin of NASA’s ISRU Pilot Excavator. Credit: Johns Hopkins Applied Physics Laboratory
      View the full article
    • By NASA
      2 min read
      Preparations for Next Moonwalk Simulations Underway (and Underwater)
Nemanja Jovanovic, lead instrument scientist at Caltech, presents at the Emerging Technologies for Astrophysics workshop, held at NASA’s Ames Research Center in California’s Silicon Valley. The workshop brought together experts in astrophysics to discuss how advanced technologies could impact future mission planning.
      Credit: NASA/Donald Richey
      The future of astrophysics research could unlock the secrets of the universe, and emerging technologies like artificial intelligence, quantum sensing, and advanced materials may hold the key to faster, more efficient discovery. Advancements and implementations of new technologies are imperative for observational astrophysics to achieve the next level of detection.
      NASA’s Emerging Technologies for Astrophysics workshop brought together subject matter experts from industry, government, and academia to explore the state of new and disruptive technologies. The meeting was an effort to identify specific applications for astrophysics missions and better understand how their infusion into future NASA space telescopes could be accelerated.
The workshop took place at NASA’s Ames Research Center in California’s Silicon Valley, supporting the agency’s efforts to enable partnerships with public and private industry and collaborative mission planning.
      “The profound questions about the nature of our universe that astrophysics at NASA answers require giant leaps in technology,” explained Mario Perez, chief technologist for the Astrophysics Division at NASA Headquarters in Washington. “Spotting potential in early-stage tech by encouraging discussions between imaginative researchers helps expand the scope of science and lessen the time required to achieve the next generation of astrophysics missions.”
      Emerging technologies like artificial intelligence can support the design and optimization of future missions, and participants focused efforts on combining technologies to push research further. “Cross-pollination” of advanced materials like composites with advanced manufacturing, metamaterials, and photonic chips could support advancement in imaging missions beyond existing mechanical stability needs.
      The United Nations Educational, Scientific and Cultural Organization (UNESCO) has dubbed 2025 the “International Year of Quantum Science and Technology” in recognition of a century of quantum mechanics. Workshop participants discussed how quantum sensing could enable more precise measurements, achieve “super resolution” by filling in missing details in lower resolution images, and provide greater capabilities in forthcoming space telescopes.
      “This gathering of experts was an opportunity to find ways where we can increase the capabilities of future space instrumentation and accelerate technology development for infusion into NASA astrophysics missions,” said Naseem Rangwala, astrophysics branch chief at NASA Ames. “We can speed up the process of how we develop these future projects by using the emerging technologies that are incubated right here in Silicon Valley.”
      The findings from this workshop and ongoing discussions will support efforts to study and invest in technologies to advance astrophysics missions with greater speed and efficiency.
      About the Author
      Tara Friesen

      Details
Last Updated: Apr 29, 2025
      Related Terms: Ames Research Center, Astrophysics, Astrophysics Division, General, Science Mission Directorate
      View the full article
    • By USH
      For over 80 years, covert research into exotic propulsion, anti-gravity systems, and spacetime manipulation has been housed within deep black programs, classified efforts shielded from both public and congressional oversight. 

Now, on April 14, 2025, Michael Kratsios, the new White House science chief, made a bold claim: “Our technologies permit us to manipulate time and space...” Shortly after, he doubled down, promising innovations that would let us “bend time and space” and “drive us further into the endless frontier.” These weren’t offhand remarks; they were published on the official White House site, signaling intent. 
      What does "Manipulating Spacetime" really mean? Spacetime is the four-dimensional framework of our universe. Per Einstein’s theory, mass and energy warp this fabric, creating gravity and affecting time. To manipulate it would mean bending reality itself, shortening distances, warping time, or enabling faster-than-light travel. 
Just days before Kratsios’ remarks, President Trump said: “We have a weapon that no one has a clue what it is... more powerful than anything even close.” Was he referring to a spacetime weapon? 
      Trump isn’t the first high-level figure to hint at such capabilities. Back in 2019, Lt. Gen. Steve Kwast publicly discussed technology capable of transporting a person anywhere on Earth in under an hour, suggesting real-world applications of physics far beyond current norms. He also touched on wireless, space-based energy transmission. 
      Rumors have long circulated about transatmospheric vehicles, craft capable of seamless operation both within Earth’s atmosphere and in space. Though unconfirmed, these platforms may represent a technological bridge between known aerospace systems and genuine spacetime engineering. (Consider Gary McKinnon’s 2002 discovery during his hack of U.S. military systems: references to a secret space fleet and "non-terrestrial officers.") 
      But it is not only about manipulating time and space. 
What might they also have?
      Anti-Gravity Propulsion: Altering inertia with plasma or exotic materials, referenced in Navy patents.
      Warp Drives: Bending space around a craft to move without motion.
      Zero-Point Energy: Tapping the quantum vacuum for limitless energy, a paradigm-shifting source of power. 
But why do some groups want to keep it secret? There are compelling reasons for secrecy, none of them rooted in public interest: 
Control of Power – Whoever controls this tech controls the future.
      Economic Impact – It would collapse the fossil fuel, aviation, and defense sectors.
      Weaponization Risk – These tools could be catastrophic in the wrong hands.
      Psychological Shock – It would rewrite everything we know about science and our place in the cosmos. 
      Despite growing testimony and a trove of leaked documents, officials continue to dismiss these claims. The Deep State line remains unchanged: “No empirical evidence exists for reverse-engineering extraterrestrial technology.” But the evidence says otherwise. 
Supporting evidence:
      1. Exotic materials reportedly recovered in the 1950s, held by Lockheed.
      2. The 1953 Robertson Panel set the tone for decades of deliberate obfuscation, publicly debunking UFOs while secretly studying their implications. The CIA used Project Blue Book to publicly debunk UFOs.
      3. As early as 1966, the U.S. Air Force reportedly managed over 30 classified anti-gravity projects.
      4. A 1971 Australian Defense report referenced America’s "Advanced Saucer Aircraft" and a Cold War “UFO crash program” into anti-gravity propulsion.
      5. The US government, through the CIA’s Office of Global Access (OGA), is reported to have a secret program to retrieve and reverse-engineer crashed UFOs. This program, which began in 2003, is said to have recovered at least nine non-human aircraft, some of which were intact. The OGA works with special operations forces like SEAL teams to conduct these retrievals, keeping the operations highly secret.
      6. The CIA allegedly blocked a 2024 transfer of exotic materials from Lockheed to Bigelow Aerospace. 
      Ben Rich, former head of Lockheed Skunk Works, reportedly stated: “We now have the technology to take ET home.” 
      Don Phillips, also from Lockheed, confirmed reverse-engineering efforts related to recovered UFO craft, allegedly including materials from the infamous 1947 Roswell incident.  
Dr. Salvatore Pais, a Navy scientist, filed patents (2016–2019) for highly unconventional devices, including a Space-Time Modification Weapon. These patents describe the use of electromagnetic fields, plasma, and rotational force fields. Theoretically, such a device could produce effects more powerful than hydrogen bombs. The Navy invested USD 508,000 testing the concept between 2016 and 2019. 
      But what could be the reason they are starting to reveal it now? The sudden shift toward public statements about advanced capabilities seems deliberate. 
Consider the possible motives:
      1. Strategic Signaling: A subtle warning to adversaries: “We possess technology beyond your reach.”
      2. Controlled Disclosure: Shaping the narrative gradually to maintain public trust and institutional control.
      3. Leaks Are Coming: Private-sector breakthroughs or whistleblowers may soon expose the truth.
      4. Justifying Black Budgets: Revealing exotic tech lends credibility to decades of hidden spending under national security. 
But perhaps the most compelling reason: a major event, whether real, staged, or cosmic in nature, perhaps even an alien contact scenario, is on the horizon. This may be phase one of psychological preparation. 
Finally, the evidence suggests that these exotic advanced technologies already exist, whether reverse-engineered or the result of disruptive physics breakthroughs. But what’s happening now isn’t full disclosure. It’s a carefully managed narrative operation, an information war cloaked in the language of advanced science. 
References and must watch:
      Alex Jones and Top Deep State / COG Researcher Daniel Liszt: https://x.com/RealAlexJones/status/1913354709106098659
      Richard Dolan: https://www.youtube.com/watch?v=qd7CIe5wnwQ
      View the full article