By NASA
Landing on the Moon is not easy, particularly when a crew or spacecraft must meet exacting requirements. For Artemis missions to the lunar surface, those requirements include an ability to land within an area about as wide as a football field in any lighting condition amid tough terrain.
NASA’s official lunar landing requirement is to land within 50 meters (164 feet) of the targeted site, so developing precision landing tools and technologies is critically important to mission success.
NASA engineers recently took a major step toward safe and precise landings on the Moon – and eventually Mars and icy worlds – with a successful field test of hazard detection technology at NASA’s Kennedy Space Center Shuttle Landing Facility in Florida.
A joint team from the Aeroscience and Flight Mechanics Division at NASA’s Johnson Space Center in Houston and Goddard Space Flight Center in Greenbelt, Maryland, achieved this milestone in tests of the Goddard Hazard Detection Lidar from a helicopter at Kennedy in March 2025.
NASA’s Hazard Detection Lidar field test team at Kennedy Space Center’s Shuttle Landing Facility in Florida in March 2025. NASA
The new lidar system is one of several sensors being developed as part of NASA’s Safe & Precise Landing – Integrated Capabilities Evolution (SPLICE) Program, a Johnson-managed cross-agency initiative under the Space Technology Mission Directorate to develop next-generation landing technologies for planetary exploration. SPLICE is an integrated descent and landing system composed of avionics, sensors, and algorithms that support specialized navigation, guidance, and image processing techniques. SPLICE is designed to enable landing in hard-to-reach and unknown areas that are of potentially high scientific interest.
The lidar system, which can map an area equivalent to two football fields in just two seconds, is a crucial program component. Compensating for lander motion in real time, it processes 15 million short pulses of laser light to quickly scan surfaces and create 3D maps of landing sites that support precision landing and hazard avoidance.
Those maps will be read by the SPLICE Descent and Landing Computer, a high-performance multicore processor unit that analyzes all SPLICE sensor data, determines the spacecraft’s velocity and altitude, identifies terrain hazards, and selects a safe landing location. The computer was developed by the Avionics Systems Division at Johnson as a platform to test navigation, guidance, and flight software. It previously flew on Blue Origin’s New Shepard booster rocket.
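The kind of slope-and-roughness screening the computer performs on those 3D maps can be illustrated with a short sketch. This is a simplified toy illustration of the idea, not NASA's flight algorithm: the elevation grid, thresholds, and scoring rule are all hypothetical.

```python
import numpy as np

def find_safe_site(elev, cell=1.0, max_slope_deg=10.0, max_rough=0.2):
    """Screen an elevation map (meters) for slope and roughness hazards,
    then return the (row, col) of the best remaining landing cell.
    Simplified illustration only -- not NASA's flight software."""
    # Local slope (degrees) from elevation gradients.
    gy, gx = np.gradient(elev, cell)
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    # Roughness: elevation spread within each 3x3 neighborhood.
    win = np.lib.stride_tricks.sliding_window_view(elev, (3, 3))
    rough = win.std(axis=(-1, -2))
    s = slope[1:-1, 1:-1]                      # align with interior cells
    safe = (s <= max_slope_deg) & (rough <= max_rough)
    if not safe.any():
        return None                            # no safe site in this map
    score = np.where(safe, s + rough, np.inf)  # prefer flat AND smooth
    r, c = np.unravel_index(np.argmin(score), score.shape)
    return r + 1, c + 1                        # back to full-map indices
```

On a flat grid with a simulated boulder, the selected site falls outside the boulder's footprint; on a steep uniform ramp, no cell passes the slope check.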
The NASA team prepares the Descent and Landing Computer for Hazard Detection Lidar field testing at Kennedy Space Center. NASA
For the field test at Kennedy, Johnson led test operations and provided avionics and guidance, navigation, and control support. Engineers updated the computer’s firmware and software to support command and data interfacing with the lidar system. Team members from Johnson’s Flight Mechanics branch also designed a simplified motion compensation algorithm, and NASA’s Jet Propulsion Laboratory in Southern California contributed a hazard detection algorithm, both of which were added to the lidar software by Goddard. NASA contractors Draper Laboratories and Jacobs Engineering also played key roles in the test’s success.
Primary flight test objectives were achieved on the first day of testing, allowing the lidar team time to explore different settings and firmware updates to improve system performance. The data confirmed the sensor’s capability in a challenging, vibration-heavy environment, producing usable maps. Preliminary review of the recorded sensor data shows excellent reconstruction of the hazard field terrain.
A Hazard Detection Lidar scan of a simulated hazard field at Kennedy Space Center (left) and a combined 3D map identifying roughness and slope hazards. NASA
Beyond lunar applications, SPLICE technologies are being considered for use on Mars Sample Return, the Europa Lander, Commercial Lunar Payload Services flights, and Gateway. The Descent and Landing Computer design is also being evaluated for potential avionics upgrades on Artemis systems.
Additionally, SPLICE is supporting software tests for the Advancement of Geometric Methods for Active Terrain Relative Navigation (ATRN) Center Innovation Fund project, which is also part of Johnson’s Aeroscience and Flight Mechanics Division. The ATRN is working to develop algorithms and software that can use data from any active sensor – one measuring signals that were reflected, refracted, or scattered by a body’s surface or its atmosphere – to accurately map terrain and provide absolute and relative location information. With this type of system in place, spacecraft will not need external lighting sources to find landing sites.
With additional suborbital flight tests planned through 2026, the SPLICE team is laying the groundwork for safer, more autonomous landings on the Moon, Mars, and beyond. As NASA prepares for its next era of exploration, SPLICE will be a key part of the agency’s evolving landing, guidance, and navigation capabilities.
-
By NASA
Intuitive Machines’ IM-2 captured an image March 6, 2025, after landing in a crater near the Moon’s South Pole. The lunar lander is on its side near the intended landing site, Mons Mouton. In the center of the image, between the two lander legs, is the Polar Resources Ice Mining Experiment 1 suite, with its drill deployed. Intuitive Machines
NASA’s PRIME-1 (Polar Resources Ice Mining Experiment 1) mission was designed to demonstrate technologies to help scientists better understand lunar resources ahead of crewed Artemis missions to the Moon. During its short-lived mission on the Moon, the performance of PRIME-1’s technology gave NASA teams reason to celebrate.
“The PRIME-1 mission proved that our hardware works in the harshest environment we’ve ever tested it in,” said Janine Captain, PRIME-1 co-principal investigator and research chemist at NASA’s Kennedy Space Center in Florida. “While it may not have gone exactly to plan, this is a huge step forward as we prepare to send astronauts back to the Moon and build a sustainable future there.”
Intuitive Machines’ IM-2 mission launched to the Moon on Feb. 26, 2025, from NASA Kennedy’s Launch Complex 39A, as part of the company’s second Moon delivery for NASA under the agency’s CLPS (Commercial Lunar Payload Services) initiative and Artemis campaign. The IM-2 Nova-C lunar lander, named Athena, carried PRIME-1 and its suite of two instruments: a drill known as TRIDENT (The Regolith and Ice Drill for Exploring New Terrain), designed to bring lunar soil to the surface; and a mass spectrometer, Mass Spectrometer Observing Lunar Operations (MSOLO), to study TRIDENT’s drill cuttings for the presence of gases that could one day help provide propellant or breathable oxygen to future Artemis explorers.
The IM-2 mission touched down on the lunar surface on March 6, about 1,300 feet (400 meters) from its intended landing site of Mons Mouton, a lunar plateau near the Moon’s South Pole. The Athena lander came to rest on its side inside a crater, preventing it from recharging its solar cells and ending the mission.
“We were supposed to have 10 days of operation on the Moon, and what we got was closer to 10 hours,” said Julie Kleinhenz, NASA’s lead systems engineer for PRIME-1, as well as the in-situ resource utilization system capability lead deputy for the agency. “It was 10 hours more than most people get so I am thrilled to have been a part of it.”
Kleinhenz has spent nearly 20 years working on how to use lunar resources for sustained operations. In-situ resource utilization harnesses local natural resources at mission destinations. This enables fewer launches and resupply missions and significantly reduces the mass, cost, and risk of space exploration. With NASA poised to send humans back to the Moon and on to Mars, generating products for life support, propellants, construction, and energy from local materials will become increasingly important to future mission success.
“In-situ resource utilization is the key to unlocking long-term exploration, and PRIME-1 is helping us lay this foundation for future travelers,” Captain said.
The PRIME-1 technology also set out to answer questions about the properties of lunar regolith, such as soil strength. This data could help inform the design of in-situ resource utilization systems that would use local resources to create everything from landing pads to rocket fuel during Artemis and later missions.
“Once we got to the lunar surface, TRIDENT and MSOLO both started right up, and performed perfectly. From a technology demonstrations standpoint, 100% of the instruments worked,” Kleinhenz said.
The lightweight, low-power augering drill built by Honeybee Robotics, known as TRIDENT, is 1 meter long and features rotary and percussive actuators that convert energy into the force needed to drill. The drill was designed to stop at any depth as commanded from the ground and deposit its sample on the surface for analysis by MSOLO, a commercial off-the-shelf mass spectrometer modified by engineers and technicians at NASA Kennedy to withstand the harsh lunar environment. Designed to measure the composition of gases in the vicinity of the lunar lander, both from the lander and from the ambient exosphere, MSOLO can help NASA analyze the chemical makeup of the lunar soil and study water on the surface of the Moon.
Once on the Moon, the actuators on the drill performed as designed, completing multiple stages of movement necessary to drill into the lunar surface. Prompted by commands from technicians on Earth, the auger rotated, the drill extended to its full range, the percussion system performed a hammering motion, and the PRIME-1 team turned on an embedded core heater in the drill and used internal thermal sensors to monitor the temperature change.
While MSOLO was able to perform several scans to detect gases, researchers believe from the initial data that the gases detected were all anthropogenic, or human in origin, such as gases vented from spacecraft propellants and traces of Earth water. Data from PRIME-1 accounted for some of the approximately 7.5 gigabytes of data collected during the IM-2 mission, and researchers will continue to analyze the data in the coming months and publish the results.
-
By NASA
Multinational corporations are using the M2M Intelligence platform in data centers and other settings. The system offers automated, secure communications on a ground-based global 5G network. Getty Images
Artificial intelligence (AI) is advancing rapidly, as intelligent software proves capable of various tasks. The technology usually requires a “human in the loop” to train it and ensure accuracy. But long before the arrival of today’s generative artificial intelligence, a different kind of AI was born with the help of NASA’s Ames Research Center in California’s Silicon Valley — one that only exists between machines, running without any human intervention.
In 2006, Geoffrey Barnard founded Machine-to-Machine Intelligence Corp. (M2Mi) at Ames’ NASA Research Park, envisioning an automated, satellite-based communication network. NASA Ames established a Space Act Agreement with the company to develop artificial intelligence that would automate communications, privacy, security, and resiliency between satellites and ground-based computers.
Central to the technology was automating a problem-solving approach known as root cause analysis, which NASA has honed over decades. This methodology seeks to identify not only the immediate cause of a problem but also all the factors that contributed to the cause. This would allow a network to identify its own issues and fix itself.
NASA Ames’ director of nanotechnology at the time wanted to develop a communications network based on small, low-powered satellites, so Ames supported M2Mi in developing the necessary technology.
Barnard, now CEO and chief technology officer of the Tiburon, California-based M2Mi, said NASA’s support laid the foundation for his company, which employs the same technology in a ground-based network.
The company’s M2M Intelligence software performs secure, resilient, automated communications on a system that runs across hundreds of networks, connecting thousands of devices, many of which were not built to communicate with each other. M2Mi worked with Vodafone of Berkshire, England, to build a worldwide network across more than 500 smaller networks in over 190 countries. The companies M2M Wireless and TriGlobal have begun using M2M Intelligence for transportation logistics.
With NASA’s help, emerging industries are getting the boost they need to rapidly develop technologies to enhance our lives.
Last Updated Apr 29, 2025.
-
By NASA
Navigation Technology
ESA astronaut Matthias Maurer sets up an Astrobee for the ReSWARM experiment. Credits: NASA
Science in Space, April 2025
Humans have always been explorers, venturing by land and sea into unknown and uncharted places on Earth and, more recently, in space. Early adventurers often navigated by the Sun and stars, creating maps that made it easier for others to follow. Today, travelers on Earth have sophisticated technology to guide them.
Navigation in space, including for missions to explore the Moon and Mars, remains more of a challenge. Research on the International Space Station is helping NASA scientists improve navigation tools and processes for crewed spacecraft and remotely controlled or autonomous robots to help people boldly venture farther into space, successfully explore there, and safely return home.
NASA astronaut Nichole Ayers talks to students on the ground using ham radio equipment. NASA
A current investigation, NAVCOM, uses the space station’s ISS Ham Radio program hardware to test software for a system that could shape future lunar navigation. The technology processes signals in the same way as global navigation satellite systems such as GPS, but while those rely on constellations of satellites, the NAVCOM radio equipment receives position and time information from ground stations and reference clocks.
The old made new
ESA astronaut Alexander Gerst operates the Sextant Navigation device. NASA
Sextant Navigation tested star-sighting from space using a hand-held sextant. These mechanical devices measure the angle between two objects, typically the Sun (or other stars at night) and the horizon. Sextants guided navigators on Earth for centuries, and NASA’s Gemini and Apollo missions demonstrated that they were useful in space as well, meaning they could provide emergency backup navigation for lunar missions. Researchers report that with minimal training and practice, crew members of different skill levels produced quality sightings through a station window, and that measurements improved with use. The investigation identified several techniques for improving sightings, including refocusing between readings and adjusting the sight to the center of the window.
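The arithmetic behind a classic sextant sight is simple enough to sketch. Assuming an idealized observation (no refraction, dip, parallax, or instrument-error corrections, none of which the article discusses), the observed altitude of a body fixes the observer on a circle around the body's geographic position:

```python
def circle_of_position(observed_altitude_deg):
    """Reduce a sextant altitude to a circle of equal altitude.

    The zenith distance (90 degrees minus the observed altitude),
    expressed in arc-minutes, equals the observer's distance in
    nautical miles from the body's geographic position. Idealized:
    ignores refraction, dip, parallax, and instrument error.
    """
    zenith_distance_deg = 90.0 - observed_altitude_deg
    return zenith_distance_deg * 60.0  # nautical miles
```

An altitude of 89 degrees, for example, places the observer 60 nautical miles from the body's geographic position; crossing two such circles from different bodies yields a position fix.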
Navigating by neutron stars
The station’s NICER instrument studies the nature and behavior of neutron stars, the densest objects in the universe. Some neutron stars, known as pulsars, emit beams of light that appear to pulse, sweeping across the sky as the stars rotate. Some pulse at rates as accurate as atomic clocks. As part of the NICER investigation, the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) tested technology for using pulsars in GPS-like systems to navigate anywhere in the solar system. SEXTANT completed the first in-space demonstration of this technology in 2017. In 2018, researchers reported that real-time, autonomous X-ray pulsar navigation is clearly feasible, and they plan further experiments to fine-tune and modify the technology.
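The GPS-like geometry can be sketched in a few lines. In an idealized, noise-free setting, a pulse from each pulsar arrives early or late by (n · r)/c, where n is the unit vector toward the pulsar and r is the offset from a reference position; three or more pulsars give a solvable linear system. This is a toy illustration of the principle, not the SEXTANT flight software, which must also handle timing noise, pulse-phase ambiguity, and relativistic corrections:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def position_from_pulsars(dirs, delays):
    """Recover a 3D position offset from pulse time-of-arrival delays.

    Each pulsar's pulses arrive early or late by (n . r) / c, where n
    is the unit vector toward the pulsar and r the offset from a
    reference position. With three or more pulsars the delays form a
    linear system, solved here by least squares.
    """
    A = np.asarray(dirs, dtype=float)        # one unit vector per row
    b = np.asarray(delays, dtype=float) * C  # seconds -> meters
    r, *_ = np.linalg.lstsq(A, b, rcond=None)
    return r
```

Feeding the function delays synthesized from a known offset recovers that offset, which is the essence of the in-space demonstration described above.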
Robot navigation
Crews on future space exploration missions need efficient and safe ways to handle cargo and to move and assemble structures on the surface of the Moon or Mars. Robots are promising tools for these functions but must be able to navigate their surroundings, whether autonomously or via remote control, often in proximity with other robots and within the confines of a spacecraft. Several investigations have focused on improving navigation by robotic helpers.
NASA astronaut Michael Barratt (left) and JAXA astronaut Koichi Wakata perform a check of the SPHERES robots. NASA
The SPHERES investigation tested autonomous rendezvous and docking maneuvers with three spherical free-flying robots on the station. Researchers reported development of an approach to control how the robots navigate around obstacles and along a designated path, which could support their use in the future for satellite servicing, vehicle assembly, and spacecraft formation flying.
NASA astronaut Megan McArthur with the three Astrobee robots. NASA
The station later gained three cube-shaped robots known as Astrobees. The ReSWARM experiments used them to test coordination of multiple robots with each other, cargo, and their environment. Results provide a base set of planning and control tools for robotic navigation in close proximity and outline important considerations for the design of future autonomous free-flyers.
Researchers also used the Astrobees to show that models to predict the robots’ behavior could make it possible to maneuver one or two of them for carrying cargo. This finding suggests that robots can navigate around each other to perform tasks without a human present, which would increase their usefulness on future missions.
ESA astronaut Samantha Cristoforetti working on the Surface Avatar experiment. ESA
Surface Avatar, an investigation from ESA (European Space Agency), evaluated orbit-to-ground remote control of multiple robots. Crew members successfully navigated a four-legged robot, Bert, through a simulated Mars environment. Robots with legs rather than wheels could explore uneven lunar and planetary surfaces that are inaccessible to wheeled rovers. The German Aerospace Center is developing Bert.
-
By NASA
Drones were a key part of testing new technology in support of a prescribed burn in Geneva State Forest, which is about 100 miles south of Montgomery, Alabama. The effort is part of the agency’s multi-year FireSense project, which is aimed at testing technologies that could eventually serve the U.S. Forest Service as well as local, state, and other federal wildland fire agencies. From left are Tim Wallace and Michael Filicchia of the Desert Research Institute in Nevada; Derek Abramson, Justin Hall, and Alexander Jaffe of NASA’s Armstrong Flight Research Center in Edwards, California; and Alana Dachtler of International Met Systems of Kentwood, Michigan. NASA/Jackie Shuman
Advancements in NASA’s airborne technology have made it possible to gather localized wind data and assess its impacts on smoke and fire behavior. This information could improve wildland fire decision making and enable operational agencies to better allocate firefighters and resources. A small team from NASA’s Armstrong Flight Research Center in Edwards, California, is demonstrating how some of these technologies work.
Two instruments from NASA’s Langley Research Center in Hampton, Virginia – a sensor gathering 3D wind data and a radiosonde that measures temperature, barometric pressure, and humidity data – were installed on NASA Armstrong’s Alta X drone for a prescribed burn in Geneva State Forest, which is about 100 miles south of Montgomery, Alabama. The effort is part of the agency’s multi-year FireSense project, which is aimed at testing technologies that could eventually serve the U.S. Forest Service as well as local, state, and other federal wildland fire agencies.
“The objectives for the Alta X portion of the multi-agency prescribed burn include a technical demonstration for wildland fire practitioners, and data collection at various altitudes for the Alabama Forestry Commission operations,” said Jennifer Fowler, FireSense project manager. “Information gathered at the different altitudes is essential to monitor the variables for a prescribed burn.”
Those variables include the mixing height, which is the extent or depth to which smoke will be dispersed, a metric Fowler said is difficult to predict. Humidity must also be above 30% for a prescribed burn. The technology to collect these measurements locally is not readily available in wildland fire operations, making the Alta X and its instruments key in the demonstration of prescribed burn technology.
A drone from NASA’s Armstrong Flight Research Center, Edwards, California, flies with a sensor to gather 3D wind data and a radiosonde from NASA’s Langley Research Center in Hampton, Virginia, that measures temperature, barometric pressure, and humidity. The drone and instruments supported a prescribed burn in Geneva State Forest, which is about 100 miles south of Montgomery, Alabama. The effort is part of the agency’s multi-year FireSense project, which is aimed at testing technologies that could eventually serve the U.S. Forest Service as well as local, state, and other federal wildland fire agencies. International Met Systems/Alana Dachtler
In addition to the Alta X flights beginning March 25, NASA Armstrong’s B200 King Air will fly over actively burning fires at an altitude of about 6,500 feet. Sensors onboard other aircraft supporting the mission will fly at lower altitudes during the fire, and at higher altitudes before and after the fire for required data collection. The multi-agency mission will provide data to confirm and adjust the prescribed burn forecast model.
Small, uncrewed aircraft system pilots from NASA Armstrong completed final preparations to travel to Alabama and set up for the research flights. The team – including Derek Abramson, chief engineer for the subscale flight research laboratory; Justin Hall, NASA Armstrong chief pilot of small, uncrewed aircraft systems; and Alexander Jaffe, a drone pilot – will set up, fly, and observe airborne operations, all while keeping additional aircraft batteries charged. The launch and recovery of the Alta X are manual, but the mission profile is flown autonomously to guarantee the same conditions for data collection.
“The flight profile is vertical – straight up and straight back down from the surface to about 3,000 feet altitude,” Abramson said. “We will characterize the mixing height and changes in moisture, mapping out how they both change throughout the day in connection with the burn.”
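The mixing-height characterization Abramson describes can be sketched from a vertical temperature sounding. This is a simplified illustration, not the FireSense processing chain: it approximates potential temperature using the dry adiabatic lapse rate and uses a hypothetical detection threshold.

```python
def mixing_height(alt_m, temp_c, threshold_k=0.5):
    """Estimate mixing height from a vertical temperature sounding.

    Within the mixed layer, potential temperature is nearly constant
    with height; it begins to rise at the layer top. Potential
    temperature is approximated here as T + (dry adiabatic lapse
    rate) * z. Returns the first altitude where it exceeds the
    surface value by `threshold_k`, or None if the whole profile is
    well mixed. Simplified illustration only.
    """
    GAMMA_D = 0.0098  # dry adiabatic lapse rate, K per meter
    theta_surface = temp_c[0] + GAMMA_D * alt_m[0]
    for z, t in zip(alt_m, temp_c):
        if (t + GAMMA_D * z) - theta_surface > threshold_k:
            return z
    return None
```

Given a sounding whose temperature falls at the dry adiabatic rate up to the layer top and is isothermal above, the function flags the first level above the top; repeating the vertical profile through the day, as the team describes, maps how that height evolves with the burn.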
In August 2024, a team of NASA researchers used the NASA Langley Alta X and weather instruments in Missoula, Montana, for a FireSense project drone technology demonstration. These instruments were used to generate localized forecasting that provides precise and sustainable meteorological data to predict fire behavior and smoke impacts.
Justin Link, left, pilot for small uncrewed aircraft systems, and Justin Hall, chief pilot for small uncrewed aircraft systems, install weather instruments on an Alta X drone at NASA’s Armstrong Flight Research Center in Edwards, California. Members of the center’s Dale Reed Subscale Flight Research Laboratory used the Alta X to support the agency’s FireSense project in March 2025 for a prescribed burn in Geneva State Forest, which is about 100 miles south of Montgomery, Alabama. NASA/Steve Freeman
Last Updated Apr 03, 2025. Editor: Dede Dinius. Contact: Jay Levine (jay.levine-1@nasa.gov). Location: Armstrong Flight Research Center.