  • Similar Topics

    • By NASA
      Landing on the Moon is not easy, particularly when a crew or spacecraft must meet exacting requirements. For Artemis missions to the lunar surface, those requirements include the ability to land within an area about as wide as a football field, in any lighting conditions, amid challenging terrain.

      NASA’s official lunar landing requirement is to land within 50 meters (164 feet) of the targeted site, so developing precision landing tools and technologies is critically important to mission success.

      NASA engineers recently took a major step toward safe and precise landings on the Moon – and eventually Mars and icy worlds – with a successful field test of hazard detection technology at NASA’s Kennedy Space Center Shuttle Landing Facility in Florida.

      A joint team from the Aeroscience and Flight Mechanics Division at NASA’s Johnson Space Center in Houston and Goddard Space Flight Center in Greenbelt, Maryland, achieved this milestone in tests of the Goddard Hazard Detection Lidar from a helicopter at Kennedy in March 2025.

      NASA’s Hazard Detection Lidar field test team at Kennedy Space Center’s Shuttle Landing Facility in Florida in March 2025. (Credit: NASA)
      The new lidar system is one of several sensors being developed as part of NASA’s Safe & Precise Landing – Integrated Capabilities Evolution (SPLICE) Program, a Johnson-managed cross-agency initiative under the Space Technology Mission Directorate to develop next-generation landing technologies for planetary exploration. SPLICE is an integrated descent and landing system composed of avionics, sensors, and algorithms that support specialized navigation, guidance, and image processing techniques. It is designed to enable landing in hard-to-reach and unknown areas of potentially high scientific interest.

      The lidar system, which can map an area equivalent to two football fields in just two seconds, is a crucial program component. Compensating for lander motion in real time, it processes 15 million short pulses of laser light to rapidly scan surfaces and build 3D maps of landing sites that support precision landing and hazard avoidance.
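
      As a rough illustration of this kind of processing (not the Goddard or SPLICE flight software; the grid size and hazard thresholds below are invented), the short Python sketch bins a lidar point cloud into an elevation grid and flags cells whose slope or roughness would make them unsafe:

      import numpy as np

      def hazard_map(points, cell_size=1.0, max_slope_deg=10.0, max_roughness_m=0.3):
          """points: (N, 3) array of x, y, z lidar returns in meters."""
          x, y, z = points[:, 0], points[:, 1], points[:, 2]
          ix = np.floor((x - x.min()) / cell_size).astype(int)
          iy = np.floor((y - y.min()) / cell_size).astype(int)
          nx, ny = ix.max() + 1, iy.max() + 1

          dem = np.full((nx, ny), np.nan)      # mean surface height per cell
          rough = np.full((nx, ny), np.nan)    # height scatter (roughness) per cell
          for i in range(nx):
              for j in range(ny):
                  zc = z[(ix == i) & (iy == j)]
                  if zc.size:
                      dem[i, j] = zc.mean()
                      rough[i, j] = zc.std()

          # Slope from finite differences of neighboring cell heights.
          dzdx, dzdy = np.gradient(dem, cell_size)
          slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

          # Cells with no returns stay NaN here; a flight system would treat them as unsafe.
          hazardous = (slope_deg > max_slope_deg) | (rough > max_roughness_m)
          return dem, slope_deg, rough, hazardous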

      Those maps will be read by the SPLICE Descent and Landing Computer, a high-performance multicore processing unit that analyzes all SPLICE sensor data, determines the spacecraft’s velocity and altitude, identifies terrain hazards, and selects a safe landing location. The computer was developed by the Avionics Systems Division at Johnson as a platform to test navigation, guidance, and flight software, and it previously flew on Blue Origin’s New Shepard booster rocket.
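
      Continuing the illustrative sketch above (again, not the actual flight software), one simple way to turn such a hazard grid into a landing target is to pick the safe cell with the greatest clearance from any flagged hazard:

      import numpy as np

      def select_landing_site(hazardous, cell_size=1.0):
          """hazardous: 2D boolean grid (True = slope or roughness hazard)."""
          safe_cells = np.argwhere(~hazardous)
          hazard_cells = np.argwhere(hazardous)
          if safe_cells.size == 0:
              raise ValueError("no safe cells in the map")
          if hazard_cells.size == 0:           # nothing flagged: aim for the grid center
              return hazardous.shape[0] // 2, hazardous.shape[1] // 2, float("inf")

          best, best_clearance = None, -1.0
          for cell in safe_cells:
              # Clearance = distance to the nearest flagged cell, in meters.
              d = np.min(np.linalg.norm(hazard_cells - cell, axis=1)) * cell_size
              if d > best_clearance:
                  best, best_clearance = cell, d
          return int(best[0]), int(best[1]), best_clearance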

      The NASA team prepares the Descent and Landing Computer for Hazard Detection Lidar field testing at Kennedy Space Center. (Credit: NASA)
      For the field test at Kennedy, Johnson led test operations and provided avionics and guidance, navigation, and control support. Engineers updated the computer’s firmware and software to support command and data interfacing with the lidar system. Team members from Johnson’s Flight Mechanics branch designed a simplified motion compensation algorithm, and NASA’s Jet Propulsion Laboratory in Southern California contributed a hazard detection algorithm; both were added to the lidar software by Goddard. NASA contractors Draper Laboratories and Jacobs Engineering also played key roles in the test’s success.

      Primary flight test objectives were achieved on the first day of testing, allowing the lidar team time to explore different settings and firmware updates to improve system performance. The data confirmed the sensor’s capability in a challenging, vibration-heavy environment, producing usable maps. Preliminary review of the recorded sensor data shows excellent reconstruction of the hazard field terrain.

      A Hazard Detection Lidar scan of a simulated hazard field at Kennedy Space Center (left) and a combined 3D map identifying roughness and slope hazards. (Credit: NASA)
      Beyond lunar applications, SPLICE technologies are being considered for use on Mars Sample Return, the Europa Lander, Commercial Lunar Payload Services flights, and Gateway. The Descent and Landing Computer design is also being evaluated for potential avionics upgrades on Artemis systems.

      Additionally, SPLICE is supporting software tests for the Advancement of Geometric Methods for Active Terrain Relative Navigation (ATRN) Center Innovation Fund project, which is also part of Johnson’s Aeroscience and Flight Mechanics Division. The ATRN project is working to develop algorithms and software that can use data from any active sensor – one that measures signals reflected, refracted, or scattered by a body’s surface or its atmosphere – to accurately map terrain and provide absolute and relative location information. With this type of system in place, spacecraft will not need external lighting sources to find landing sites.
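
      A toy Python example of the terrain-matching idea (the function and variable names are hypothetical, and this is not the ATRN team’s algorithm): slide a small sensed elevation patch across a stored reference map and keep the best mean-removed fit. Because the comparison uses elevation alone, no external lighting enters the problem:

      import numpy as np

      def locate_patch(reference_dem, sensed_patch):
          """Find the (row, col) offset in reference_dem where sensed_patch fits best."""
          R, C = reference_dem.shape
          r, c = sensed_patch.shape
          best_offset, best_err = None, np.inf
          centered_patch = sensed_patch - sensed_patch.mean()
          for i in range(R - r + 1):
              for j in range(C - c + 1):
                  window = reference_dem[i:i + r, j:j + c]
                  # Remove the mean heights so an altitude bias does not penalize the match.
                  err = np.mean((window - window.mean() - centered_patch) ** 2)
                  if err < best_err:
                      best_offset, best_err = (i, j), err
          return best_offset, best_err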

      With additional suborbital flight tests planned through 2026, the SPLICE team is laying the groundwork for safer, more autonomous landings on the Moon, Mars, and beyond. As NASA prepares for its next era of exploration, SPLICE will be a key part of the agency’s evolving landing, guidance, and navigation capabilities.
      View the full article
    • By NASA
      Intuitive Machines’ IM-2 lander captured this image on March 6, 2025, after landing in a crater near the Moon’s South Pole. The lunar lander is on its side near the intended landing site, Mons Mouton. In the center of the image, between the two lander legs, is the Polar Resources Ice Mining Experiment 1 suite with its drill deployed. (Credit: Intuitive Machines)
      NASA’s PRIME-1 (Polar Resources Ice Mining Experiment 1) mission was designed to demonstrate technologies to help scientists better understand lunar resources ahead of crewed Artemis missions to the Moon. During its short-lived mission on the Moon, the performance of PRIME-1’s technology gave NASA teams reason to celebrate.
      “The PRIME-1 mission proved that our hardware works in the harshest environment we’ve ever tested it in,” said Janine Captain, PRIME-1 co-principal investigator and research chemist at NASA’s Kennedy Space Center in Florida. “While it may not have gone exactly to plan, this is a huge step forward as we prepare to send astronauts back to the Moon and build a sustainable future there.” 
      Intuitive Machines’ IM-2 mission launched to the Moon on Feb. 26, 2025, from NASA Kennedy’s Launch Complex 39A, as part of the company’s second Moon delivery for NASA under the agency’s CLPS (Commercial Lunar Payload Services) initiative and Artemis campaign. The IM-2 Nova-C lunar lander, named Athena, carried PRIME-1 and its suite of two instruments: a drill known as TRIDENT (The Regolith and Ice Drill for Exploring New Terrain), designed to bring lunar soil to the surface; and a mass spectrometer, Mass Spectrometer Observing Lunar Operations (MSOLO), to study TRIDENT’s drill cuttings for the presence of gases that could one day help provide propellant or breathable oxygen to future Artemis explorers.  
      The IM-2 mission touched down on the lunar surface on March 6, about 1,300 feet (400 meters) from its intended landing site on Mons Mouton, a lunar plateau near the Moon’s South Pole. The Athena lander came to rest on its side inside a crater, preventing it from recharging its solar cells and bringing the mission to an early end.
      “We were supposed to have 10 days of operation on the Moon, and what we got was closer to 10 hours,” said Julie Kleinhenz, NASA’s lead systems engineer for PRIME-1, as well as the in-situ resource utilization system capability lead deputy for the agency. “It was 10 hours more than most people get so I am thrilled to have been a part of it.” 
      Kleinhenz has spent nearly 20 years working on how to use lunar resources for sustained operations. In-situ resource utilization harnesses local natural resources at mission destinations. This enables fewer launches and resupply missions and significantly reduces the mass, cost, and risk of space exploration. With NASA poised to send humans back to the Moon and on to Mars, generating products for life support, propellants, construction, and energy from local materials will become increasingly important to future mission success.  
      “In-situ resource utilization is the key to unlocking long-term exploration, and PRIME-1 is helping us lay this foundation for future travelers,” Captain said.
      The PRIME-1 technology also set out to answer questions about the properties of lunar regolith, such as soil strength. This data could help inform the design of in-situ resource utilization systems that would use local resources to create everything from landing pads to rocket fuel during Artemis and later missions.  
      “Once we got to the lunar surface, TRIDENT and MSOLO both started right up and performed perfectly. From a technology demonstration standpoint, 100% of the instruments worked,” Kleinhenz said.
      The lightweight, low-power augering drill built by Honeybee Robotics, known as TRIDENT, is 1 meter long and features rotary and percussive actuators that convert energy into the force needed to drill. The drill was designed to stop at any depth as commanded from the ground and deposit its sample on the surface for analysis by MSOLO, a commercial off-the-shelf mass spectrometer modified by engineers and technicians at NASA Kennedy to withstand the harsh lunar environment. Designed to measure the composition of gases in the vicinity of the lunar lander, both from the lander and from the ambient exosphere, MSOLO can help NASA analyze the chemical makeup of the lunar soil and study water on the surface of the Moon.  
      Once on the Moon, the actuators on the drill performed as designed, completing multiple stages of movement necessary to drill into the lunar surface. Prompted by commands from technicians on Earth, the auger rotated, the drill extended to its full range, the percussion system performed a hammering motion, and the PRIME-1 team turned on an embedded core heater in the drill and used internal thermal sensors to monitor the temperature change.
      While MSOLO was able to perform several scans to detect gases, researchers believe from the initial data that the gases detected were all anthropogenic, or human in origin, such as gases vented from spacecraft propellants and traces of Earth water. Data from PRIME-1 accounted for some of the approximately 7.5 gigabytes of data collected during the IM-2 mission, and researchers will continue to analyze the data in the coming months and publish the results.
      View the full article
    • By NASA
      Multinational corporations are using the M2M Intelligence platform in data centers and other settings. The system offers automated, secure communications on a ground-based global 5G network. (Credit: Getty Images)
      Artificial intelligence (AI) is advancing rapidly, as intelligent software proves capable of an ever-wider range of tasks. The technology usually requires a “human in the loop” to train it and ensure accuracy. But long before the arrival of today’s generative artificial intelligence, a different kind of AI was born with the help of NASA’s Ames Research Center in California’s Silicon Valley — one that exists only between machines, running without any human intervention.

      In 2006, Geoffrey Barnard founded Machine-to-Machine Intelligence Corp. (M2Mi) at Ames’ NASA Research Park, envisioning an automated, satellite-based communication network. NASA Ames established a Space Act Agreement with the company to develop artificial intelligence that would automate communications, privacy, security, and resiliency between satellites and ground-based computers.

      Central to the technology was automating a problem-solving approach known as root cause analysis, which NASA has honed over decades. This methodology seeks to identify not only the immediate cause of a problem but also all the factors that contributed to the cause. This would allow a network to identify its own issues and fix itself. 

      NASA Ames’ director of nanotechnology at the time wanted to develop a communications network based on small, low-powered satellites, so Ames supported M2Mi in developing the necessary technology. 
      Barnard, now CEO and chief technology officer of the Tiburon, California-based M2Mi, said NASA’s support laid the foundation for his company, which employs the same technology in a ground-based network.
      The company’s M2M Intelligence software performs secure, resilient, automated communications on a system that runs across hundreds of networks, connecting thousands of devices, many of which were not built to communicate with each other. The M2Mi company worked with Vodafone of Berkshire, England, to build a worldwide network across more than 500 smaller networks in over 190 countries. The companies M2M Wireless and TriGlobal have begun using M2M Intelligence for transportation logistics. 
      With NASA’s help, emerging industries are getting the boost they need to rapidly develop technologies to enhance our lives. 
      View the full article
    • By NASA
      NASA 3D Wind Measuring Laser Aims to Improve Forecasts from Air, Space
      3D wind measurements from NASA's Aerosol Wind Profiler instrument flying on board a specially mounted aircraft along the East Coast of the U.S. and across the Great Lakes region on Oct. 15, 2024. (Credits: NASA/Scientific Visualization Studio)
      Since last fall, NASA scientists have flown an advanced 3D Doppler wind lidar instrument across the United States to collect nearly 100 hours of data — including a flight through a hurricane. The goal? To demonstrate the unique capability of the Aerosol Wind Profiler (AWP) instrument to gather extremely precise measurements of wind direction, wind speed, and aerosol concentration – all crucial elements for accurate weather forecasting.
      Weather phenomena like severe thunderstorms and hurricanes develop rapidly, so improving predictions requires more accurate wind observations.
      “There is a lack of global wind measurements above Earth’s surface,” explained Kris Bedka, the AWP principal investigator at NASA’s Langley Research Center in Hampton, Virginia. “Winds are measured by commercial aircraft as they fly to their destinations and by weather balloons launched up to twice per day from just 1,300 sites across the globe. From space, winds are estimated by tracking cloud and water vapor movement from satellite images.”
      However, in areas without clouds or where water vapor patterns cannot be easily tracked, there are typically no reliable wind measurements. The AWP instrument seeks to fill these gaps with detailed 3D wind profiles.
      The AWP instrument (foreground) and HALO instrument (background) were integrated onto the floorboard of NASA’s G-III aircraft. Kris Bedka, project principal investigator, sitting in the rear of the plane, monitored the data during a flight on Sept. 26, 2024. (Credit: NASA/Maurice Cross)
      Mounted to an aircraft with viewing ports underneath it, AWP emits 200 laser energy pulses per second that scatter and reflect off aerosol particles — such as pollution, dust, smoke, sea salt, and clouds — in the air. Aerosol and cloud particle movement causes the laser pulse wavelength to change, a phenomenon known as the Doppler effect.
      The AWP instrument sends these pulses in two directions, oriented 90 degrees apart from each other. Combined, they create a 3D profile of wind vectors, representing both wind speed and direction.
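      As a hedged sketch of that geometry (the wavelength and beam tilt below are placeholder values, not published AWP parameters), the following Python converts a Doppler frequency shift into a line-of-sight velocity and then combines two line-of-sight measurements taken 90 degrees apart in azimuth into a horizontal wind speed and direction, neglecting the vertical wind term:

      import numpy as np

      WAVELENGTH_M = 1.6e-6      # placeholder lidar wavelength, not the real AWP value
      OFF_NADIR_DEG = 30.0       # placeholder beam tilt from straight down

      def los_velocity(doppler_shift_hz):
          """Line-of-sight velocity implied by the Doppler shift of the backscattered pulse."""
          return 0.5 * WAVELENGTH_M * doppler_shift_hz   # factor 1/2 for the two-way path

      def horizontal_wind(v_los_az0, v_los_az90, off_nadir_deg=OFF_NADIR_DEG):
          """Recover east/north wind from two LOS velocities at azimuths 0 and 90 degrees.

          Model (vertical wind neglected):
          v_los = u_east*sin(tilt)*sin(az) + v_north*sin(tilt)*cos(az)
          """
          s = np.sin(np.radians(off_nadir_deg))
          v_north = v_los_az0 / s     # azimuth 0 deg projects onto the north component
          u_east = v_los_az90 / s     # azimuth 90 deg projects onto the east component
          speed = np.hypot(u_east, v_north)
          heading_deg = (np.degrees(np.arctan2(u_east, v_north)) + 360.0) % 360.0
          return u_east, v_north, speed, heading_deg   # heading = direction wind blows toward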
      “The Aerosol Wind Profiler is able to measure wind speed and direction, but not just at one given point,” Bedka said. “Instead, we are measuring winds at different altitudes in the atmosphere simultaneously with extremely high detail and accuracy.”
      Wind vectors help researchers and meteorologists understand the weather, so AWP’s measurements could significantly advance weather modeling and forecasting. For this reason, the instrument was chosen to be part of the National Oceanic and Atmospheric Administration’s (NOAA) Joint Venture Program, which seeks data from new technologies that can fill gaps in current weather forecasting systems. NASA’s Weather Program also saw mutual benefit in NOAA’s investments and provided additional support to increase the return on investment for both agencies.
      On board NASA’s Gulfstream III (G-III) aircraft, AWP was paired with the agency’s High-Altitude Lidar Observatory (HALO) that measures water vapor, aerosols, and cloud properties through a combined differential absorption and high spectral resolution lidar.
      Working together for the first time, AWP measured winds, HALO collected water vapor and aerosol data, and NOAA dropsondes (small instruments dropped from a tube in the bottom of the aircraft) gathered temperature, water vapor, and wind data.
      The AWP and HALO instrument teams observing incoming data on board NASA’s G-III aircraft over Tennessee while heading south to observe Hurricane Helene, Sept. 26, 2024. (Credit: NASA/Maurice Cross)
      “With our instrument package on board small, affordable-to-operate aircraft, we have a very powerful capability,” said Bedka. “The combination of AWP and HALO is NASA’s next-generation airborne weather remote sensing package, which we hope to also fly aboard satellites to benefit everyone across the globe.”
      The animation below, based on AWP data, shows the complexity and structure of aerosol layers present in the atmosphere. Current prediction models do not accurately simulate how aerosols are organized throughout the breadth of the atmosphere, said Bedka.
      This visualization shows AWP 3D measurements gathered on Oct. 15, 2024, as NASA’s G-III aircraft flew along the East Coast of the U.S. and across the Great Lakes region. Laser light that returns to AWP as backscatter from aerosol particles and clouds allows for measurement of wind direction, speed, and aerosol concentration, as seen in the separation of data layers. (Credit: NASA/Scientific Visualization Studio)
      “When we took off on this particular day, I thought that we would be finding a clear atmosphere with little to no aerosol return because we were flying into what was the first real blast of cool Canadian air of the fall,” described Bedka. “What we found was quite the opposite: an aerosol-rich environment which provided excellent signal to accurately measure winds.”
      During the Joint Venture flights, Hurricane Helene was making landfall in Florida. The AWP crew of two pilots and five science team members quickly created a flight plan to gather wind measurements along the outer bands of the severe storm.
      This video shows monitors tracking the AWP science team’s location in the western outer bands of Hurricane Helene off the coast of Florida, with views outside of the aircraft looking at turbulent storm clouds, on Sept. 26, 2024. (Credit: NASA/Kris Bedka)
      “A 3D wind profile can significantly improve weather forecasts, particularly for storms and hurricanes,” said Harshesh Patel, NOAA’s acting Joint Venture Program manager. “NASA Langley specializes in the development of coherent Doppler wind lidar technology, and this AWP concept has potential to provide better performance for NOAA’s needs.”
      The flight plan of NASA’s G-III aircraft – outfitted with the Aerosol Wind Profiler – as it gathered data across the Southeastern U.S. and flew through portions of Hurricane Helene on Sept. 26, 2024. The flight plan is overlaid atop a NOAA Geostationary Operational Environmental Satellite-16 (GOES) satellite image from that day. (Credit: NASA/John Cooney)
      The flights of the AWP lidar are serving as a proving ground for possible integration into a future satellite mission.
      “The need to improve global 3D wind models requires a space-based platform,” added Patel. “Instruments like AWP have specific space-based applications that potentially align with NOAA’s mission to provide critical data for improving weather forecasting.”
      A view of the outer bands of Hurricane Helene off the coast of Florida during NASA’s science flights demonstrating the Aerosol Wind Profiler instrument on Sept. 26, 2024. (Credit: NASA/Maurice Cross)
      After the NOAA flights, AWP and HALO were sent to central California for the Westcoast & Heartland Hyperspectral Microwave Sensor Intensive Experiment and the Active Passive profiling Experiment, which was supported by NASA’s Planetary Boundary Layer Decadal Survey Incubation Program and NASA Weather Programs. These missions studied atmospheric processes within the planetary boundary layer, the lowest part of the atmosphere, which drives the weather conditions we experience on the ground.
      To learn more about lidar instruments at NASA visit:
      NASA Langley Research Center: Generations of Lidar Expertise
      About the Author
      Charles G. Hatfield
      Science Public Affairs Officer, NASA Langley Research Center
      View the full article
    • By USH
      Shape-shifting materials are advanced, adaptive materials capable of changing their physical form, embedding sensors and circuits directly into their structure, and even storing energy, all without traditional wiring. Lockheed Martin is at the forefront of developing these futuristic materials, raising questions about the possible extraterrestrial origin of this technology.

      In a previous article, we discussed why suppressed exotic technologies are suddenly being disclosed. One company that frequently comes up in this conversation is Lockheed Martin, the American defense and aerospace giant known for pushing the boundaries of aviation and space innovation. 
      Imagine an aircraft that can grow its own skin, embed sensors into its body, store energy without wires, and even shift its shape mid-flight to adapt to changing conditions. This isn’t science fiction anymore: Lockheed Martin’s cutting-edge research is turning these futuristic concepts into reality.
      But where is all this coming from? 
      The rapid development and creativity behind Lockheed Martin’s projects raise intriguing questions. Whistleblowers like David Grusch have recently alleged that Lockheed Martin has had access to recovered UFO materials for decades. Supporting this, Don Phillips, a former Lockheed engineer, stated years ago that exotic materials have been held and studied by the company since at least the 1950s.
      This suggests that for over half a century, Lockheed has secretly been engaged in researching and reverse-engineering off-world technologies. It's possible that the breakthroughs we’re seeing today are the result of this hidden legacy. Ben Rich, former head of Lockheed’s Skunk Works division, famously hinted at this when he said, "We now have the technology to take ET home." 
      One particularly stunning development involves "smart" materials that behave almost like muscles, allowing aircraft structures to morph in real-time. These materials enable a craft to fine-tune its aerodynamics on the fly, adjusting instantly to turbulence, speed shifts, or mission-specific demands. 
      Lockheed’s innovations go even further. By embedding carbon nanotubes, extremely strong and highly conductive microscopic structures, directly into the material, they have created surfaces that can transfer information and power without traditional wiring. In these next-generation aircraft, the "skin" itself acts as the nervous system, the energy grid, and the sensor network all at once.
      You can only imagine the kinds of technologies that have been developed over the years through the reverse engineering of exotic materials and recovered extraterrestrial craft. Yet, governments and space agencies remain tight-lipped about the existence of advanced alien civilizations, who likely introduced these techniques to Earth unintentionally.
        View the full article