  • Similar Topics

    • By NASA
      While on tour at NASA’s Glenn Research Center in Cleveland on Monday, June 23, 2025, University Student Design Challenge winners from The Ohio State University stop to hear engineer Nancy Hall, center, discuss different parts of a sealed vessel used in research and development activities focused on nanotechnology and nanomaterials. Credit: NASA/Jef Janis 
        A student team from The Ohio State University secured first place in NASA Glenn Research Center’s 2025-2026 University Student Design Challenge for their innovative design aimed at managing fluids in space. The team will develop a working prototype as part of their senior capstone project during the upcoming academic year. 
      On June 23, the team visited NASA Glenn in Cleveland to present their winning designs to center leadership and tour the Zero Gravity Research Facility, where their design could undergo future testing. The challenge encourages college students to develop innovative approaches to NASA mission needs, featuring both aeronautics and space-themed projects.  
University Student Design Challenge winners from The Ohio State University gather at the top of the Zero Gravity Drop Tower at NASA’s Glenn Research Center in Cleveland on Monday, June 23, 2025. Credit: NASA/Jef Janis
      NASA Glenn engineers Nancy Hall and John McQuillan served as student mentors and technical advisors for the USDC SPACE I design challenge.
      To learn more, explore NASA’s STEM opportunities.  

Return to Newsletter
      View the full article
    • By USH
      In March 2025, a perfectly smooth metallic sphere crashed near the city of Buga, Colombia, setting in motion a chain of revelations that could rewrite the story of human history. Weighing just 4.5 pounds, the object has no visible seams, joints, or welds. It remains icy cold to the touch and shows no sign of conventional propulsion or manufacturing methods known to science. 
      Buga Sphere
      Its surface is etched with intricate markings eerily similar to symbols from ancient Mesopotamia, as well as other civilizations separated by oceans and thousands of years. AI-assisted analysis suggests the glyphs carry profound themes—unity, transformation, and the origins of consciousness, concepts that cannot easily be reconciled within the framework of standard physics. 
      Advanced scans have revealed hidden internal structures and an unusually dense core. Even more unsettling, researchers have detected the sphere emitting very low frequency (VLF) and low frequency (LF) radio waves—signals capable of traveling hundreds of kilometers over terrain and far beyond the horizon, often used in navigation, communications, and precise timing synchronization. 
      Whispers are now spreading about the discovery of a second, even older sphere, quietly stored in a forgotten museum collection. Meanwhile, the glyphs on the Buga sphere appear to be slowly evolving, forming what some believe are coordinates pointing toward remote and mysterious sites: deep within the Amazon, along the shores of Lake Titicaca, and in the highlands of Peru. 
This raises a question: is it an elaborate hoax, or are these spheres fragments of a hidden planetary network? And if so, what happens when it awakens?
        View the full article
    • By NASA
NASA named Stanford University of California the winner of the Lunar Autonomy Challenge, a six-month competition in which U.S. college and university student teams virtually mapped and explored a simulated lunar environment using a digital twin of NASA’s In-Situ Resource Utilization Pilot Excavator (IPEx).
      The winning team successfully demonstrated the design and functionality of their autonomous agent, or software that performs specified actions without human intervention. Their agent autonomously navigated the IPEx digital twin in the virtual lunar environment, while accurately mapping the surface, correctly identifying obstacles, and effectively managing available power.
Lunar simulation developed by the Lunar Autonomy Challenge’s first-place team from Stanford University. Credit: Stanford University’s NAV Lab team
      Team photo of the NAV Lab Lunar Autonomy Challenge team from Stanford University. Credit: Stanford University’s NAV Lab team
      “The Lunar Autonomy Challenge has been a truly unique experience. The challenge provided the opportunity to develop and test methods in a highly realistic simulation environment.”
      Adam Dai
      Lunar Autonomy Challenge team lead, Stanford University

      Dai added, “It pushed us to find solutions robust to the harsh conditions of the lunar surface. I learned so much through the challenge, both about new ideas and methods, as well as through deepening my understanding of core methods across the autonomy stack (perception, localization, mapping, planning). I also very much enjoyed working together with my team to brainstorm different approaches and strategies and solve tangible problems observed in the simulation.” 
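The autonomy stack Dai mentions (perception, localization, mapping, planning) can be illustrated with a toy control loop. The sketch below is a minimal caricature assuming a simple grid world and a hypothetical rover model; it is not the Stanford team's actual software.

```python
# Schematic perceive -> map -> plan -> act loop for a rover-like agent.
# All classes and data here are hypothetical stand-ins.
from dataclasses import dataclass, field

@dataclass
class Rover:
    x: int = 0
    y: int = 0
    battery: float = 100.0
    grid: dict = field(default_factory=dict)  # mapped cells -> "free"/"obstacle"

def perceive(world, rover):
    """Sense the cell directly ahead of the rover (toy sweep along x)."""
    ahead = (rover.x + 1, rover.y)
    return ahead, world.get(ahead, "free")

def plan(rover, label):
    """Advance when the path is free; sidestep obstacles; halt on low power."""
    if rover.battery < 10.0:
        return "halt"
    return "sidestep" if label == "obstacle" else "advance"

def step(world, rover):
    cell, label = perceive(world, rover)   # perception
    rover.grid[cell] = label               # mapping
    action = plan(rover, label)            # planning
    if action == "advance":
        rover.x += 1
    elif action == "sidestep":
        rover.y += 1
    rover.battery -= 1.0                   # power management
    return action

# Toy world with a single obstacle at (3, 0).
world = {(3, 0): "obstacle"}
rover = Rover()
for _ in range(5):
    step(world, rover)
print(rover.x, rover.y, len(rover.grid))
```

After five steps the rover has advanced four cells, sidestepped once around the obstacle, and mapped five cells; real stacks replace each of these stubs with perception, SLAM, and planning subsystems.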
      The challenge offered 31 teams a valuable opportunity to gain experience in software development, autonomy, and machine learning using cutting-edge NASA lunar technology. Participants also applied essential skills common to nearly every engineering discipline, including technical writing, collaborative teamwork, and project management.
      The Lunar Autonomy Challenge supports NASA’s Lunar Surface Innovation Initiative (LSII), which is part of the Space Technology Mission Directorate. The LSII aims to accelerate technology development and pursue results that will provide essential infrastructure for lunar exploration by collaborating with industry, academia, and other government agencies.
“The work displayed by all of these teams has been impressive, and the solutions they have developed are beneficial to advancing lunar and Mars surface technologies as we prepare for increasingly complex missions farther from home.”
      Niki Werkheiser
      Director of Technology Maturation and LSII lead, NASA Headquarters
      “To succeed, we need input from everyone — every idea counts to propel our goals forward. It is very rewarding to see these students and software developers contributing their skills to future lunar and Mars missions,” Werkheiser added.  
      Through the Lunar Autonomy Challenge, NASA collaborated with the Johns Hopkins Applied Physics Laboratory, Caterpillar Inc., and Embodied AI. Each team contributed unique expertise and tools necessary to make the challenge a success.
The Applied Physics Laboratory managed the challenge for NASA. As a systems integrator for LSII, the laboratory provided engineering rigor and discipline across efforts, helping ensure the development of successful, efficient, and cost-effective missions, backed by the world’s largest cohort of lunar scientists.
Caterpillar Inc. is known for its construction and excavation equipment and operates a large fleet of autonomous haul trucks. The company has also worked with NASA for more than 20 years on a variety of technologies, including autonomy, 3D printing, robotics, and simulators, and continues to collaborate with NASA on technologies that support the agency’s mission objectives and provide value to the mining and construction industries.
Embodied AI collaborated with Caterpillar to integrate the simulation into the open-source CARLA driving environment used for the challenge. For the Lunar Autonomy Challenge, the normally available digital assets of the CARLA simulation platform, such as urban layouts, buildings, and vehicles, were replaced by an IPEx digital twin and lunar environmental models.
      “This collaboration is a great example of how the government, large companies, small businesses, and research institutions can thoughtfully leverage each other’s different, but complementary, strengths,” Werkheiser added. “By substantially modernizing existing tools, we can turn today’s novel technologies into tomorrow’s institutional capabilities for more efficient and effective space exploration, while also stimulating innovation and economic growth on Earth.”

      FINALIST TEAMS
      First Place
      NAV Lab team
      Stanford University, Stanford, California


      Second Place
      MAPLE (MIT Autonomous Pathfinding for Lunar Exploration) team
      Massachusetts Institute of Technology, Cambridge, MA


      Third Place
      Moonlight team
      Carnegie Mellon University, Pittsburgh, PA
      OTHER COMPETING TEAMS
Lunar Explorers
      Arizona State University, Tempe, Arizona

      AIWVU
      West Virginia University, Morgantown, West Virginia

      Stellar Sparks
      California Polytechnic Institute Pomona, Pomona, California

      LunatiX
      Johns Hopkins University Whiting School of Engineering, Baltimore

      CARLA CSU
      California State University, Stanislaus, Turlock, California

      Rose-Hulman
      Rose-Hulman Institute of Technology, Terre Haute, Indiana

      Lunar Pathfinders
      American Public University System, Charles Town, West Virginia

      Lunar Autonomy Challenge digital simulation of lunar surface activity using a digital twin of NASA’s ISRU Pilot Excavator. Credit: Johns Hopkins Applied Physics Laboratory
      View the full article
    • By NASA
      4 min read
      May’s Night Sky Notes: How Do We Find Exoplanets?
Astronomers have been trying to discover evidence that worlds exist around stars other than our Sun since the 19th century. By the mid-1990s, technology finally caught up with the desire for discovery, leading to the detection of 51 Pegasi b, the first planet found orbiting another Sun-like star. Why did it take so long to discover these distant worlds, and what techniques do astronomers use to find them?
      The Transit Method
A planet passing in front of its parent star creates a drop in the star’s apparent brightness, called a transit. Exoplanet Watch participants can look for transits in data from ground-based telescopes, helping scientists refine measurements of the length of a planet’s orbit around its star. Credit: NASA’s Ames Research Center
      One of the most famous exoplanet detection methods is the transit method, used by Kepler and other observatories. When a planet crosses in front of its host star, the light from the star dips slightly in brightness. Scientists can confirm a planet orbits its host star by repeatedly detecting these incredibly tiny dips in brightness using sensitive instruments. If you can imagine trying to detect the dip in light from a massive searchlight when an ant crosses in front of it, at a distance of tens of miles away, you can begin to see how difficult it can be to spot a planet from light-years away! Another drawback to the transit method is that the distant solar system must be at a favorable angle to our point of view here on Earth – if the distant system’s angle is just slightly askew, there will be no transits. Even in our solar system, a transit is very rare. For example, there were two transits of Venus visible across our Sun from Earth in this century. But the next time Venus transits the Sun as seen from Earth will be in the year 2117 – more than a century from the 2012 transit, even though Venus will have completed nearly 150 orbits around the Sun by then!
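The tininess of these dips follows from simple geometry: the fractional drop in flux is roughly the square of the planet-to-star radius ratio. A minimal sketch of that arithmetic, plus a box-shaped dip search over a toy light curve (all numbers illustrative, not from a real observation):

```python
# Transit depth ~ (Rp / Rs)**2, plus a crude dip detection on fake data.
R_SUN = 696_000.0   # solar radius, km
R_JUP = 71_492.0    # Jupiter equatorial radius, km

def transit_depth(r_planet_km, r_star_km):
    """Fractional drop in stellar flux when the planet fully transits."""
    return (r_planet_km / r_star_km) ** 2

# A Jupiter-sized planet crossing a Sun-like star dims it by only ~1%.
depth = transit_depth(R_JUP, R_SUN)
print(f"Transit depth: {depth:.4%}")

# Toy light curve: flat flux with a box-shaped dip during the transit.
flux = [1.0] * 100
for i in range(40, 50):
    flux[i] = 1.0 - depth

# Flag samples well below the out-of-transit level.
in_transit = [i for i, f in enumerate(flux) if f < 1.0 - 0.5 * depth]
print(f"Transit spans samples {in_transit[0]}..{in_transit[-1]}")
```

An Earth-sized planet gives a depth of only about 0.008%, which is why space-based photometry was needed for most transit discoveries.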
      The Wobble Method
As a planet orbits a star, the star wobbles. This causes a change in the appearance of the star’s spectrum called Doppler shift. Because the change in wavelength is directly related to relative speed, astronomers can use Doppler shift to calculate exactly how fast an object is moving toward or away from us. Astronomers can also track the Doppler shift of a star over time to estimate the mass of the planet orbiting it. Credit: NASA, ESA, CSA, Leah Hustak (STScI)
      Spotting the Doppler shift of a star’s spectrum is how 51 Pegasi b, the first planet detected around a Sun-like star, was found. This technique is called the radial velocity or “wobble” method. Astronomers split the visible light emitted by a star into a rainbow. These spectra, and the gaps between the normally smooth bands of light, help determine the elements that make up the star. If there is a planet orbiting the star, it causes the star to wobble ever so slightly back and forth. This wobble, in turn, causes the lines within the spectra to shift ever so slightly toward the blue and red ends of the spectrum as the star moves toward and away from us. By carefully measuring the amount of shift in the star’s spectra, astronomers can determine the size of the object pulling on the host star and whether the companion is indeed a planet. By tracking the variation in this periodic shift of the spectra, they can also determine the time it takes the planet to orbit its parent star.
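The wavelength shift described above maps directly to a line-of-sight speed through the non-relativistic Doppler relation v = c · Δλ/λ. A short sketch: the H-alpha rest wavelength below is real, but the Jupiter-like wobble amplitude is an illustrative value, not a measured one.

```python
# Radial velocity from a Doppler-shifted spectral line.
C = 299_792.458  # speed of light, km/s

def radial_velocity(lambda_obs_nm, lambda_rest_nm):
    """v = c * (lambda_obs - lambda_rest) / lambda_rest (km/s).
    Positive v means the star is receding (line shifted to the red)."""
    return C * (lambda_obs_nm - lambda_rest_nm) / lambda_rest_nm

H_ALPHA = 656.281  # nm, rest wavelength of the hydrogen H-alpha line

# A Jupiter-like companion tugs a Sun-like star by only ~0.0125 km/s,
# shifting H-alpha by a few hundred-thousandths of a nanometer.
shift_nm = H_ALPHA * (0.0125 / C)
v = radial_velocity(H_ALPHA + shift_nm, H_ALPHA)
print(f"Line shift: {shift_nm:.2e} nm -> {v * 1000:.1f} m/s")
```

Resolving shifts this small is why wobble searches needed high-precision spectrographs, and why it took until the mid-1990s for the method to succeed.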
      Direct Imaging
Finally, exoplanets can be revealed by directly imaging them, such as this image of four planets found orbiting the star HR 8799! Space telescopes use instruments called coronagraphs to block the bright light from the host star and capture the dim light from planets. The Hubble Space Telescope has captured images of giant planets orbiting a few nearby systems, and the James Webb Space Telescope has improved on these observations by uncovering more details, such as the colors, spectra, and temperatures of exoplanet atmospheres, detecting potential exomoons, and even scanning atmospheres for potential biosignatures!
NASA’s James Webb Space Telescope has provided the clearest look in the infrared yet at the iconic multi-planet system HR 8799. The closest planet to the star, HR 8799 e, orbits 1.5 billion miles from its star, which in our solar system would place it between the orbits of Saturn and Neptune. The furthest, HR 8799 b, orbits around 6.3 billion miles from the star, more than twice Neptune’s orbital distance. Colors are applied to filters from Webb’s NIRCam (Near-Infrared Camera), revealing their intrinsic differences. A star symbol marks the location of the host star HR 8799, whose light has been blocked by the coronagraph. In this image, the color blue is assigned to 4.1 micron light, green to 4.3 micron light, and red to 4.6 micron light. Credit: NASA, ESA, CSA, STScI, W. Balmer (JHU), L. Pueyo (STScI), M. Perrin (STScI)
      You can find more information and activities on NASA’s Exoplanets page, such as the Eyes on Exoplanets browser-based program, The Exoplaneteers, and some of the latest exoplanet news. Lastly, you can find more resources in our News & Resources section, including a clever demo on how astronomers use the wobble method to detect planets!
      The future of exoplanet discovery is only just beginning, promising rich rewards in humanity’s understanding of our place in the Universe, where we are from, and if there is life elsewhere in our cosmos.
      Originally posted by Dave Prosper: July 2015
      Last Updated by Kat Troche: April 2025
      View the full article
    • By NASA
      4 min read
      Entrepreneurs Challenge Winner PRISM is Using AI to Enable Insights from Geospatial Data
PRISM’s platform uses AI segmentation to identify and highlight residential structures in a neighborhood.
      NASA sponsored Entrepreneurs Challenge events in 2020, 2021, and 2023 to invite small business start-ups to showcase innovative ideas and technologies with the potential to advance the agency’s science goals. To potentially leverage external funding sources for the development of innovative technologies of interest to NASA, the agency’s Science Mission Directorate (SMD) involved the venture capital community in Entrepreneurs Challenge events. Challenge winners were awarded prize money, and in 2023 the total Entrepreneurs Challenge prize value was $1M. Numerous challenge winners have subsequently refined their products and/or received funding from NASA and external sources (e.g., other government agencies or the venture capital community) to further develop their technologies.
      One 2023 Entrepreneurs Challenge winner, PRISM Intelligence (formerly known as Pegasus Intelligence and Space), is using artificial intelligence (AI) and other advances in computer vision to create a new platform that could provide geospatial insights to a broad community.
      Every day, vast amounts of remote sensing data are collected through satellites, drones, and aerial imagery, but for most businesses and individuals, accessing and extracting meaningful insights from this data is nearly impossible.  
      The company’s product—Personal Real-time Insight from Spatial Maps, a.k.a. PRISM—is transforming geospatial data into an easy-to-navigate, queryable world. By leveraging 3D computer vision, geospatial analytics, and AI-driven insights, PRISM creates photorealistic, up-to-date digital environments that anyone can interact with. Users can simply log in and ask natural-language questions to instantly retrieve insights—no advanced Geographic Information System (GIS) expertise is required.
      For example, a pool cleaner looking for business could use PRISM to search for all residential pools in a five-mile radius. A gardener could identify overgrown trees in a community. City officials could search for potholes in their jurisdiction to prioritize repairs, enhance public safety, and mitigate liability risks. This broad level of accessibility brings geospatial intelligence out of the hands of a few and into everyday decision making.
      The core of PRISM’s platform uses radiance fields to convert raw 2D imagery into high-fidelity, dynamic 3D visualizations. These models are then enhanced with AI-powered segmentation, which autonomously identifies and labels objects in the environment—such as roads, vehicles, buildings, and natural features—allowing for seamless search and analysis. The integration of machine learning enables PRISM to refine its reconstructions continuously, improving precision with each dataset. This advanced processing ensures that the platform remains scalable, efficient, and adaptable to various data sources, making it possible to produce large-scale, real-time digital twins of the physical world.
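The "segment, label, then query" workflow described above can be caricatured in a few lines: segmentation produces labeled objects with positions, and a search simply filters them by label and distance. Everything below is a hypothetical stand-in for illustration, not PRISM's actual data model or API.

```python
# Toy "labeled scene + spatial query" sketch of a segmentation-backed search.
import math

# Pretend segmentation output: object label + (x, y) position in km.
scene = [
    {"label": "pool",    "pos": (1.2, 0.8)},
    {"label": "pool",    "pos": (6.5, 2.0)},
    {"label": "pothole", "pos": (0.4, 0.3)},
    {"label": "tree",    "pos": (2.1, 4.9)},
]

def query(scene, label, center, radius_km):
    """Return detected objects of a given label within radius_km of center."""
    cx, cy = center
    return [
        obj for obj in scene
        if obj["label"] == label
        and math.hypot(obj["pos"][0] - cx, obj["pos"][1] - cy) <= radius_km
    ]

# "All residential pools within five miles (~8 km) of the origin."
pools = query(scene, "pool", (0.0, 0.0), 8.0)
print(f"Found {len(pools)} pools")
```

In a real system the natural-language question would first be parsed into such a structured query, and the scene would come from the AI segmentation of the 3D reconstruction rather than a hand-written list.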
The PRISM platform’s interface showcasing a 3D digital twin of California State Polytechnic University, Pomona, with AI-powered search and insights.
      “It’s great being able to push the state of the art in this relatively new domain of radiance fields, evolving it from research to applications that can impact common tasks. From large sets of images, PRISM creates detailed 3D captures that embed more information than the source pictures.” — Maximum Wilder-Smith, Chief Technology Officer, PRISM Intelligence
      Currently the PRISM platform uses proprietary data gathered from aerial imagery over selected areas. PRISM then generates high-resolution digital twins of cities in select regions. The team is aiming to eventually expand the platform to use NASA Earth science data and commercial data, which will enable high-resolution data capture over larger areas, significantly increasing efficiency, coverage, and update frequency. PRISM aims to use the detailed multiband imagery that NASA provides and the high-frequency data that commercial companies provide to make geospatial intelligence more accessible by providing fast, reliable, and up-to-date insights that can be used across multiple industries.
      What sets PRISM apart is its focus on usability. While traditional GIS platforms require specialized training to use, PRISM eliminates these barriers by allowing users to interact with geospatial data through a frictionless, conversational interface.
      The impact of this technology could extend across multiple industries. Professionals in the insurance and appraisal industries have informed the company how the ability to generate precise, 3D assessments of properties could streamline risk evaluations, reduce costs, and improve accuracy—replacing outdated or manual site visits. Similarly, local governments have indicated they could potentially use PRISM to better manage infrastructure, track zoning compliance, and allocate resources based on real-time, high-resolution urban insights. Additionally, scientists could use the consistent updates and layers of three-dimensional data that PRISM can provide to better understand changes to ecosystems and vegetation.
      As PRISM moves forward, the team’s focus remains on scaling its capabilities and expanding its applications. Currently, the team is working to enhance the technical performance of the platform while also adding data sources to enable coverage of more regions. Future iterations will further improve automation of data processing, increasing the speed and efficiency of real-time 3D reconstructions. The team’s goal is to expand access to geospatial insights, ensuring that anyone—from city planners to business owners—can make informed decisions using the best possible data.
PRISM Intelligence founders Zachary Gaines, Hugo Delgado, and Maximum Wilder-Smith in their California State Polytechnic University, Pomona lab, where the company was first formed.
      Last Updated: Apr 21, 2025
      View the full article