PRIME-1 Simulation
-
Similar Topics
-
By NASA
A view inside the sandbox portion of the Crew Health and Performance Analog, where research volunteers participate in simulated walks on the surface of Mars. Credit: NASA
Four research volunteers will soon participate in NASA’s year-long simulation of a Mars mission inside a habitat at the agency’s Johnson Space Center in Houston. This mission will provide NASA with foundational data to inform human exploration of the Moon, Mars, and beyond.
Ross Elder, Ellen Ellis, Matthew Montgomery, and James Spicer enter the 1,700-square-foot Mars Dune Alpha habitat on Sunday, Oct. 19, to begin their mission. The team will live and work like astronauts for 378 days, concluding the mission on Oct. 31, 2026. Emily Phillips and Laura Marie serve as the mission’s alternate crew members.
Through a series of Earth-based missions called CHAPEA (Crew Health and Performance Exploration Analog), carried out in the 3D-printed habitat, NASA aims to evaluate certain human health and performance factors ahead of future Mars missions. The crew will undergo realistic resource limitations, equipment failures, communication delays, isolation and confinement, and other stressors, along with simulated high-tempo extravehicular activities. These scenarios allow NASA to make informed trades between risks and interventions for long-duration exploration missions.
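For a sense of scale on one of those stressors: the communication delays CHAPEA simulates correspond to the one-way light travel time between Earth and Mars, which varies with the planets’ separation. The sketch below is purely illustrative arithmetic with round-number distances, not part of NASA’s simulation software:

```python
# One-way light-time delay between Earth and Mars (illustrative).
C_KM_S = 299_792.458   # speed of light, km/s
AU_KM = 149_597_870.7  # one astronomical unit, km

def one_way_delay_minutes(distance_au: float) -> float:
    """Light travel time for a one-way signal, in minutes."""
    return distance_au * AU_KM / C_KM_S / 60.0

# Earth-Mars separation spans roughly 0.38 AU (closest) to 2.67 AU (farthest),
# so a single question-and-answer exchange can take the better part of an hour.
for label, d_au in [("closest", 0.38), ("farthest", 2.67)]:
    print(f"{label}: ~{one_way_delay_minutes(d_au):.1f} min one-way")
```

At closest approach the delay is a few minutes each way; near superior conjunction it exceeds twenty, which is why analog crews must work through problems without real-time support from mission control.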
“As NASA gears up for crewed Artemis missions, CHAPEA and other ground analogs are helping to determine which capabilities could best support future crews in overcoming the human health and performance challenges of living and operating beyond Earth’s resources – all before we send humans to Mars,” said Sara Whiting, project scientist with NASA’s Human Research Program at NASA Johnson.
Crew members will carry out scientific research and operational tasks, including simulated Mars walks, growing a vegetable garden, robotic operations, and more. Technologies specifically designed for Mars and deep space exploration will also be tested, including a potable water dispenser and diagnostic medical equipment.
“The simulation will allow us to collect cognitive and physical performance data to give us more insight into the potential impacts of the resource restrictions and long-duration missions to Mars on crew health and performance,” said Grace Douglas, CHAPEA principal investigator. “Ultimately, this information will help NASA make informed decisions to design and plan for a successful human mission to Mars.”
This mission, facilitated by NASA’s Human Research Program, is the second one-year Mars surface simulation conducted through CHAPEA. The first mission concluded on July 6, 2024.
The Human Research Program pursues methods and technologies to support safe, productive human space travel. Through applied research conducted in laboratories, simulations, and aboard the International Space Station, the program investigates the effects spaceflight has on human bodies and behaviors to keep astronauts healthy and mission-ready.
Primary Crew
Ross Elder, Commander
Ross Elder, from Williamstown, West Virginia, is a major and experimental test pilot in the United States Air Force. At the time of his selection, he served as the director of operations of the 461st Flight Test Squadron. He has piloted over 35 military aircraft and accumulated more than 1,800 flying hours, including 200 combat hours, primarily in the F-35, F-15E/EX, F-16, and A-10C. His flight test experience focuses on envelope expansion, crewed-uncrewed teaming, artificial intelligence, autonomy, mission systems, and weapons modernization.
Elder earned a Bachelor of Science in astronautical engineering from the U.S. Air Force Academy in Colorado Springs, Colorado, and commissioned as an Air Force officer upon graduation. He earned a Master of Science in mechanical engineering from the University of Colorado in Colorado Springs and a master’s degree in flight test engineering from the U.S. Air Force Test Pilot School at Edwards Air Force Base in California.
Ellen Ellis, Medical Officer
Ellen Ellis, from North Kingstown, Rhode Island, is a colonel and an acquisitions officer in the United States Space Force. She currently serves as a senior materiel leader in the National Reconnaissance Office (NRO) Communications Systems Directorate. She is responsible for fielding commercial cloud and traditional information technology hosting solutions and building modernized data centers for the NRO. She previously served as an Intercontinental Ballistic Missile operations officer and GPS satellite engineer, and she also developed geospatial intelligence payloads and ground processing systems.
She earned a Bachelor of Science in aerospace engineering at Syracuse University in New York and holds four master’s degrees, including a Master of Science in systems engineering from the Naval Postgraduate School in California, and a Master of Science in emergency and disaster management from Georgetown University in Washington.
Matthew Montgomery, Science Officer
Matthew Montgomery, from Los Angeles, is a hardware engineering design consultant who works with technology startup companies to develop, commercialize, and scale their products. His focus areas include LED lighting, robotics, controlled environment agriculture, and embedded control systems.
Montgomery earned a Bachelor of Science and a Master of Science in electrical engineering from the University of Central Florida. He is also a founder and co-owner of Floating Lava Studios, a film production company based in Los Angeles.
James Spicer, Flight Engineer
James Spicer is a technical director in the aerospace and defense industry. His experience includes building radio and optical satellite communications networks; space data relay networks for human spaceflight; position, navigation, and timing research; and hands-on spacecraft design, integration, and testing.
Spicer earned a Bachelor of Science and Master of Science in aeronautics and astronautics, and holds a Notation in Science Communication from Stanford University in California. He also holds commercial pilot and glider pilot licenses.
Alternate Crew
Emily Phillips
Emily Phillips, from Waynesburg, Pennsylvania, is a captain and pilot in the United States Marine Corps. She currently serves as a forward air controller and air officer attached to an infantry battalion stationed at the Marine Corps Air Ground Combat Center in Twentynine Palms, California.
Phillips earned a Bachelor of Science in computer science from the U.S. Naval Academy in Annapolis and commissioned as a Marine Corps officer upon graduation. She attended flight school, earning her Naval Aviator wings and qualifying as an F/A-18C Hornet pilot. Phillips has completed multiple deployments to Europe and Southeast Asia.
Laura Marie
Born in the United Kingdom, Laura Marie immigrated to the U.S. in 2016. She is a commercial airline pilot specializing in flight safety, currently operating passenger flights in Washington.
Marie began her aviation career in 2019 and has amassed over 2,800 flight hours. She holds a Bachelor of Arts in philosophy and a Master of Science in aeronautics from Liberty University in Lynchburg, Virginia. In addition to her Airline Transport Pilot License, she also possesses flight instructor and advanced ground instructor licenses. Outside the flight deck, Marie dedicates her time to mentoring and supporting aspiring pilots as they navigate their careers.
View the full article
-
By NASA
Damian Hischier of the National Test Pilot School in Mojave, California, takes part in testing of a virtual reality-infused pilot simulation in the Vertical Motion Simulator (VMS) at NASA’s Ames Research Center in California’s Silicon Valley on May 30, 2025. Credit: NASA/Brandon Torres-Navarrete
Commercial companies and government agencies are increasingly pursuing a more immersive and affordable alternative to the conventional displays currently used in flight simulators. A NASA research project is working on ways to make this technology available for use faster.
Mixed reality systems, in which users interact with physical simulators while wearing virtual reality headsets, offer a promising path forward for pilot training. But only limited standards currently exist for allowing their use, as regulators have little to no data on how these systems perform. To address this, NASA’s Ames Research Center in California’s Silicon Valley invited a dozen pilots to participate in a study testing how a mixed-reality flight simulation performs in the world’s largest flight simulator.
“For the first time, we’re collecting real data on how this type of mixed reality simulation performs in the highest-fidelity vertical motion simulator,” said Peter Zaal, a principal systems architect at Ames. “The more we understand about how these systems affect pilot performance, the closer we are to providing a safer, cost-effective training tool to the aviation community that could benefit everyone from commercial airlines to future air taxi operators.”
A National Test Pilot School student observes the mixed-reality pilot simulation in the VMS at Ames on May 30, 2025. Credit: NASA/Brandon Torres-Navarrete
Mixed reality blends the physical and digital worlds, allowing users to see physical items while viewing a desired simulated environment. Flight simulators employing this technology through a headset or a similar setup could offer pilots training for operating next-generation aircraft at a reduced cost and within a smaller footprint than more traditional flight simulators. This is because pilots could rely more heavily on the visuals provided through the headset instead of large embedded visual displays in a physical motion simulator.
During the testing, which ran May 23-30, pilots donned a headset through which they could see the physical displays and control sticks inside the VMS cab, along with a virtual cockpit overlay of an electric vertical takeoff and landing vehicle. When the pilots looked toward their windscreens, they saw a virtual view of San Francisco and the surrounding area.
Pilots performed three typical flight maneuvers under four sets of motion conditions. Afterward, they were asked to provide feedback on their level of motion sickness while using the head-mounted display and how well the simulator replicated the same movements the aircraft would make during a real flight.
An initial analysis of the study shows pilots reported lower ratings of motion sickness than NASA researchers expected. Many shared that the mixed-reality setup inside the VMS felt more realistic and fluid than previous simulator setups they had tested.
As part of the test, Ames hosted members of the Federal Aviation Administration Civil Aerospace Medical Institute, which studies factors that influence human performance in aerospace. Pilots from the National Test Pilot School attended a portion of the testing and, independent from the study, evaluated the head-mounted display’s “usable cue environment,” or representation of the visual cues pilots rely on to control an aircraft.
Peter Zaal (left) observes as Samuel Ortho (middle) speaks with a National Test Pilot School student during the mixed-reality pilot simulation in the Vertical Motion Simulator at Ames on May 30, 2025.
NASA will make the test results available to the public and the aviation community early next year. This first-of-its-kind testing, funded by an Ames Innovation Fair Grant and managed by the center’s Aviation Systems Division, paves the way for potential use of this technology in the VMS for future aviation and space missions.
View the full article
-
By NASA
Credit: NASA NASA has awarded a contract to MacLean Engineering & Applied Technologies, LLC of Houston to provide simulation and advanced software services to the agency.
The Simulation and Advanced Software Services II (SASS II) contract includes services from Oct. 1, 2025, through Sept. 30, 2030, with a maximum potential value not to exceed $150 million. The contract is a single-award, indefinite-delivery/indefinite-quantity contract with the capability to issue cost-plus-fixed-fee and firm-fixed-price task orders.
Under the five-year SASS II contract, the awardee is tasked to provide simulation and software services for space-based vehicle models and robotic manipulator systems; human biomechanical representations for analysis and development of countermeasure devices; guidance, navigation, and control of space-based vehicles for all flight phases; and space-based vehicle onboard computer system simulations of flight software systems. Responsibilities also include astronomical object surface interaction simulation of space-based vehicles, graphics support for simulation visualization and engineering analysis, and ground-based and onboard systems to support human-in-the-loop training.
Major subcontractors include Tietronix Software Inc. in Houston and VEDO Systems, LLC, in League City, Texas.
For information about NASA and agency programs, visit:
https://www.nasa.gov/
-end-
Tiernan Doyle
Headquarters, Washington
202-358-1600
tiernan.doyle@nasa.gov
Chelsey Ballarte
Johnson Space Center, Houston
281-483-5111
Chelsey.n.ballarte@nasa.gov
Details
Last Updated: Jul 02, 2025
Location: NASA Headquarters
Related Terms: Technology, Johnson Space Center
View the full article
-
By NASA
If you design a new tool for use on Earth, it is easy to test and practice using that tool in its intended environment. But what if that tool is destined for lunar orbit or will be used by astronauts on the surface of the Moon?
NASA’s Simulation and Graphics Branch can help with that. Based at Johnson Space Center in Houston, the branch’s high-fidelity, real-time graphical simulations support in-depth engineering analyses and crew training, ensuring the safety, efficiency, and success of complex space endeavors before execution. The team manages multiple facilities that provide these simulations, including the Prototype Immersive Technologies (PIT) Lab, Virtual Reality Training Lab, and the Systems Engineering Simulator (SES).
Lee Bingham is an aerospace engineer on the simulation and graphics team. His work includes developing simulations and visualizations for the NASA Exploration Systems Simulations team and providing technical guidance on simulation and graphics integration for branch-managed facilities. He also leads the branch’s human-in-the-loop Test Sim and Graphics Team, the Digital Lunar Exploration Sites Unreal Simulation Tool (DUST), and the Lunar Surface Mixed-Reality with the Active Response Gravity Offload System (ARGOS) projects.
Lee Bingham demonstrates a spacewalk simulator for the Gateway lunar space station during NASA’s Tech Day on Capitol Hill in Washington, D.C. Image courtesy of Lee Bingham
Bingham is particularly proud of his contributions to DUST, which provides a 3D visualization of the Moon’s South Pole and received Johnson’s Exceptional Software of the Year Award in 2024. “It was designed for use as an early reference to enable candidate vendors to perform initial studies of the lunar terrain and lighting in support of the Strategy and Architecture Office, human landing system, and the Extravehicular Activity and Human Surface Mobility Program,” Bingham explained. DUST has supported several human-in-the-loop studies for NASA. It has also been shared with external collaborators and made available to the public through the NASA Software Catalog.
Bingham has kept busy during his nearly nine years at Johnson and said learning to manage and balance support for multiple projects and customers was very challenging at first. “I would say ‘yes’ to pretty much anything anyone asked me to do and would end up burning myself out by working extra-long hours to meet milestones and deliverables,” he said. “It has been important to maintain a good work-life balance and avoid overcommitting myself while meeting demanding expectations.”
Lee Bingham tests the Lunar Surface Mixed Reality and Active Response Gravity Offload System trainer at Johnson Space Center. Image courtesy of Lee Bingham
Bingham has also learned the importance of teamwork and collaboration. “You can’t be an expert at everything or do everything yourself,” he said. “Develop your skills, practice them regularly, and master them over time but be willing to ask for help and advice. And be sure to recognize and acknowledge your coworkers and teammates when they go above and beyond or achieve something remarkable.”
Lee Bingham (left) demonstrates a lunar rover simulator for Apollo 16 Lunar Module Pilot Charlie Duke. Image courtesy of Lee Bingham
He hopes that the Artemis Generation will be motivated to tackle difficult challenges and further NASA’s mission to benefit humanity. “Be sure to learn from those who came before you, but be bold and unafraid to innovate,” he advised.
View the full article
-
By European Space Agency
Video: 00:02:43
On 12 March 2025, ESA’s Hera spacecraft for planetary defence performs a flyby of Mars. The gravity of the red planet shifts the spacecraft’s trajectory towards the Didymos binary asteroid system, shortening its trip by months and saving substantial fuel.
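The amount of bending a planet’s gravity imparts during a flyby can be estimated with the standard hyperbolic-flyby relation: the trajectory’s eccentricity follows from the periapsis radius and the arrival speed, and the turn angle from the eccentricity. The sketch below uses Hera’s quoted ~5000 km Mars altitude but an assumed round-number hyperbolic excess speed of 3 km/s, which is not a figure from the mission:

```python
import math

MU_MARS = 4.282837e4  # Mars gravitational parameter, km^3/s^2
R_MARS = 3389.5       # mean Mars radius, km

def turn_angle_deg(periapsis_alt_km: float, v_inf_km_s: float) -> float:
    """Deflection of a hyperbolic flyby: delta = 2*asin(1/e),
    where eccentricity e = 1 + r_p * v_inf^2 / mu."""
    r_p = R_MARS + periapsis_alt_km
    e = 1.0 + r_p * v_inf_km_s**2 / MU_MARS
    return math.degrees(2.0 * math.asin(1.0 / e))

# ~5000 km altitude from the article; 3 km/s excess speed is an assumption.
print(f"turn angle ~ {turn_angle_deg(5000.0, 3.0):.1f} deg")
```

The same formula shows why flyby geometry is a trade-off: a faster arrival (larger excess speed) raises the eccentricity and so bends the trajectory less for the same closest-approach distance.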
This is a simulation of that flyby, sped up 500 times, with closest approach to the Martian moon Deimos taking place at 12:07 GMT and to Mars at 12:51 GMT. It was made using SPICE (Spacecraft, Planet, Instrument, C-matrix, Events) software. Produced by a team at ESA’s European Space Astronomy Centre (ESAC), this SPICE visualisation is used to plan instrument acquisitions during Hera’s flyby.
Hera comes to around 5000 km from the surface of Mars during its flyby. It also images Deimos, the smaller of Mars’s two moons, from a minimum 1000 km away (while venturing as close as 300 km). Hera likewise images Mars’s larger moon Phobos as it begins to move away from Mars. In this sped-up simulation, Deimos is seen 30 seconds in, at 12:07 GMT, while the more distant, star-like Phobos becomes visible at two minutes in, at 12:49 GMT.
The spacecraft employs three of its instruments over the course of these close encounters, all located together on the ‘Asteroid Deck’ on top of Hera:
Hera’s Asteroid Framing Camera is formed of two redundant 1020×1020 pixel monochromatic visible-light cameras, used for both navigation and science.
The Thermal Infrared Imager, supplied by the Japanese Aerospace Exploration Agency, JAXA, images at mid-infrared wavelengths to determine surface temperatures.
Hera’s Hyperscout H is a hyperspectral imager, observing in 25 visible and near-infrared spectral bands to prospect surface minerals.
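As a rough illustration of what a framing camera of that detector size resolves at these ranges: the pixel scale follows from the field of view and the distance. The field of view used below (about 5.5 degrees) is an assumption for illustration, not a figure given in the text above:

```python
import math

PIXELS = 1020   # detector width from the instrument description above
FOV_DEG = 5.5   # assumed field of view -- not stated in the article

def pixel_scale_m(range_km: float) -> float:
    """Approximate distance spanned by one pixel at a given range, in metres."""
    swath_km = 2.0 * range_km * math.tan(math.radians(FOV_DEG / 2.0))
    return swath_km * 1000.0 / PIXELS

# At the ~1000 km minimum Deimos range quoted above, each pixel covers
# on the order of a hundred metres of the moon's surface.
print(f"~{pixel_scale_m(1000.0):.0f} m per pixel")
```

Under these assumptions the camera resolves features of roughly 100 m at Deimos range, which scales linearly: halving the distance halves the pixel footprint.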
Did you know this mission has its own AI? You can pose questions to our Hera Space Companion!
View the full article
-