
The Davis Wilson Notes: Secret reverse-engineering program pertaining to extraterrestrial technology


There has been renewed discussion lately regarding the Davis-Wilson notes, which detail a meeting held in 2002 that discussed a deeply secret reverse-engineering program pertaining to extraterrestrial technology. 


Richard Dolan discusses the latest developments while also reviewing the notes themselves, his own personal connection to them, the arguments supporting their authenticity, and most importantly the implications. 

Link to the Davis Wilson Notes: https://www.congress.gov/117/meeting/house/114761/documents/HHRG-117-IG05-20220517-SD001.pdf

 

View the full article


  • Similar Topics

    • By NASA
      4 min read
      May’s Night Sky Notes: How Do We Find Exoplanets?
      Astronomers have been trying to discover evidence that worlds exist around stars other than our Sun since the 19th century. By the mid-1990s, technology finally caught up with the desire for discovery and led to the first detection of a planet orbiting another Sun-like star, 51 Pegasi b. Why did it take so long to discover these distant worlds, and what techniques do astronomers use to find them?
      The Transit Method
      A planet passing in front of its parent star creates a drop in the star’s apparent brightness, called a transit. Exoplanet Watch participants can look for transits in data from ground-based telescopes, helping scientists refine measurements of the length of a planet’s orbit around its star. Credit: NASA’s Ames Research Center
      One of the most famous exoplanet detection methods is the transit method, used by Kepler and other observatories. When a planet crosses in front of its host star, the light from the star dips slightly in brightness. Scientists can confirm a planet orbits its host star by repeatedly detecting these incredibly tiny dips in brightness using sensitive instruments. If you can imagine trying to detect the dip in light from a massive searchlight when an ant crosses in front of it, from tens of miles away, you can begin to see how difficult it can be to spot a planet from light-years away! Another drawback to the transit method is that the distant solar system must be at a favorable angle to our point of view here on Earth – if the distant system’s angle is just slightly askew, there will be no transits. Even in our solar system, a transit is very rare. For example, there were two transits of Venus visible across our Sun from Earth in this century. But the next time Venus transits the Sun as seen from Earth will be in the year 2117 – more than a century after the 2012 transit, even though Venus will have completed nearly 150 orbits around the Sun by then!
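      The dip is so hard to detect because its fractional depth scales as the square of the planet-to-star radius ratio. A minimal Python sketch of the idea (the radii are real, but the light curve and detection threshold below are illustrative, not mission data):

```python
# Fractional transit depth: (planet radius / star radius)^2
def transit_depth(r_planet_km, r_star_km):
    return (r_planet_km / r_star_km) ** 2

# Jupiter crossing the Sun dims it by about 1%;
# Earth crossing the Sun dims it by less than 0.01%.
jupiter_dip = transit_depth(69_911, 696_340)
earth_dip = transit_depth(6_371, 696_340)

# Flag samples in a normalized light curve that dip below a threshold
def find_transit_points(flux, threshold=0.995):
    return [i for i, f in enumerate(flux) if f < threshold]

flux = [1.0, 1.0, 0.99, 0.99, 1.0, 1.0]  # toy light curve
print(find_transit_points(flux))         # -> [2, 3]
```

      Real surveys repeat this over many orbits: a dip of the same depth recurring at a fixed period is what distinguishes a planet from noise.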
      The Wobble Method
      As a planet orbits a star, the star wobbles. This causes a change in the appearance of the star’s spectrum called Doppler shift. Because the change in wavelength is directly related to relative speed, astronomers can use Doppler shift to calculate exactly how fast an object is moving toward or away from us. Astronomers can also track the Doppler shift of a star over time to estimate the mass of the planet orbiting it. Credit: NASA, ESA, CSA, Leah Hustak (STScI)
      Spotting the Doppler shift of a star’s spectrum is how astronomers found 51 Pegasi b, the first planet detected around a Sun-like star. This technique is called the radial velocity or “wobble” method. Astronomers split up the visible light emitted by a star into a rainbow. These spectra, and the gaps between the normally smooth bands of light, help determine the elements that make up the star. However, if there is a planet orbiting the star, it causes the star to wobble ever so slightly back and forth. This, in turn, causes the lines within the spectrum to shift ever so slightly towards the red and blue ends of the spectrum as the star wobbles away from and towards us. These are the red and blue shifts of the star’s light. By carefully measuring the amount of shift in the star’s spectrum, astronomers can determine the size of the object pulling on the host star and whether the companion is indeed a planet. By tracking the variation in this periodic shift, they can also determine the time it takes the planet to orbit its parent star.
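      The size of the shift maps directly to a speed through the non-relativistic Doppler relation v = c · Δλ/λ. A short Python illustration of just how small these shifts are (the ~50 m/s wobble amplitude for a 51 Pegasi b-like planet is approximate):

```python
C = 299_792_458.0  # speed of light in m/s

def radial_velocity(rest_wavelength_nm, observed_wavelength_nm):
    """Non-relativistic Doppler shift: v = c * (delta_lambda / lambda_rest).
    Positive means the star is receding (redshift), negative approaching."""
    shift = observed_wavelength_nm - rest_wavelength_nm
    return C * shift / rest_wavelength_nm

# A stellar wobble of ~50 m/s (roughly what a 51 Pegasi b-like planet
# induces on its star) moves a 500 nm spectral line by less than 0.0001 nm,
# which is why extremely stable spectrographs are needed.
v = radial_velocity(500.0, 500.0000834)
print(round(v, 1))  # ~50.0 m/s
```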
      Direct Imaging
      Finally, exoplanets can be revealed by directly imaging them, as with the four planets found orbiting the star HR 8799! Space telescopes use instruments called coronagraphs to block the bright light from the host star and capture the dim light from planets. The Hubble Space Telescope has captured images of giant planets orbiting a few nearby systems, and the James Webb Space Telescope has only improved on these observations by uncovering more details, such as the colors, spectra, and temperatures of exoplanet atmospheres, by detecting potential exomoons, and even by scanning atmospheres for potential biosignatures!
      NASA’s James Webb Space Telescope has provided the clearest look in the infrared yet at the iconic multi-planet system HR 8799. The closest planet to the star, HR 8799 e, orbits 1.5 billion miles from its star, which in our solar system would place it between the orbits of Saturn and Neptune. The furthest, HR 8799 b, orbits around 6.3 billion miles from the star, more than twice Neptune’s orbital distance. Colors are applied to filters from Webb’s NIRCam (Near-Infrared Camera), revealing their intrinsic differences. A star symbol marks the location of the host star HR 8799, whose light has been blocked by the coronagraph. In this image, the color blue is assigned to 4.1 micron light, green to 4.3 micron light, and red to 4.6 micron light. Credit: NASA, ESA, CSA, STScI, W. Balmer (JHU), L. Pueyo (STScI), M. Perrin (STScI)
      You can find more information and activities on NASA’s Exoplanets page, such as the Eyes on Exoplanets browser-based program, The Exoplaneteers, and some of the latest exoplanet news. Lastly, you can find more resources in our News & Resources section, including a clever demo on how astronomers use the wobble method to detect planets!
      The future of exoplanet discovery is only just beginning, promising rich rewards in humanity’s understanding of our place in the Universe, where we are from, and if there is life elsewhere in our cosmos.
      Originally posted by Dave Prosper: July 2015
      Last Updated by Kat Troche: April 2025
    • By NASA
      Landing on the Moon is not easy, particularly when a crew or spacecraft must meet exacting requirements. For Artemis missions to the lunar surface, those requirements include an ability to land within an area about as wide as a football field in any lighting condition amid tough terrain.

      NASA’s official lunar landing requirement is to be able to land within 50 meters (164 feet) of the targeted site, and developing precision tools and technologies is critically important to mission success.

      NASA engineers recently took a major step toward safe and precise landings on the Moon – and eventually Mars and icy worlds – with a successful field test of hazard detection technology at NASA’s Kennedy Space Center Shuttle Landing Facility in Florida.

      A joint team from the Aeroscience and Flight Mechanics Division at NASA’s Johnson Space Center in Houston and Goddard Space Flight Center in Greenbelt, Maryland, achieved this huge milestone in tests of the Goddard Hazard Detection Lidar from a helicopter at Kennedy in March 2025.

      NASA’s Hazard Detection Lidar field test team at Kennedy Space Center’s Shuttle Landing Facility in Florida in March 2025. Credit: NASA
      The new lidar system is one of several sensors being developed as part of NASA’s Safe & Precise Landing – Integrated Capabilities Evolution (SPLICE) Program, a Johnson-managed cross-agency initiative under the Space Technology Mission Directorate to develop next-generation landing technologies for planetary exploration. SPLICE is an integrated descent and landing system composed of avionics, sensors, and algorithms that support specialized navigation, guidance, and image processing techniques. SPLICE is designed to enable landing in hard-to-reach and unknown areas that are of potentially high scientific interest.

      The lidar system, which can map an area equivalent to two football fields in just two seconds, is a crucial program component. Compensating for lander motion in real time, it processes 15 million short pulses of laser light to quickly scan surfaces and create 3D maps of landing sites to support precision landing and hazard avoidance.

      Those maps will be read by the SPLICE Descent and Landing Computer, a high-performance multicore computer processor unit that analyzes all SPLICE sensor data and determines the spacecraft’s velocity and altitude. It also identifies terrain hazards and determines a safe landing location. The computer was developed by the Avionics Systems Division at Johnson as a platform to test navigation, guidance, and flight software. It previously flew on Blue Origin’s New Shepard booster rocket.
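      NASA has not published the SPLICE hazard algorithms, but the general idea of screening an elevation map for slope and step hazards can be sketched in a few lines of Python (the grid values and thresholds below are invented for illustration only):

```python
import math

def cell_hazards(grid, spacing_m=1.0, max_slope_deg=10.0, max_step_m=0.3):
    """Flag interior cells of an elevation grid (meters) whose local slope
    or step height relative to neighbors exceeds the given limits."""
    hazards = set()
    for r in range(1, len(grid) - 1):
        for c in range(1, len(grid[0]) - 1):
            # Central-difference slope estimate
            dzdx = (grid[r][c + 1] - grid[r][c - 1]) / (2 * spacing_m)
            dzdy = (grid[r + 1][c] - grid[r - 1][c]) / (2 * spacing_m)
            slope = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
            # Largest height step to any 4-connected neighbor
            neighbors = [grid[r-1][c], grid[r+1][c], grid[r][c-1], grid[r][c+1]]
            step = max(abs(grid[r][c] - n) for n in neighbors)
            if slope > max_slope_deg or step > max_step_m:
                hazards.add((r, c))
    return hazards

grid = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.0],   # a 0.5 m "boulder" at cell (1, 2)
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
print(cell_hazards(grid))
```

      A flight system would run a screen like this over the lidar-derived elevation map and then pick the safe region closest to the target, which is roughly the role the maps and computer described above play together.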

      The NASA team prepares the Descent and Landing Computer for Hazard Detection Lidar field testing at Kennedy Space Center. Credit: NASA
      For the field test at Kennedy, Johnson led test operations and provided avionics and guidance, navigation, and control support. Engineers updated the computer’s firmware and software to support command and data interfacing with the lidar system. Team members from Johnson’s Flight Mechanics branch also designed a simplified motion compensation algorithm, and NASA’s Jet Propulsion Laboratory in Southern California contributed a hazard detection algorithm; both were added to the lidar software by Goddard. NASA contractors Draper Laboratories and Jacobs Engineering also played key roles in the test’s success.

      Primary flight test objectives were achieved on the first day of testing, allowing the lidar team time to explore different settings and firmware updates to improve system performance. The data confirmed the sensor’s capability in a challenging, vibration-heavy environment, producing usable maps. Preliminary review of the recorded sensor data shows excellent reconstruction of the hazard field terrain.

      A Hazard Detection Lidar scan of a simulated hazard field at Kennedy Space Center (left) and a combined 3D map identifying roughness and slope hazards. Credit: NASA
      Beyond lunar applications, SPLICE technologies are being considered for use on Mars Sample Return, the Europa Lander, Commercial Lunar Payload Services flights, and Gateway. The Descent and Landing Computer design is also being evaluated for potential avionics upgrades on Artemis systems.

      Additionally, SPLICE is supporting software tests for the Advancement of Geometric Methods for Active Terrain Relative Navigation (ATRN) Center Innovation Fund project, which is also part of Johnson’s Aeroscience and Flight Mechanics Division. The ATRN is working to develop algorithms and software that can use data from any active sensor – one measuring signals that were reflected, refracted, or scattered by a body’s surface or its atmosphere – to accurately map terrain and provide absolute and relative location information. With this type of system in place, spacecraft will not need external lighting sources to find landing sites.

      With additional suborbital flight tests planned through 2026, the SPLICE team is laying the groundwork for safer, more autonomous landings on the Moon, Mars, and beyond. As NASA prepares for its next era of exploration, SPLICE will be a key part of the agency’s evolving landing, guidance, and navigation capabilities.
    • By NASA
      Intuitive Machines’ IM-2 lander captured an image on March 6, 2025, after landing in a crater near the Moon’s South Pole. The lunar lander is on its side near the intended landing site, Mons Mouton. In the center of the image between the two lander legs is the Polar Resources Ice Mining Experiment 1 suite, which shows the drill deployed. Credit: Intuitive Machines
      NASA’s PRIME-1 (Polar Resources Ice Mining Experiment 1) mission was designed to demonstrate technologies to help scientists better understand lunar resources ahead of crewed Artemis missions to the Moon. During the short-lived mission on the Moon, the performance of PRIME-1’s technology gave NASA teams reason to celebrate.
      “The PRIME-1 mission proved that our hardware works in the harshest environment we’ve ever tested it in,” said Janine Captain, PRIME-1 co-principal investigator and research chemist at NASA’s Kennedy Space Center in Florida. “While it may not have gone exactly to plan, this is a huge step forward as we prepare to send astronauts back to the Moon and build a sustainable future there.” 
      Intuitive Machines’ IM-2 mission launched to the Moon on Feb. 26, 2025, from NASA Kennedy’s Launch Complex 39A, as part of the company’s second Moon delivery for NASA under the agency’s CLPS (Commercial Lunar Payload Services) initiative and Artemis campaign. The IM-2 Nova-C lunar lander, named Athena, carried PRIME-1 and its suite of two instruments: a drill known as TRIDENT (The Regolith and Ice Drill for Exploring New Terrain), designed to bring lunar soil to the surface; and a mass spectrometer, Mass Spectrometer Observing Lunar Operations (MSOLO), to study TRIDENT’s drill cuttings for the presence of gases that could one day help provide propellant or breathable oxygen to future Artemis explorers.  
      The IM-2 mission touched down on the lunar surface on March 6, just around 1,300 feet (400 meters) from its intended landing site of Mons Mouton, a lunar plateau near the Moon’s South Pole. The Athena lander came to rest on its side inside a crater, preventing it from recharging its solar cells and bringing the mission to an early end.
      “We were supposed to have 10 days of operation on the Moon, and what we got was closer to 10 hours,” said Julie Kleinhenz, NASA’s lead systems engineer for PRIME-1, as well as the in-situ resource utilization system capability lead deputy for the agency. “It was 10 hours more than most people get so I am thrilled to have been a part of it.” 
      Kleinhenz has spent nearly 20 years working on how to use lunar resources for sustained operations. In-situ resource utilization harnesses local natural resources at mission destinations. This enables fewer launches and resupply missions and significantly reduces the mass, cost, and risk of space exploration. With NASA poised to send humans back to the Moon and on to Mars, generating products for life support, propellants, construction, and energy from local materials will become increasingly important to future mission success.  
      “In-situ resource utilization is the key to unlocking long-term exploration, and PRIME-1 is helping us lay this foundation for future travelers,” Captain said.
      The PRIME-1 technology also set out to answer questions about the properties of lunar regolith, such as soil strength. This data could help inform the design of in-situ resource utilization systems that would use local resources to create everything from landing pads to rocket fuel during Artemis and later missions.  
      “Once we got to the lunar surface, TRIDENT and MSOLO both started right up and performed perfectly. From a technology demonstration standpoint, 100% of the instruments worked,” Kleinhenz said.
      The lightweight, low-power augering drill built by Honeybee Robotics, known as TRIDENT, is 1 meter long and features rotary and percussive actuators that convert energy into the force needed to drill. The drill was designed to stop at any depth as commanded from the ground and deposit its sample on the surface for analysis by MSOLO, a commercial off-the-shelf mass spectrometer modified by engineers and technicians at NASA Kennedy to withstand the harsh lunar environment. Designed to measure the composition of gases in the vicinity of the lunar lander, both from the lander and from the ambient exosphere, MSOLO can help NASA analyze the chemical makeup of the lunar soil and study water on the surface of the Moon.  
      Once on the Moon, the actuators on the drill performed as designed, completing multiple stages of movement necessary to drill into the lunar surface. Prompted by commands from technicians on Earth, the auger rotated, the drill extended to its full range, the percussion system performed a hammering motion, and the PRIME-1 team turned on an embedded core heater in the drill and used internal thermal sensors to monitor the temperature change.
      While MSOLO was able to perform several scans to detect gases, researchers believe from the initial data that the gases detected were all anthropogenic, or human in origin, such as gases vented from spacecraft propellants and traces of Earth water. Data from PRIME-1 accounted for some of the approximately 7.5 gigabytes of data collected during the IM-2 mission, and researchers will continue to analyze the data in the coming months and publish the results.
    • By NASA
      Multinational corporations are using the M2M Intelligence platform in data centers and other settings. The system offers automated, secure communications on a ground-based global 5G network. Credit: Getty Images
      Artificial intelligence (AI) is advancing rapidly, as intelligent software proves capable of various tasks. The technology usually requires a “human in the loop” to train it and ensure accuracy. But long before the arrival of today’s generative artificial intelligence, a different kind of AI was born with the help of NASA’s Ames Research Center in California’s Silicon Valley – one that only exists between machines, running without any human intervention.

      In 2006, Geoffrey Barnard founded Machine-to-Machine Intelligence Corp. (M2Mi) at Ames’ NASA Research Park, envisioning an automated, satellite-based communication network. NASA Ames established a Space Act Agreement with the company to develop artificial intelligence that would automate communications, privacy, security, and resiliency between satellites and ground-based computers.

      Central to the technology was automating a problem-solving approach known as root cause analysis, which NASA has honed over decades. This methodology seeks to identify not only the immediate cause of a problem but also all the factors that contributed to the cause. This would allow a network to identify its own issues and fix itself. 

      NASA Ames’ director of nanotechnology at the time wanted to develop a communications network based on small, low-powered satellites, so Ames supported M2Mi in developing the necessary technology. 
      Barnard, now CEO and chief technology officer of the Tiburon, California-based M2Mi, said NASA’s support laid the foundation for his company, which employs the same technology in a ground-based network.
      The company’s M2M Intelligence software performs secure, resilient, automated communications on a system that runs across hundreds of networks, connecting thousands of devices, many of which were not built to communicate with each other. The M2Mi company worked with Vodafone of Berkshire, England, to build a worldwide network across more than 500 smaller networks in over 190 countries. The companies M2M Wireless and TriGlobal have begun using M2M Intelligence for transportation logistics. 
      With NASA’s help, emerging industries are getting the boost they need to rapidly develop technologies to enhance our lives. 
      Last Updated Apr 29, 2025
    • By USH
      Shape-shifting materials are advanced, adaptive materials capable of changing their physical form, embedding sensors and circuits directly into their structure, and even storing energy, all without traditional wiring. Lockheed Martin is at the forefront of developing these futuristic materials, raising questions about the possible extraterrestrial origin of this technology.

      In a previous article, we discussed why suppressed exotic technologies are suddenly being disclosed. One company that frequently comes up in this conversation is Lockheed Martin, the American defense and aerospace giant known for pushing the boundaries of aviation and space innovation. 
      Imagine an aircraft that can grow its own skin, embed sensors into its body, store energy without wires, and even shift its shape mid-flight to adapt to changing conditions. This isn’t science fiction anymore, Lockheed Martin’s cutting-edge research is turning these futuristic concepts into reality. 
      But where is all this coming from? 
      The rapid development and creativity behind Lockheed Martin’s projects raise intriguing questions. Whistleblowers like David Grusch have recently alleged that Lockheed Martin has had access to recovered UFO materials for decades. Supporting this, Don Phillips, a former Lockheed engineer, confirmed years ago that exotic materials have been held and studied by the company since at least the 1950s.
      This suggests that for over half a century, Lockheed has secretly been engaged in researching and reverse-engineering off-world technologies. It's possible that the breakthroughs we’re seeing today are the result of this hidden legacy. Ben Rich, former head of Lockheed’s Skunk Works division, famously hinted at this when he said, "We now have the technology to take ET home." 
      One particularly stunning development involves "smart" materials that behave almost like muscles, allowing aircraft structures to morph in real-time. These materials enable a craft to fine-tune its aerodynamics on the fly, adjusting instantly to turbulence, speed shifts, or mission-specific demands. 
      Lockheed’s innovations go even further. By embedding carbon nanotubes, extremely strong and highly conductive microscopic structures, directly into the material, they have created surfaces that can transfer information and power without traditional wiring. In these next-generation aircraft, the “skin” itself acts as the nervous system, the energy grid, and the sensor network all at once.
      You can only imagine the kinds of technologies that have been developed over the years through the reverse engineering of exotic materials and recovered extraterrestrial craft. Yet, governments and space agencies remain tight-lipped about the existence of advanced alien civilizations, who likely introduced these techniques to Earth unintentionally.