
U.S. Federal agencies that could support the effort to understand UAP include the DoD, the Department of State, the FAA, and the Department of Commerce (DoC), including major agencies within DoC such as NOAA and the National Institute of Standards and Technology, as well as the Bureau of Ocean Energy Management, the DoE, and NSF.

Data on UAP

Status of Existing Data

NASA collects an enormous amount of data using highly calibrated, validated equipment from a variety of environments and domains across the entire Earth. Could NASA bring this same rigorous scientific approach to UAP?

Before we can apply the scientific method to understanding an unusual phenomenon, the relevant data must first meet standards for data-driven approaches. Many such standards have been codified over time, including the FAIR data principles (an acronym for Findability, Accessibility, Interoperability, and Reusability)[1]. We followed these and similar principles when reviewing the current status of data on UAP, and that analysis led to the findings and recommendations in this report.

UAP data are rarely, if ever, collected in a concerted effort to understand the phenomenon; they are usually incidental observations. Often, observations of UAP are made using instruments or sensors that have not been designed or calibrated to detect anomalous objects or to constrain their movement parameters. Metadata (meaning sensor type, manufacturer, noise characteristics, time of acquisition, instrument sensitivity, information about the data storage such as bit-depth, location of the sensor, conditions of the sensor such as temperature, exposure characteristics, and so on) are often absent, making calibration and a thorough understanding of context difficult. Correspondingly, limited information is associated with many of the unresolved UAP reports, even when several reports are accompanied by photographic or videographic evidence.
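
As a concrete illustration of what a complete observation record might contain, the sketch below encodes the metadata categories listed above as a simple data structure and flags the fields missing from a typical incidental report. The field names and example values are illustrative assumptions, not an actual NASA or other agency schema.

    from dataclasses import dataclass, fields
    from typing import Optional

    # Illustrative sketch only: these field names are assumptions drawn from
    # the metadata categories listed above, not an official schema.

    @dataclass
    class ObservationMetadata:
        sensor_type: Optional[str] = None           # e.g. "mid-wave IR camera"
        manufacturer: Optional[str] = None
        noise_characteristics: Optional[str] = None
        acquisition_time_utc: Optional[str] = None
        instrument_sensitivity: Optional[str] = None
        bit_depth: Optional[int] = None
        sensor_location: Optional[str] = None        # latitude/longitude/altitude
        sensor_temperature_c: Optional[float] = None
        exposure_settings: Optional[str] = None

    def missing_fields(meta: ObservationMetadata) -> list[str]:
        """Return the metadata fields that are absent, i.e. the gaps that make
        calibration and interpretation of an observation difficult."""
        return [f.name for f in fields(meta) if getattr(meta, f.name) is None]

    # A typical incidental observation might carry only a time stamp and a sensor type:
    report = ObservationMetadata(sensor_type="visible-light camera",
                                 acquisition_time_utc="2021-07-15T03:12:44Z")
    print(missing_fields(report))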

As a result, existing observations are neither optimized for studying UAP nor suited for systematic scientific analysis.

In addition, much of the data collected by military sensors or intelligence satellites are classified, often because of what the imagery could reveal about U.S. technical capabilities to our adversaries, and not because of what is actually in the images. While such classification is essential for security, it enhances the sense of mystery and conspiracy surrounding UAP and presents an obstacle to scientific inquiry.

For many events, the data and metadata did not enable a conclusive characterization of the size, motion, or nature of the UAP. Yet, where they did, such as in the "GoFast" UAP video, the apparently anomalous behavior of the UAP can often be explained by the motion of the sensor platform[2].
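
To see how platform motion alone can produce apparently anomalous behavior, consider a simple parallax calculation. The numbers below (platform speed, true range, and a misjudged range) are purely hypothetical assumptions, not values from the GoFast analysis; they show how a stationary object can acquire a large apparent speed when the sensor is moving and the object's range is overestimated.

    import math

    # Illustrative numbers only (assumptions, not values from the report):
    # a sensor aircraft flying past a slow or stationary object can make the
    # object appear to streak across the scene if its range is misjudged.

    platform_speed = 120.0      # m/s, assumed sensor-aircraft ground speed
    true_range = 5_000.0        # m, assumed actual distance to the object
    assumed_range = 15_000.0    # m, range a viewer might wrongly infer
    crossing_angle_deg = 90.0   # object abeam of the flight path (worst case)

    # Angular rate of the line of sight caused purely by the platform's motion.
    angular_rate = platform_speed * math.sin(math.radians(crossing_angle_deg)) / true_range

    # Transverse speed a viewer would attribute to the object if they believed
    # it sat at the larger, incorrect range.
    apparent_speed = angular_rate * assumed_range

    print(f"line-of-sight angular rate: {math.degrees(angular_rate):.2f} deg/s")
    print(f"apparent object speed at assumed range: {apparent_speed:.0f} m/s")

Under these assumed values, an object that is not moving at all would appear to cross the scene at roughly 360 m/s simply because its range was overestimated by a factor of three while the camera platform was in motion.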

In contrast, NASA observations are made using well-calibrated instruments that have been designed for their specific use cases. This is how NASA can scientifically approach the study of Earth- and space-based phenomena.

In science, data need to be reproducible, and hypotheses falsifiable—the scientific method works by systematically analyzing data with the intent to falsify a hypothesis.
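
A minimal sketch of that falsification logic, under an assumed known-cause model (here, an object drifting at constant velocity with the wind) and an assumed sensor noise level, might look like the following; every number is illustrative and none of it comes from the report.

    import numpy as np

    # Null hypothesis: the observed track is consistent with a known cause,
    # modeled here (as an assumption) as constant-velocity drift with the wind.
    rng = np.random.default_rng(0)

    t = np.arange(0.0, 30.0, 1.0)              # s, observation times
    sigma = 15.0                                # m, assumed sensor position noise (1-sigma)

    # Simulated "observations": a wind-drifting object plus sensor noise.
    wind_velocity = np.array([12.0, -4.0])      # m/s, assumed wind
    true_track = wind_velocity * t[:, None]
    observed = true_track + rng.normal(0.0, sigma, size=true_track.shape)

    # Fit the known-cause model (constant velocity from the origin) by least squares.
    v_fit, *_ = np.linalg.lstsq(t[:, None], observed, rcond=None)
    residuals = observed - t[:, None] @ v_fit

    # Compare the residuals with the assumed sensor noise.
    chi2 = float(np.sum((residuals / sigma) ** 2))
    dof = residuals.size - v_fit.size           # data points minus fitted parameters

    print(f"chi-square per degree of freedom: {chi2 / dof:.2f}")
    # A value near 1 means the known-cause model explains the data to within
    # sensor noise, so the null hypothesis is NOT rejected; a value far above 1
    # would be needed before calling the track anomalous.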

As a general principle, the data should support measurements that can rule out specific explanations or interpretations, leaving us with no choice but to accept the alternative. In the case of UAP, the hypothesis we seek to reject (or "null hypothesis") is that the UAP have phenomenology consistent with known natural or technological causes. Eyewitness reports should be considered along with corroborating sensor data in the study of UAP, as reports may reveal patterns (for example, clusters in time or location). Yet, without accompanying calibrated sensor data, no report can provide conclusive evidence on the nature of UAP or enable a study into the details of what was witnessed. While witnesses may be inherently credible, reports are not repeatable by others, and they do not allow a complete investigation into possible cognitive biases and errors (such as accuracy in perception, misperception caused by environmental factors, errors in the recording device, judgment or misjudgment of distance or speed, for

  1. https://www.nature.com/articles/sdata201618
  2. Dr. Sean Kirkpatrick's presentation to this committee, May 31, 2023

