Data from NASA’s National Aviation Operations Monitoring Service (NAOMS) project, a survey administered to pilots to track safety-related incidents, should not be used to measure rates or trends in safety in the National Airspace System, says a new report from the National Research Council. Deficiencies in several aspects of the survey design make the data gathered unreliable, according to council officials.
The NAOMS survey — jointly managed by NASA and the Battelle Memorial Institute and given to more than 29,000 pilots from April 2001 through December 2004 — was developed as a means to statistically track rates of safety-related incidents and detect trends in those rates over time. Pilots answered questions regarding numbers of hours and flights flown, and numbers of incidents observed. Although the survey employed a number of generally accepted practices — such as computer-assisted telephone interviews and professionally trained interviewers — several flaws in the design and implementation of the study affected the usefulness of the data that were gathered, council officials assert.
According to the committee that wrote the report, many of the survey questions had deficiencies in structure and wording — including complex structure, multiple-part questions, and vague phrases — making them potentially difficult for respondents to digest during a telephone interview. These problems may have led to varying interpretations and judgments on the part of the pilots, reducing the survey's effectiveness at accurately measuring safety-related incidents.
Substantial fractions of the reported non-zero event counts, flight legs, and hours flown were implausibly large, the committee said. Furthermore, many respondents appear to have rounded off event counts or reported flight hours to convenient numbers (e.g., numbers ending in 0 or 5), raising serious concerns regarding the accuracy and reliability of the data. Some of these problems could have been prevented by spending more time ensuring accuracy during the interview and data-entry stages, the report says.
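The kind of rounding the committee describes — responses clustering on values ending in 0 or 5, often called "heaping" — can be screened for with a simple check. The sketch below is purely illustrative and is not part of the NAOMS analysis; the function name and sample data are hypothetical.

```python
# Illustrative sketch (not NAOMS code): estimate the share of positive
# integer reports whose last digit is 0 or 5. By chance alone, roughly
# 20% of values would end in those digits; a much larger share suggests
# respondents are rounding to convenient numbers.
def heaping_share(values):
    """Fraction of positive reports ending in 0 or 5."""
    positives = [v for v in values if v > 0]
    if not positives:
        return 0.0
    heaped = sum(1 for v in positives if v % 10 in (0, 5))
    return heaped / len(positives)

# Hypothetical flight-hour reports from ten respondents.
reports = [120, 85, 97, 100, 45, 62, 250, 33, 75, 90]
print(heaping_share(reports))  # 0.7 — well above the ~0.2 expected by chance
```

A screen like this only flags the symptom; as the report notes, preventing it requires more care at the interview and data-entry stages.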
Several choices made by the NAOMS project team introduced potential biases into the survey's results. The survey was restricted to pilots with specific flight certifications, which led to an over-representation of pilots of wide-body aircraft and an under-representation of those flying small aircraft. Pilots were also able to opt out of the database used to contact potential respondents, and those with commercial licenses opted out at a much higher rate than others. If those who opted out had witnessed or experienced very different rates of safety events than those who took the survey, the data would be substantially biased, the report noted, adding that the NAOMS team should have investigated and identified potential biases early on to develop alternative survey-design strategies.
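The non-response concern above follows from simple arithmetic: a survey observes only its respondents, so whenever those who opt out experience a different event rate, the estimate drifts from the true population rate in proportion to the opt-out fraction. The sketch below illustrates this with entirely hypothetical rates and fractions.

```python
# Hypothetical illustration of non-response bias. The survey's observed
# rate is just the respondents' rate; the true population rate is a
# weighted average over respondents and those who opted out.
def observed_vs_true(rate_resp, rate_optout, optout_frac):
    """Return (rate the survey observes, true population rate)."""
    observed = rate_resp  # the survey never sees opt-outs
    true = (1 - optout_frac) * rate_resp + optout_frac * rate_optout
    return observed, true

# Hypothetical: respondents see 2.0 events per 1,000 flight hours,
# opt-outs see 6.0, and half the eligible pilots opted out.
obs, true = observed_vs_true(rate_resp=2.0, rate_optout=6.0, optout_frac=0.5)
print(obs, true)  # 2.0 4.0 — the survey understates the true rate by half
```

This is why the report faults the NAOMS team for not investigating opt-out behavior early: without knowing whether opt-outs differed, the size and even the direction of the bias are unknowable.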
In addition, the survey grouped highly disparate segments of the aviation industry together in order to calculate an overall rate of safety incidents. As a result, there is no way to link the rate of events to aircraft type or operating environment, which hinders any meaningful analysis of the NAOMS data.
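The pooling problem can be shown with hypothetical numbers: once counts from disparate segments are combined, the single overall rate conceals segment-specific rates that may differ by an order of magnitude, and those rates cannot be recovered from the pooled figure. The segment names and counts below are invented for illustration.

```python
# Hypothetical segments with very different exposure and event rates.
segments = {
    "large air carrier": {"events": 30, "hours": 100_000},
    "small general aviation": {"events": 90, "hours": 30_000},
}

# Pooled overall rate per 1,000 flight hours.
total_events = sum(s["events"] for s in segments.values())
total_hours = sum(s["hours"] for s in segments.values())
overall = total_events / total_hours * 1000
print(round(overall, 2))  # 0.92 events per 1,000 hours

# Segment-specific rates the pooled figure hides: 0.3 vs 3.0.
for name, s in segments.items():
    print(name, round(s["events"] / s["hours"] * 1000, 2))
```

Linking events to aircraft type or operating environment would require keeping the segments separate from the start, which is why the report treats the pooled design as a barrier to meaningful analysis.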
Issues such as these should have been anticipated and addressed in the design stage of the NAOMS project, the report says. However, the committee found no evidence that the NAOMS team developed or documented their plans for data analysis, nor did they conduct preliminary analyses as data initially became available in order to identify problems early on and refine their survey methodology.
The study was sponsored by NASA. The National Academy of Sciences, National Academy of Engineering, Institute of Medicine, and National Research Council are private, nonprofit institutions that provide science, technology, and health policy advice under a congressional charter. The National Research Council is the principal operating agency of the National Academy of Sciences and the National Academy of Engineering.
For more information: National-Academies.org