There’s a moment in the 1998 film “Ronin” where Robert De Niro’s character says to Jean Reno’s character, “Whenever there is any doubt, there is no doubt.”
In their world of espionage, the ability to recognize true, inner self-doubt and then act on it could be the difference between life and death.
More than one flight instructor I learned from put it another way: “Trust your gut.”
It’s not just about trusting that you’re right. It’s about trusting your sense that something’s wrong.
Judging by the following reports to NASA’s Aviation Safety Reporting System (ASRS), many pilots have trouble acting on their inner self-doubt.
A GA pilot filed a NASA report after self-doubt propelled him to change course without contacting ATC. He filed an IFR flight plan from Chesapeake Regional Airport (KCPK) to his destination. His departure clearance included climbing to 3,000′ on a 230° heading. After takeoff, he contacted Norfolk Approach. Approach told him to climb and maintain 3,000′. ATC also gave him the Norfolk altimeter setting, but nothing else.
“After flying on the 230 heading for a minute, I began to question in my mind as to whether the controller had forgotten about me flying off on a heading that was about 90° off from my clearance route to Hopewell VOR (HPW),” he wrote.
He then convinced himself the controller must have expected him to turn toward the next fix (HPW) on his own, which he did, without first querying ATC.
ATC immediately contacted the pilot and directed him back to the 230 heading.
“Of course I should have stayed on the 230 heading until given radar vectors or further clearance,” he wrote. “I pride myself on being a careful and exacting pilot, yet this was plain and simple a mistake.”
Is it possible his mistake came from too much pride?
“A simple call to ATC would have been in order to clarify any uncertainty on my part,” he concluded.
Why did he ignore his self-doubt, instead acting assertively and incorrectly?
A military-trained helicopter pilot filed a NASA report after violating Bob Hope Airport (KBUR) Class C airspace with the full certainty that he was in the right.
The mission was to conduct a low-level flight from the Queen Mary in Long Beach Harbor up the Los Angeles River, past Dodger Stadium, over the Hollywood Hills, west along the Ventura Freeway, south along the 405 Freeway, and then back west to the Santa Monica Pier before finally flying south out over the ocean and along the coast back to base at Los Alamitos Army Airfield (KSLI).
Dozens of aircraft of varying sizes, categories, and speeds transit this small patch of Southern California airspace at any given moment, all day and night. That route of flight had this pilot in hailing range of eight Class D airports, inside a Class C, and entirely enveloped by the Class B airspace of Los Angeles International (KLAX). His route of flight put him in a region where the combination of topography, population density, and building height differentials demands constant attention below 500′ AGL.
And yet, on a CAVU VFR day, he lost situational awareness.
“As the non-flying pilot, I was responsible for navigating the route while my junior co-pilot flew the visual checkpoints,” he wrote.
After transiting Long Beach Airport’s (KLGB) airspace north along the LA River, LAX Tower released him. He dialed in the wrong frequency for that sector and made a call in the blind on 120.95. He and his co-pilot flew the river up to where he thought Dodger Stadium would be.
“I was sitting left seat, and was not able to visually identify Dodger Stadium,” he wrote. “I instructed my co-pilot to continue north along I-5 where we were expecting to see the stadium. I let this task saturate me and lost situational awareness.”
Instead of fessing up, orbiting, or calling any available nearby Tower frequency, the pilot trusted his GPS — even though his GPS had just failed to help him identify Dodger Stadium.
The crew then flew west along the Ventura Freeway into KBUR Class C airspace, and the navigating pilot failed to switch to KBUR’s assigned Tower frequency. Instead he remained on 120.95. Despite the GPS routing on his mission display, he did not realize he’d allowed the flying pilot to transit KBUR airspace without permission.
They flew on to their next VFR checkpoint, where the Hollywood Hills, the I-405, and the Ventura Freeway meet. The helicopter had flown incognito inside KBUR airspace for seven miles. They finally left KBUR airspace and entered Santa Monica’s (KSMO) Class D, where the KSMO Tower contacted them on 120.95, the KLAX frequency. ATC demanded that the military helicopter identify itself.
“I immediately responded and was informed of the airspace violation. Due to my loss of situational awareness, overconfidence in GPS systems, and failure to ensure my flying pilot vocalized and identified visual checkpoints along the route, we were completely in inadvertent disoriented flight at this time.”
The pilot admitted that he was so overconfident in the GPS system that he believed it would make up for any holes in the briefing he gave before the flight.
In the beginning of his report, he wrote that he’d given a thorough preflight briefing based on preflight planning. But in his conclusion, he admitted that his preflight planning had been weak in specific areas. Those areas included failure to list and input radio frequencies he’d need to use during each segment of the flight; failure to have a contingency plan if unable to identify the mission’s checkpoints; and belief that ATC would contact him should their flight create an airspace incursion or a near midair collision (NMAC).
This pilot prided himself on the fact that military aviators are extremely well trained. Why was his preflight planning so flawed? Why did he allow, in his own words, “our aircraft to deviate from the assigned flight path and jeopardize the safety of the FAA and airspaces in the area?”
Why did both the GA pilot and the military pilot fall prey to the same mistake?
Two possible reasons: “Actor-observer bias” and “confirmation bias.”
When people judge their own behavior, and they are the actor, they are more likely to attribute their actions to the specific situation than to their personality. Yet when an observer is explaining the behavior of another person (the actor), they are more likely to attribute this behavior to the actor’s overall disposition rather than to situational factors. That’s actor-observer bias.
Neuroscientists and sociologists all agree that human beings possess a limited ability to process the terabytes per minute of information coming at us. As such, our brains have evolved an efficient filtering mechanism called confirmation bias. It’s the human mind’s tendency to notice and pay more attention to objects and experiences that match its preexisting thoughts and beliefs.
The GA pilot prided himself “on being a careful and exacting pilot.” It didn’t cross his mind during the event that his impatience was the problem. Because he didn’t see his own impatience as a vice, he felt justified in acting on it, initiating a turn without direction from ATC. Actor-observer bias. He even alluded to this in his report when he admitted that his action was foolhardy, and he’d never abide it in another pilot.
Only later did he realize that his impatience had driven him to a bad decision. Because he was impatient to get on course, he interpreted ATC’s lengthy silence in a way that confirmed the soundness of his decision to deviate from the flight plan. He chose to think they’d forgotten about him. Confirmation bias.
The military helicopter pilot wrote that military aviators are extremely well trained. Being a military aviator, he therefore must be an extremely well-trained pilot. Therefore his preflight planning must be spot on. Confirmation bias.
Later, in his debrief and NASA report, the same pilot chided himself for his inability to accept his own doubts about identifying Dodger Stadium, about communicating with ATC, about his own confidence in the onboard GPS.
“If there is a question, then there is no question,” he wrote in his conclusion. “This means if there is any grain of doubt in airspace restrictions, clearances, or operations that I speak up and clarify in order to regain situational awareness and confidence to safely transit the route.”
Michael Jordan famously said, “I am always wrong about everything, and that’s why my life improves.” In other words, Jordan learned that a little self-doubt was a good thing.
I’m not sure I’m willing to admit I’m wrong about everything in order to improve my life. But I do agree that part of being a great pilot is admitting you’re wrong and seeking to learn what you did incorrectly.
As pilots, we may not be Ronin, but our ability to trust our inner doubt and communicate it to others can be the difference between an NMAC and an uneventful flight.