12 May 2021
There’s a cosy quote, widely and, as it turns out, wrongly attributed to Mark Twain, which perfectly encapsulates my most common, regrettable, and easily avoidable errors in both flying and writing. “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” I think the saying, versions of which appeared long before the dawn of powered flight, applies especially broadly in aviation.
During a recent IFR flight to Caldwell, New Jersey, for example, the recorded voice on the automated terminal information service (ATIS) said the RNAV approach to Runway 22 was in use. I dutifully loaded that approach into the aeroplane’s GPS, checked in on the radio, and was told to proceed to a fix for an approach to Runway 28.
I searched in vain for the fix, and it wasn’t until a fellow pilot (who had happened to come along on this trip as a passenger) pointed out my error that the confusing situation was resolved. I had expected one approach, got cleared for another, yet the actual clearance didn’t register because I’d already written down and loaded the other one. I was hearing without listening.
“Our default position should be one of not knowing. When questions come up in flight, we ought to be inquisitive”
The error, fortunately, was resolved without permanent harm. But there are plenty of examples of aircraft running out of fuel, landing with the gear in the wrong position, and suffering other serious consequences as a result of things pilots knew, or assumptions they made, that turned out to be false.
Pilots have long sought to project an image of omniscience – and that makes knowing what ‘ain’t so’ all the more likely.
Our default position should be one of not knowing. When questions come up in flight, as they always do, we ought to be inquisitive. If you think you know the runway length, the latest wind, or the airport elevation, congratulations. Double-checking will provide reassuring confirmation.
The best pilots I know are curious. They’re interested in new information, in learning, and challenging their own assumptions – and, most of all, they accept and are even amused by their human fallibility. New information that differs from what they thought they knew, or even proves them wrong, isn’t rejected, explained away, or cause for angst. They welcome it as timely and accurate information that will enable them to make smart choices.
You know an avionics shortcut? Show it to me so I can learn it.
Flown to a strip before and know some tips? Tell me about them.
I was overflying an area of bad weather at night and the aeroplane’s radar reported the cloud tops were 6,000ft below my current altitude. That seemed like a safe margin at the time. But redirecting the radar to a close-in, level scan showed towering spires of stormy weather reached to our aircraft’s altitude and beyond. The reported cloud tops turned out to be a fiction that bore little resemblance to actual flight conditions.
A more sceptical reading of the weather picture, and a timely diversion away from the bad weather rather than over it, would have been a wise choice. Instead, I blundered through an area of violent weather that peeled the paint off the aeroplane’s radome.
I went to school to learn a ‘new-to-me’ aeroplane last year, and my biggest obstacle in training was my desire to impress my instructors with how thoroughly I had prepared and how much I already knew as a result. I had studied a great deal in advance, and I wanted to show that I took the coursework seriously and had done my homework. But relentlessly seeking to demonstrate my newfound knowledge was at odds with learning. A more secure pilot with an open mind would have asked questions, listened better, and absorbed the flood of new information more quickly. By trying to show how much I already knew, I missed opportunities to learn from subject-matter experts with many years of real-world experience – and that could have provided real learning.
It’s a mistake I don’t intend to repeat.
Doing the homework well in advance was fine. Obnoxiously blurting out answers to anticipated questions before the instructors even asked them was not.
The more experience pilots gain, the more we should question our own assumptions, double-check all the information we can, and acknowledge the things we don’t know.
I was so sure that the quote at the top of this column was said by Mark Twain that it didn’t occur to me to check it. Neither, apparently, did the makers of the movie The Big Short who used it, and attributed it to Twain, at the start of the movie. I only stumbled upon the mistake when searching for something unrelated – and I’m glad I found the mistake before repeating it.
Only by admitting what we don’t know can pilots avoid things that just ain’t so.
RV-4 pilot, ATP/CFII, specialising in tailwheel and aerobatic instruction
[email protected]