The arrival of the EGPWS caused accident rates to plummet. Yet this technological fix concealed a deeper vulnerability. According to a landmark report by the International Air Transport Association on crashes from 2005–2014, the few CFIT accidents that still occurred were incredibly lethal: 88% of them resulted in fatalities. More critically, IATA found "no evidence" that the warning systems on these aircraft "malfunctioned" or failed to function as designed. The problem, the report concluded, was "poor pilot response." In the 2010 Airblue crash, that meant ignoring 21 warnings in the final 70 seconds.
Why would highly trained pilots ignore a voice designed to save their lives? According to Scott Shappell, professor of Human Factors and Behavioral Neurobiology at Embry-Riddle Aeronautical University, it’s not defiance; it’s psychology. He explains that pilots can suffer from "warning fatigue," where repeated alerts "all blend together, and we tend to ignore them." In moments of intense stress, the brain enters a state of cognitive tunneling, or fixation, in which its limited resources are channeled into what it perceives as the most critical task, causing pilots to literally not hear or process other vital information.
Shappell offered a powerful everyday analogy to explain this high-stakes phenomenon. He described his mind wandering to a missed work deadline while driving his family on the weekend. "All of a sudden I think, 'Jeez, I forgot to get that report in on Friday.' Now all I'm thinking about is work," he recalled. He continued, "It doesn't matter what my wife is saying to me, it doesn't matter that the kids are singing. I'm not hearing anything." He was so fixated that he found himself "driving down the same road I drive every day to go to work" instead of turning toward the mall. This is cognitive tunneling: a state in which crucial inputs, whether a wife's voice or a cockpit warning, simply stop registering.
This psychological blind spot is exactly what aviation tried to solve with Crew Resource Management. As Shappell explains, this structure is designed to "back each other up, so while one person may be distracted, hopefully the other one won't be." But this safeguard can be undermined by a cultural factor: authority. Shappell notes that in a U.S. cockpit, copilots are trained to challenge a captain if safety is at risk. In some Asian cultures, however, hierarchy makes that much harder; respect can override reaction. "Challenging your elders is very difficult," he says. He cites the 1997 Korean Air crash in Guam (KAL 801) as an example: the crew corrected their tired captain's mistake once, but then fell silent out of respect. "They're quiet as church mice," Shappell recalled, "and as a result, they end up flying right into... Nimitz Hill."
From the wiring of the brain to the structure of the cockpit, the problem is undeniably human. "Human error is just normal," Shappell stated. "It's normal human behavior. We all make mistakes, and we'll continue to make mistakes the rest of our lives."
But this 'human problem'—our subtle, deadly blind spots—isn't limited to the cockpit, and it doesn't only lead to CFIT. When this same human error occurs on the ground—in the design blueprint or the maintenance hangar—it manifests in an older, equally fatal way: Mechanical Failure.