Planes Still Fall.

— We’ve spent decades solving technical problems. But what about the human ones?

Scroll Down

For decades, the blare of a last-second cockpit warning was the sound of a pilot's worst nightmare: a mountain, invisible in the clouds, with only seconds left before impact. We called it “Controlled Flight Into Terrain,” or CFIT — a grimly ironic term for a perfectly good airplane flown by a capable crew straight into the ground. So we built a machine to see the mountains for us. And it worked. The number of CFIT accidents plummeted. We had, in effect, solved the mountain problem. But planes kept falling from the sky.

This story is about the paradox of progress in aviation safety. Over the past four decades, groundbreaking technologies like the Ground Proximity Warning System (GPWS) have virtually eliminated the kind of catastrophic accidents that once dominated headlines. Yet, fatal crashes persist, driven not by a lack of technology, but by its complex interaction with human psychology, training, and decision-making. By analyzing the data from over 40 years of commercial air disasters, we can trace a clear shift in why we crash: from technological blindness to the subtle, but just as deadly, blind spots of the human mind. It’s a story that reveals how our greatest strengths—ingenuity and automation—can create new, unforeseen vulnerabilities.

This A321 is not an isolated case.

In the 1980s and early 1990s, CFIT was a dominant and brutal cause of aviation fatalities. Pilots flying in poor weather or at night over unfamiliar terrain were often unaware of their proximity to the ground until it was too late.

Scroll to explore the full scope of global air incidents from 1980 to 2025. The dataset covers operational, technical, weather, and human-factor causes, while excluding terrorism, bombings, hijackings, and missile shoot-downs.

This visualization shows global commercial airliner accidents (1980–2025) involving 10 or more fatalities.

Data compiled and cleaned from Aviation Safety Network (ASN).
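For readers who want to reproduce that filtering, here is a minimal sketch of the selection criteria in Python (the file name and column names are hypothetical placeholders, not ASN's actual schema):

```python
import pandas as pd

# Hypothetical file and column names; ASN's real export uses its own schema.
df = pd.read_csv("asn_accidents.csv", parse_dates=["date"])

DELIBERATE_ACTS = {"terrorism", "bombing", "hijacking", "shootdown"}

subset = df[
    df["date"].dt.year.between(1980, 2025)                  # 1980–2025 only
    & (df["fatalities"] >= 10)                               # 10 or more fatalities
    & ~df["category"].str.lower().isin(DELIBERATE_ACTS)      # exclude deliberate acts
]
```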

Explore the World

Go ahead, drag and spin the globe!

1

The Golden Fix: How a New Warning System Changed the Game

So we built a machine to see the mountains for us. This was the Enhanced Ground Proximity Warning System (EGPWS). It was a revolutionary upgrade of its predecessor, the original GPWS, which was developed in the early 1970s (by engineers at what is now Honeywell) and mandated by the FAA in 1974.

This new EGPWS, introduced in 1996, was fundamentally different. Using a GPS signal and a detailed global terrain database, it could "see" the terrain ahead, not just below. This allowed it to provide clear, timely warnings, finally giving pilots the one thing they were missing: foresight. This advanced system (also known as TAWS, or Terrain Awareness and Warning System) became mandatory in the U.S. for most turbine aircraft by 2005.
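To make that foresight concrete, here is a minimal sketch of the look-ahead principle (an illustration only, not Honeywell's actual algorithm, with made-up parameter values): project the flight path a minute or so ahead and check each projected point against a terrain database.

```python
import math

def lookahead_terrain_alert(
    lat, lon, alt_ft,             # current GPS position and altitude (feet)
    track_deg, groundspeed_kt,    # direction and speed over the ground
    vertical_speed_fpm,           # current climb/descent rate (feet per minute)
    terrain_elevation_ft,         # callable: (lat, lon) -> terrain elevation in feet
    lookahead_s=60,               # how far ahead to project the flight path (seconds)
    clearance_ft=300,             # required margin above the terrain
    step_s=5,
):
    """Project the flight path ahead and flag any point that would come within
    `clearance_ft` of the terrain database. Returns (alert, seconds_to_conflict)."""
    for t in range(step_s, lookahead_s + 1, step_s):
        # Position after t seconds along the current track (flat-earth approximation)
        dist_deg = (groundspeed_kt * t / 3600.0) / 60.0   # nautical miles -> degrees of arc
        future_lat = lat + dist_deg * math.cos(math.radians(track_deg))
        future_lon = lon + dist_deg * math.sin(math.radians(track_deg)) / math.cos(math.radians(lat))

        # Altitude after t seconds, assuming the current vertical speed continues
        future_alt = alt_ft + vertical_speed_fpm * t / 60.0

        if future_alt < terrain_elevation_ft(future_lat, future_lon) + clearance_ft:
            return True, t      # conflict predicted t seconds ahead
    return False, None


# Toy terrain "database": a ridge rising to 6,000 ft north of 34.0 degrees latitude
ridge = lambda lat, lon: 6000.0 if lat > 34.0 else 500.0

# Level flight at 4,500 ft, heading straight for the ridge at 250 knots
print(lookahead_terrain_alert(33.95, 73.0, 4500, track_deg=0,
                              groundspeed_kt=250, vertical_speed_fpm=0,
                              terrain_elevation_ft=ridge))
# -> (True, 45): a caution roughly 45 seconds out, long before the ridge is visible
```

Real systems layer many more alert modes and smarter path prediction on top of this idea, but the core point survives the simplification: the warning now arrives early, and everything depends on what the crew does with it.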

CFIT Crashes 1980–2025

Controlled Flight Into Terrain (CFIT) crashes fell sharply through the 1990s after the introduction of Ground Proximity Warning Systems.

The solution was technological, and its impact was undeniable. The number of CFIT accidents plummeted. We had, in effect, solved the mountain problem. Advanced warning systems like EGPWS were supposed to make Controlled Flight Into Terrain a thing of the past.

And yet… the mountains still win. Why?

2

Why Crashes into Terrain Didn’t Disappear

The Human Problem: When the Voice Is Ignored

The arrival of the EGPWS caused accident rates to plummet. Yet, this technological fix concealed a deeper vulnerability. According to a landmark report by the International Air Transport Association on crashes from 2005–2014, the few CFIT accidents that still occurred were incredibly lethal—88% of them resulted in fatalities. More critically, IATA found "no evidence" that the warning systems on these aircraft "malfunctioned" or failed to function as designed. The problem, the report concluded, was "poor pilot response." In the 2010 Airblue crash, this meant ignoring 21 warnings in the final 70 seconds.

Why would highly trained pilots ignore a voice designed to save their lives? According to Scott Shappell, professor of Human Factors and Behavioral Neurobiology at Embry-Riddle Aeronautical University, it’s not defiance—it’s psychology. He explains that pilots can suffer from "warning fatigue," where repeated alerts "all blend together, and we tend to ignore them." In moments of intense stress, the human brain enters a state of cognitive tunneling or fixation, where the brain’s limited resources are channeled into what it perceives as the most critical task, causing pilots to literally not hear or process other vital information.

Shappell offered a powerful everyday analogy to explain this high-stakes phenomenon. He described his mind wandering to a missed work deadline while driving his family on the weekend. "All of a sudden I think, 'Jeez, I forgot to get that report in on Friday.' Now all I'm thinking about is work," he recalled. He continued, "It doesn't matter what my wife is saying to me, it doesn't matter that the kids are singing. I'm not hearing anything." He was so fixated, he found himself "driving down the same road I drive every day to go to work," instead of turning towards the mall. This is cognitive tunneling—a state where crucial inputs (a wife’s voice, a cockpit warning) become invisible.

This psychological blind spot is exactly what aviation tried to solve with Crew Resource Management. As Shappell explains, this structure is designed to "back each other up, so while one person may be distracted, hopefully the other one won't be." But the psychological problem is often compounded by a cultural one: authority. Shappell notes that in a U.S. cockpit, copilots are trained to challenge a captain if safety is at risk. But in some Asian cultures, hierarchy makes that much harder—respect can override reaction. "Challenging your elders is very difficult," he says. He cites the 1997 Korean Air crash in Guam (KAL 801) as an example: the crew corrected their tired captain's mistake once, but then fell silent out of respect. "They're quiet as church mice," Shappell recalled, "and as a result, they end up flying right into... Nimitz Hill."

From the wiring of the brain to the structure of the cockpit, the problem is undeniably human. "Human error is just normal," Shappell stated. "It's normal human behavior. We all make mistakes, and we'll continue to make mistakes the rest of our lives."

But this 'human problem'—our subtle, deadly blind spots—isn't limited to the cockpit, and it doesn't only lead to CFIT. When this same human error occurs on the ground—in the design blueprint or the maintenance hangar—it manifests in an older, equally fatal way: Mechanical Failure.

3

Mechanical Failures: From Neglected Bolts to Hidden Code

Mechanical Failure Crashes 1980–2025

As aviation evolved, many old mechanical errors were engineered out—only for new ones to emerge in the form of human decisions embedded in software and automation.

Mechanical failure, too, represents an evolution of the human problem. In the past, the error was physical and tangible, as in the 2000 crash of Alaska Airlines Flight 261, caused by the improper lubrication of a critical jackscrew—a human oversight you could touch. Today, the error is digital and invisible, as revealed by the Boeing 737 MAX crashes: the "failure" here was not a loose bolt but flawed lines of software code (MCAS), rooted in flawed assumptions by its designers and flawed oversight by its regulators. This is the next phase of the human problem: today's complex automation systems are creating new, more subtle kinds of human error.

Shappell warns that we have begun to over-trust these complex systems, a reliance he calls "automation trust." "Automation is not a panacea," he stated. "Automation is not perfect." He compares this dangerous reliance to driving: "My truck has adaptive cruise control. If I rent a car, I put the cruise control on and assume that it has adaptive cruise control. But if it doesn't, I run right into the back of the car in front of me when he slows down." Just as a digital clock, with its AM/PM settings, can be easier to set incorrectly than an old analog clock, Shappell concludes, "the automation itself can set you up for failure, and the mistakes can be drastic."

"Every beep and warning light in the cockpit is built from someone’s last flight. But the hardest system to upgrade is still the human mind."

So, if human error is inevitable, what is the path forward? On the one hand, the aviation industry's regulatory system is incredibly mature. Shappell describes it as a powerful Big Brother system that is "heavily monitored and heavily regulated universally." This mature system stops intentional violations — a pilot cannot "decide, 'You know what, I'm going to do a little barrel roll,' and get away with it" — but it still struggles to prevent unintentional errors born from cognitive tunneling or automation trust. The real solution lies in designing our understanding of human psychology into the machines themselves.

Shappell stressed the key is to "understand how people make decisions." He offered a simple example: if you ask your kids, "'Hey, where do you want to go to dinner tonight?' and just leave it open, they'll argue every time." But if you limit the choice to, "'Hey, do you want to go to A or B?', then it's easy." "You want to understand how people make choices," he said. "That's what drives design."

This is the true challenge ahead. It is not about a distant future, but a very specific human-factors problem: the "heavy pressure in the industry to reduce the crew size to one person." As we replace the second pilot with more complex automation, we must design systems that truly understand how humans think, how they fixate, and how they (mis)trust. As Shappell concluded: "As long as there are humans in the loop, there will always be error. And so the question is not, 'how do I prevent the error,' it's, 'how do I manage that error and mitigate the consequences?'" It is a more profound challenge, but it points the way forward: We must stop expecting humans to simply adapt to machines and start designing machines that are adapted to us.
