Why pilots think about safety differently from everyone else

Aviation doesn’t treat safety as “who’s to blame” but as a whole system of people, machines, weather and culture. As a new pilot, you’re already part of that system — and you can quietly redesign it to make every flight more forgiving.

A bad day at LAX

On a clear night in 1991, USAir 1493 touched down at Los Angeles after an uneventful flight. Seconds later, the 737 slammed into a small commuter plane sitting on the same runway, crushing it, skidding off the pavement, and erupting in fire. Thirty‑five people died.

In the tower, local controller Robin Wascher realised, with horror, that she had cleared the 737 to land on an occupied runway. She had told SkyWest 5569 to “taxi into position and hold” at an intersection, then became busy with other traffic. Amid frequency mix‑ups, a crossing clearance, and a departure on the parallel runway, she simply forgot that SkyWest was still there. (Read the NTSB's report here.)

It was an honest mistake with catastrophic consequences. The public wanted to know: would she be punished? Lose her job? Go to jail?

She wasn’t punished. She was never charged. She cooperated fully with investigators, told the truth about her error, and later chose not to return to controlling. And from aviation’s point of view, that is usually how it should be.

💡
For a new pilot, this is one of the most important and surprising things about aviation: it does not treat every honest mistake as a crime. It treats it as data.

How aviation frames safety

In most fields, when something goes wrong, people look for someone to blame. Authorities promise to “find those responsible” and “hold them to account.” The focus is who.

Aviation learned the hard way that this is a dead end. If you remove one controller or one pilot but keep the same system, you have not made anything safer. Someone else will eventually make the same mistake.

So aviation asks a different question: why did this happen in this system, on this day, with this ordinary human in the loop?

At LAX that night, the National Transportation Safety Board (NTSB) found that:

  • the ground radar at Wascher’s station was inoperative
  • her view of the intersection where SkyWest waited was blocked by a new terminal
  • another controller failed to pass on information, forcing her to fix it herself
  • procedure allowed “taxi into position and hold” at night on a busy runway
  • ground conflict alerts did not exist
  • the SkyWest crew had most exterior lights off while waiting, as per their normal practice

Wascher’s slip was real, but it was just one weak link in a badly designed chain. If 35 people can die because one person has to be perfect for hours, the fundamental problem is the system, not the individual.

That is the lens aviation uses on you as a pilot, too.

The layers behind every flight

Every flight you make sits on several layers.

  • You: skills, fatigue, recency, mindset, confidence.
  • Team: instructors, other pilots, ATC, maintenance, dispatch.
  • Technology: aircraft design, avionics, alerts, checklists, “known quirks”.
  • Organisation: school or club policies, booking pressures, attitudes to cancellations, go‑arounds and fuel.
  • Environment: weather, terrain, traffic, airspace complexity, local conditions.

Before and after a flight, scan across these layers and ask which ones made things safer, and which ones eroded your margin.

James Reason’s Swiss cheese model is one way to picture it. Each layer is a slice of cheese, and each slice has holes: today you arrived late and rushed the pre‑flight; the aircraft is just out of maintenance; the weather is marginal; the club is quietly pushing to “get it done”.

An accident happens when those holes line up. Instead of asking “Can I technically do this flight?”, ask:

  • How many slices have holes today?
  • Are they pointing in the same direction?
  • What can I change — slow down, add fuel, alter the route, raise my minima, delay or cancel?

The win is not a clever explanation afterwards. It is catching the lineup before it forms.
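
If it helps to see why layered defences work, here is a minimal sketch in Python. The probabilities are invented purely for illustration, and real risk is not a tidy product of independent numbers, but the arithmetic shows why widening or shrinking the hole in any one slice changes the whole picture:

```python
# Toy model of the Swiss cheese idea. Every probability here is
# invented for illustration; real flights are not this tidy. Each
# number is the chance that a defensive layer fails to catch a
# problem today.

from math import prod

layers = {
    "you (rushed pre-flight)": 0.10,
    "aircraft (just out of maintenance)": 0.05,
    "environment (marginal weather)": 0.20,
    "organisation (pressure to 'get it done')": 0.30,
}

# If the holes are independent, an accident needs every layer to
# fail at once, so the chances multiply.
print(f"holes line up: {prod(layers.values()):.5f}")   # 0.00030

# Improving any single layer shrinks the whole product.
# Here: delay the flight and remove the time pressure.
layers["organisation (pressure to 'get it done')"] = 0.01
print(f"after delaying: {prod(layers.values()):.5f}")  # 0.00001
```

The point of the toy model: you rarely control every layer, but plugging the hole in any one of them collapses the chance of a full lineup.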

Why aviation runs blameless investigations

After decades of crashes, aviation converged on a simple principle, now baked into Annex 13 of the Chicago Convention: the sole objective of an accident investigation is to prevent future accidents and incidents, not to apportion blame or liability.

That is why the NTSB states up front that its work is “fact‑finding… not conducted for the purpose of determining the rights or liabilities of any person.”

💡
When liability is off the table, people tell the truth. Investigators can ask: given normal human fallibility, how do we design a system where this error is hard to make and easy to catch?

The LAX crash led to concrete changes:

  • more reliable and widespread ground radar
  • automated ground collision alerting
  • tighter rules around holding on the runway in low visibility
  • better lighting and visibility of intersections

None of this would have happened if the story stopped at “one bad controller.”

Just culture and you

This approach is often called just culture. The idea:

  • Honest mistakes, made in good faith, are treated as learning material, not crimes.
  • People are encouraged to report their own errors so the system can see its weak points.
  • Discipline is reserved for reckless, intentional or repeat violations, not normal human slips.

Without a just culture, pilots and controllers hide their mistakes. Organisations only discover vulnerabilities when something catastrophic happens. With a just culture, near‑miss reports, anonymous safety forms, and frank debriefs surface problems early.

For you as a new pilot, this has two implications.

First, you are expected to be honest about your own errors. A botched fuel calculation, a blown radio call, a moment of confusion in the circuit — these are chances for the system to learn, not shame to bury.

Second, when you hear stories of incidents, listen for the system, not just the person. What in the training, procedures, tools, or culture made that error more likely or harder to catch?

Managing your mental bandwidth

Aviation is also unusual in how deliberately it manages cognitive load. Your mental bandwidth is limited. The system is built around that fact.

Takeoff, climb, arrival, circuit and low‑level manoeuvring are treated as protected phases. You are expected to:

  • strip away non‑essential tasks
  • keep the configuration simple
  • prioritise “aviate, navigate, communicate” in that order

Alerts and information are not just isolated beeps; they are part of a flow you manage. In busy moments, it is acceptable — even smart — to consciously ignore minor advisory messages and return to them when workload drops.

The path of the aircraft and separation from other traffic come first. Everything else is decoration until you have capacity.

Learning from each flight as a system

Because aviation treats safety as a system, it also treats learning that way.

A good debrief does not only list what you did right or wrong. It asks what the system taught you today.

  • What surprised you — weather, traffic, ATC, your own performance?
  • Where did you feel rushed or confused, and what combined to create that overload?
  • Are there “normal” shortcuts everyone seems to take that quietly eat into safety margins?

From each flight, pick one system change:

  • a new personal rule
  • a tweak to your checklist layout
  • a clearer passenger briefing
  • a suggestion to your club about fuel, bookings or circuit procedures

This is your small‑scale version of what the NTSB does after a crash.

Designing your own safety tools

You are constantly designing your own safety tools, whether you realise it or not. Your kneeboard, flows, briefings, EFB layout and personal minima are all design choices.

Think like a designer:

  • Under pressure, which checklist items do you skip?
  • Which calls get rushed?
  • When do you or your friends look lost?

Adjust the design, not just your willpower. Simplify layouts, highlight what is critical, write short standard briefings you can actually say, and test them in the sim, in chair‑flying, or on dual flights.

If you still stumble, that is feedback on the design. Change it again.
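
One way to adjust the design is to write your personal minima down as explicit rules rather than intentions. The sketch below is illustrative only: the numbers are placeholders, not recommendations, and your actual minima should come from your instructor and your own experience.

```python
# A personal-minima check written down as data instead of memory.
# The numbers below are placeholders for illustration; set your own
# with your instructor. The point is that the rule is decided calmly
# on the ground, not re-negotiated in your head on a gusty morning.

PERSONAL_MINIMA = {
    "max_crosswind_kt": 10,
    "min_runway_m": 800,
    "min_fuel_reserve_min": 45,
    "max_days_since_last_flight": 30,
}

def go_no_go(today: dict) -> list[str]:
    """Return the list of busted minima; an empty list means go."""
    busts = []
    if today["crosswind_kt"] > PERSONAL_MINIMA["max_crosswind_kt"]:
        busts.append("crosswind above personal limit")
    if today["runway_m"] < PERSONAL_MINIMA["min_runway_m"]:
        busts.append("runway shorter than personal limit")
    if today["fuel_reserve_min"] < PERSONAL_MINIMA["min_fuel_reserve_min"]:
        busts.append("fuel reserve below personal limit")
    if today["days_since_last_flight"] > PERSONAL_MINIMA["max_days_since_last_flight"]:
        busts.append("not current enough")
    return busts

# Example: a gusty day after a long break.
print(go_no_go({
    "crosswind_kt": 12,
    "runway_m": 1000,
    "fuel_reserve_min": 60,
    "days_since_last_flight": 45,
}))
# ['crosswind above personal limit', 'not current enough']
```

Written down like this, a busted minimum is a design output, not a debate. You can still override it, but now the override is a conscious decision you can debrief later.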

Why this mindset makes aviation unique

Today, airline flying is extraordinarily safe. In the early 1970s, about one in 200,000 airline passengers worldwide did not reach their destination alive. By 2022, that number was around one in 17 million, roughly an 85‑fold improvement. In the United States, there has not been a fatal crash of a scheduled passenger airline in over a decade.

That record is not an accident. It comes from treating human error as inevitable, and designing systems that are forgiving of it. The cause of a tragedy like the Los Angeles runway collision was not “one bad controller.” It was an unforgiving system that required average humans to act with inhuman consistency.

As a new pilot, you are already part of this culture. Your job is not to be perfect. Your job is to fly safely inside a system that expects you to be human — and to quietly improve that system, one honest report, one better checklist, one thoughtful debrief at a time.
