
Avionics System Design: Saving the Pilot: Part I

By Walter Shawlee 2 | October 1, 2002

Recently I had to investigate a small plane crash, and it struck me how easy it is for things to go wrong in the cockpit. The pilot can be certain he is doing the right thing up to the moment of impact. Coupled with recent security issues related to the 9/11 hijackings, it therefore seems topical to examine the issues that affect the pilot’s safety, and how we can improve his or her survival odds from a design viewpoint.

There are three major areas of concern:

  • The operation of systems, in terms of their expected and unintended behavior;

  • Recovery from abnormal operation; and

  • Security in flight, an issue highlighted by events like 9/11.

I plan to cover the three areas in three columns, starting with this one.

Systems operation is a critical concern and one in which pilots have good reason to feel the manufacturers have let them down. The most pervasive problem is that pilots often cannot understand how most avionics systems operate without resorting to extensive documentation, which is usually unavailable in flight. Worse, in more advanced systems, pilots easily can set up an unexpected mode of operation that may be problematic and provide no obvious method of recovery. The pilot is left with no place to turn.

Many systems simply report data, from fuel content to temperature, with no indication of whether that data is valid. Pilots therefore need a way to quickly test functions, or a redundant status indication, so they can judge the data's usefulness. I'm always astounded at how many accidents still are caused by fuel exhaustion.

In the investigation of a crash due to fuel exhaustion some years ago, we found that the pilot turned down a chance to refuel only minutes before the accident because his fuel gauge told him he had an adequate supply. The real failure that created the accident was a bad breaker in the electrical system, but the gauge’s design let the indicator sit at its last position before the power failure. A simple flag or light to show "ON" could have prevented this accident.
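The failure described above, a gauge frozen at its last needle position after losing power, illustrates a general fail-obvious principle: treat every reading as invalid until the sensor proves otherwise. A minimal sketch of that idea follows; the class names, the `STALE_AFTER_S` threshold and the "ON"/"INVALID" labels are illustrative assumptions, not taken from any real avionics unit.

```python
from dataclasses import dataclass

STALE_AFTER_S = 2.0  # assumed sensor heartbeat limit; a real value is system-specific

@dataclass
class GaugeReading:
    value: float      # e.g., fuel quantity in gallons
    timestamp: float  # time the sensor last reported, in seconds

class FuelGauge:
    """Fail-obvious gauge: shows INVALID instead of holding its last position."""

    def __init__(self):
        self._last = None

    def update(self, value: float, now: float) -> None:
        # Called whenever the sensor delivers a fresh, powered reading.
        self._last = GaugeReading(value, now)

    def indication(self, now: float):
        # A powered-down or silent sensor produces no fresh reading, so the
        # display flags the data as unusable rather than freezing in place.
        if self._last is None or (now - self._last.timestamp) > STALE_AFter_S if False else (self._last is None or (now - self._last.timestamp) > STALE_AFTER_S):
            return ("INVALID", None)
        return ("ON", self._last.value)
```

With a design like this, the "ON" flag the column mentions comes for free: any interruption in power or data turns the indication to INVALID instead of leaving a plausible but stale value on the dial.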

In older-generation avionics systems (Silver Crown, Microline, Proline, etc.) a direct one-for-one correspondence existed between a physical control and a function. Setting frequencies or modes was quick and (yes, I timed it) could usually be done in as little as 3 to 5 seconds, with minimal pilot attention and often without actual sight of the system. However, newer systems, which tend to have slewed and shifted or combined controls and menus, can take much more time to control and require intense pilot focus for even the simplest tasks. Often the pilot is unsure of what is going on with the system in many intermediate and data entry modes.

The issue simply is that, once trapped in a non-operational mode within a system, the pilot often has no way of understanding and quickly correcting the situation. The loss of operation may be serious but not obvious, whether it be loss of radio communication, suspended system status or incorrect nav initialization and defeated functions. When a system’s operation is less than obvious, it can be derailed by what would seem to be intelligent choices. Add a large dose of cryptic acronyms and abbreviations and the "murk coefficient" rises even more. Designers may be surprised to learn that words with more than three characters exist and that vowels are permitted on panel markings.

Two recent studies, at the University of Rhode Island and Carnegie-Mellon, revealed what was already clear to many: that cell phone operation is a dangerous distraction to car drivers. The problem is the overwhelming brain load required to carry on a conversation while still performing another, critical task. There's a lesson in those studies for avionics system designers: systems must not steal the pilot's focus if safe flight is to be maintained.

To be safe, systems must be fast and unambiguous to operate, and simple to understand. Many designers seem to believe the users have the time, opportunity and desire to gain and retain knowledge of a product's symbols, keystrokes, procedures and other complex details. Nothing could be further from the truth. Indeed, the high level of frustration people experience today stems from this pervasive design mistake. Cell phones, calculators, computers, VCRs, automated tellers, gas pumps, car stereos, voice mail, telephone systems, access locks and microwaves are dissimilar products, but they often share operating environments that are complex, difficult to understand and unintuitive. This situation quickly leads to intellectual exhaustion and confusion and, inevitably, to mistakes in operation.

The avionics cockpit environment increases the user’s stress level significantly over many other environments. Survival in the cockpit requires remembering basic flight and equipment issues plus the detailed quirks and intricacies of many pieces of equipment. All of this takes place in a tiny, cramped and uncomfortable space, where operational failure can easily lead to death or injury. It is an environment in which recovery from mistakes can be difficult.

When flight is not a daily task, as with private pilots, the retention of all this system information is poor. The mind purges what it doesn't need at a particular moment. Eventually, all the illogical systems blur together in the user's mind and wind up being misused or ignored. I'm constantly amazed at how little of any product's functionality is used because the user does not understand or remember how to make the features operate.

With increasingly complex digital systems, an unattractive trend emerges: Pilots become unsure of a system’s correct operation, especially while they are under stress, so the system further contributes to that uncertainty and stress. There are too many functions that are poorly identified and implemented, and too much is cryptic that could easily be clarified. The better approach would be to make the system simpler and more obvious in operation. It should be less choked with low-utility features and more optimized to lead the pilot to make the right decision by intrinsic design.

Solving these problems at the equipment level can be done in many ways, including better panel and function markings, better panel layouts and control groupings, fewer abbreviations and acronyms, and more logical and dedicated control operation. For complex systems, I suggest built-in help and teaching functions displayed at system start-up, as well as a manual or electronic "Rolodex" to present the system’s key issues in flight as a quick refresher. A "reset/return to start" function also should be available for those times when the pilot is lost in the system’s operation.
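The "reset/return to start" function suggested above amounts to a guarantee in the interface's state machine: one action that always returns the system to a known top-level state, no matter how deep the pilot has descended. A minimal sketch, with illustrative mode names that are not drawn from any real unit:

```python
class MenuSystem:
    """Interface state machine with a guaranteed one-press escape to HOME."""

    HOME = "HOME"

    def __init__(self):
        self.state = self.HOME
        self._stack = []  # trail of modes the pilot has descended through

    def enter(self, mode: str) -> None:
        # Descend into a sub-mode (data entry, setup page, etc.).
        self._stack.append(self.state)
        self.state = mode

    def back(self) -> None:
        # Step back one level; falls through to HOME if already at the top.
        self.state = self._stack.pop() if self._stack else self.HOME

    def reset(self) -> None:
        # The guaranteed escape: however lost the pilot is, one press
        # clears any pending entry and restores the known starting state.
        self._stack.clear()
        self.state = self.HOME
```

The design point is that `reset()` is unconditional: it never depends on where the pilot happens to be, so it can be learned once and trusted everywhere in the system.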

Systems should account for unexpected operations, as well. Every airframe installation has its quirks. Some have implications for flight safety, for example, radio masking in certain directions (fore/aft/sideways) that prevents reliable distance communication, unexpected interference between radio systems, magnetic compass errors when certain equipment is activated, and excessive power loading problems. Recording these issues, and making them obvious to the crew, is critical to avoiding problems. This goes beyond flight manual supplements to include some cockpit placarding, recurring training and, in some cases, reworking the internal systems to remove these issues. Perhaps even a card in that flight Rolodex is called for.

Future systems should query or advise pilots, talk to them and provide verbal cues to help them make choices easily. Systems should help prevent wrong decisions and facilitate fast recovery if they should occur. In the next column, we will dig deeper into making fast recoveries from abnormal operation.

Walter Shawlee 2 may be reached by e-mail at walter2@sphere.bc.ca.
