Human Factors in Avionics Design

By Charlotte Adams | November 1, 2013

Human factors engineering applies our understanding of the abilities and limitations of the human mind to the design of aircraft cockpits, studying how the pilot’s mind interacts with proposed avionics systems rather than focusing on the avionics alone.

Human factors engineering teaches that human-machine interfaces (HMIs) should be as intuitive and natural, as simple and direct as possible. And human factors considerations are becoming more and more central to the overall design process. “The only way forward is by re-centering the [cockpit] design around the pilot’s need, using cognitive engineering,” says Sylvain Hourlier, Thales’ human factors senior expert and design authority.

Basic tenets of human factors, from an avionics perspective, include making interfaces intuitive in order to simplify tasks and reduce pilot workload, says Bill Stone, Garmin’s senior business development manager. “The ultimate goal would be to eliminate the reliance upon memory and memorization,” he says.

Thus, despite all the magic of computers, a simple button or knob is sometimes the best solution. For setting barometric pressure or adjusting the heading, “there’s really no better human factors than the rotational knob,” Stone says.
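
The knob’s appeal is easy to see in code: each physical detent maps to one unambiguous increment, with nothing to memorize and no menu to navigate. The sketch below is purely illustrative; the class, the 0.01 inHg step per detent and the clamping limits are assumptions, not any vendor’s implementation.

```python
# Illustrative sketch only: a detent-based rotary knob for setting
# barometric pressure. The step size and limits are assumptions for
# the example, not taken from any particular avionics suite.

class BaroKnob:
    """Maps rotary detents to a barometric setting, clamped to a sane range."""

    STEP_INHG = 0.01                    # one detent = 0.01 inch of mercury
    MIN_INHG, MAX_INHG = 28.00, 31.00   # assumed adjustment limits

    def __init__(self, setting_inhg: float = 29.92):
        self.setting_inhg = setting_inhg

    def rotate(self, detents: int) -> float:
        """Positive detents raise the setting, negative detents lower it."""
        new_value = self.setting_inhg + detents * self.STEP_INHG
        self.setting_inhg = min(max(new_value, self.MIN_INHG), self.MAX_INHG)
        return round(self.setting_inhg, 2)

knob = BaroKnob()
print(knob.rotate(+3))   # 29.95
print(knob.rotate(-10))  # 29.85
```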

Rockwell Collins stresses the understanding of the pilot’s “mental model” of the system — his understanding of how the system is organized, how it works. This view may be complete or incomplete — a pilot may not need to know all the engineering details of a system in order to fly the plane. But in designing avionics, it’s important to understand cognitive limitations because these impact attention, workload and decision making on the flight deck, says Debbie Richardson, principal systems engineer, commercial systems avionics, at Rockwell Collins.

Two of the criteria Honeywell uses to evaluate proposed innovations are operational benefit and usability, says Ken Snodgrass, Honeywell’s vice president of integrated cockpits. Why (unless a change is mandated) should you add something new to the cockpit unless it buys the user tangible benefits such as lower minimums or entry into new airports? And even if it’s beneficial, why add something that isn’t usable?

A Little History

Before the human factors vogue, cockpits had many dials, each presenting a single piece of information. The Concorde took this trend to an extreme, requiring four pairs of eyes to monitor all the gauges, Hourlier recalls.

However, there were six main dials in most older aircraft — the “six pack” of so-called steam gauges by which the pilot flew the airplane. When the industry first went to glass cockpits, all it really did was put the same steam gauges on glass, Snodgrass says. But it was still natural to the pilots because that’s exactly how they had been trained.

One breakthrough spearheaded by Honeywell was the development of “windows” in the cockpit, Snodgrass says. By this he means that, instead of having one big display like a television, the screen could be split up to present different things at the same time, such as the map function and a checklist. The pilot and the copilot could have different presentations.

Another Honeywell breakthrough with Primus Epic was graphical flight planning, Snodgrass says. The company added a mouse — a cursor control device — so that instead of punching a lot of buttons on the multifunction control display unit (MCDU) and trying to put different position information on the display, you could click on a spot and insert a waypoint or a holding pattern. You can click on an airport and get the frequencies and runway information, he says.
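
The interaction Snodgrass describes amounts to translating a cursor position into a flight plan edit. Below is a minimal sketch of that idea; the data model, the names and the simple north-up screen-to-world mapping are assumptions for illustration, not Primus Epic code.

```python
# Illustrative sketch of graphical flight planning: a click on the map
# becomes a lat/lon, which is inserted into the flight plan as a new
# waypoint. All names and the mapping are assumptions for the example.

from dataclasses import dataclass

@dataclass
class Waypoint:
    ident: str
    lat: float
    lon: float

class FlightPlan:
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)

    def insert_at_click(self, index: int, ident: str, lat: float, lon: float):
        """Insert a user-defined waypoint where the pilot clicked."""
        self.waypoints.insert(index, Waypoint(ident, lat, lon))

def screen_to_latlon(x_px, y_px, origin_lat, origin_lon, deg_per_px):
    """Toy linear mapping for a north-up map view."""
    return origin_lat - y_px * deg_per_px, origin_lon + x_px * deg_per_px

plan = FlightPlan([Waypoint("KJFK", 40.64, -73.78), Waypoint("KBOS", 42.36, -71.01)])
lat, lon = screen_to_latlon(120, 80, origin_lat=41.5, origin_lon=-73.0, deg_per_px=0.005)
plan.insert_at_click(1, "USER1", lat, lon)
print([wp.ident for wp in plan.waypoints])  # ['KJFK', 'USER1', 'KBOS']
```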

Growing Complexity

As avionics evolved, more information became available and was consolidated, which increased complexity, Richardson says. Thus some information is less visible, so that pilots have to dig for it, she explains. “Depending on [the nature of] the function, we’ll try not to bury things. Anything we can put on the top level, we will put on the top level.” Controls are prioritized, so that the most critical or most frequently used ones are put at a higher level, she adds. And no more than three menu steps or layers are required, adds Derek Jensen, senior engineering manager, Pro Line Fusion human factors, at Rockwell Collins.
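
A rule like “no more than three layers” can be enforced as a design-time check on the menu tree. The sketch below shows one way to express that; the menu structure is a made-up example, not an actual Pro Line Fusion tree.

```python
# Illustrative design-time check for the "no more than three menu layers"
# rule described above. The menu content is invented for the example.

def max_depth(menu: dict, depth: int = 1) -> int:
    """A menu is a dict of label -> submenu dict (or None for a leaf)."""
    children = [v for v in menu.values() if isinstance(v, dict)]
    return depth if not children else max(max_depth(c, depth + 1) for c in children)

menus = {
    "Flight Plan": {"Insert Waypoint": None, "Holds": {"Direct Entry": None}},
    "Radios": {"COM": None, "NAV": None},
}

assert max_depth(menus) <= 3, "a function is buried too deep"
print("deepest menu level:", max_depth(menus))  # 3
```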

Honeywell asks the question in a different way. How many button pushes away is a function? Honeywell design teams use a structured evaluation process called the functional allocation matrix. Engineers and designers consider how information should be presented to the pilot in the event of hydraulic failure, for example. They look at what needs to be up in front of the pilot, what needs to be one button push away, or what could be five button pushes away, Snodgrass explains.
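
One way to picture such a matrix is as a scoring function that bins each cockpit function by criticality and frequency of use. The sketch below only gestures at the shape of the idea; the weights and thresholds are invented for illustration and are not Honeywell’s actual criteria.

```python
# Illustrative sketch of a functional allocation matrix: each function
# is binned by how many button pushes away it may live. The scoring
# weights and thresholds are assumptions, not Honeywell's criteria.

def allocate(criticality: int, frequency: int) -> str:
    """criticality and frequency rated on a 1 (low) to 5 (high) scale."""
    score = 3 * criticality + frequency   # weight safety over convenience
    if score >= 15:
        return "always displayed"
    if score >= 10:
        return "one button push away"
    return "up to five button pushes away"

functions = {
    "hydraulic failure synoptic": (5, 1),   # critical but rare
    "frequency change": (3, 5),             # routine and constant
    "cabin lighting setup": (1, 2),         # neither
}

for name, (crit, freq) in functions.items():
    print(f"{name:28s} -> {allocate(crit, freq)}")
```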

At the higher reaches of complexity is the ubiquitous MCDU. This important device, which interfaces the pilot to the flight management system (FMS) and other systems, requires typing a lot of data on a small keyboard and viewing data on a relatively small screen. But that’s not the biggest challenge, according to an industry paper on the device. The researchers pointed to pilots’ need to reformulate certain tasks before keying them into the device — increasing workload — and pilots’ overreliance on memory to operate the MCDU.

Introduced as the CDU, the device started life performing a single function: serving as the interface to the FMS. But as more functions were added to the flight deck, there wasn’t room for more controllers, so the CDU became the MCDU, with one keyboard and screen serving many systems. That’s where the problem of “mode confusion” comes in, Stone says.

The complexity of inputting information through the MCDU prompted a search for a more intuitive interface. The next step appears to be touch-sensitive displays.

Touchscreens

In its new G5000 cockpit, Garmin solves the problem of the difficult MCDU interface with touchscreens. The company is the first to use this type of interface on the flight deck, Stone says. Not only has it added touchscreens, it has removed the MCDUs altogether. And the touchscreen is used not only to interface with the FMS but also to manipulate and control other systems such as environmental, fuel, ice protection, hydraulics and brakes, Stone says. The first aircraft had not yet been certified at the time of this writing, but certification was expected in the near future.

Garmin located the touch-sensitive displays on the pedestal rather than on the main displays, a deliberate human factors choice. It would be difficult for a pilot to stretch out his arm to a multifunction display (MFD) and perform precise muscle movements with his finger to operate a touchscreen, especially in turbulence. “We believe that’s actually poor human factors,” Stone says. The pedestal is closer to the pilot’s body, so the touchscreen is less challenging to operate. And it’s easy to brace one’s hand against the pedestal if there’s turbulence.
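
Stone’s reach argument lines up with Fitts’s law, the standard human factors model of pointing, in which movement time grows with distance to the target and shrinks with target size: MT = a + b log2(D/W + 1). The back-of-envelope comparison below uses illustrative coefficients and distances, not measured flight deck values.

```python
# Back-of-envelope Fitts's law comparison of a nearby pedestal touchscreen
# versus a farther-away MFD. Coefficients and distances are illustrative.

from math import log2

def movement_time_ms(distance_mm: float, target_width_mm: float,
                     a_ms: float = 100.0, b_ms: float = 150.0) -> float:
    """Fitts's law: MT = a + b * log2(D / W + 1)."""
    return a_ms + b_ms * log2(distance_mm / target_width_mm + 1)

# The same 15 mm touch target, once on the pedestal, once on a distant MFD.
print(round(movement_time_ms(distance_mm=300, target_width_mm=15)))  # ~759 ms
print(round(movement_time_ms(distance_mm=700, target_width_mm=15)))  # ~936 ms
```

The simple model says nothing about turbulence, which effectively degrades the precision a pilot can deliver; bracing a hand against the pedestal restores some of it.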

Touch control plays to human strengths, as “we’re optically centered beings,” Stone says. “If I have to abstract and try to remember that I have to go to page 4 and menu item 7, it requires memorization, and that increases workload.”

When things become busy and the pilot needs to focus on high-priority tasks, the ability to think through and remember things becomes diminished, Stone explains. But “optical cognition still remains razor-focused,” he says. “When you look at an icon, you are able to cognitively understand [it]; you don’t have to do translation.” That’s why Garmin thinks the touchscreen is a pretty superior user interface, he says. It’s an array of virtual buttons.

Honeywell, Rockwell Collins and Thales are also looking at touchscreens. Honeywell expects to have a touch-sensitive general aviation navcom system, the KSN770, out in a couple of months, Snodgrass says. It will be a pedestal-mounted system with both hard buttons and touch sensitivity.

Rockwell Collins is developing a touch-sensitive primary flight display (PFD) for Pro Line Fusion, another first. As part of its research on the project, the company measured the time pilots took to perform some simple procedures, such as changing a heading or selecting something on the PFD, Jensen says. It found that performing these operations by touch was almost 50 percent faster than the normal way of entering data into a control panel. The reason is that you’re entering data directly on the display rather than indirectly through another device, Richardson says.

Rockwell Collins is still developing the touchscreen feature, but a company presentation on YouTube shows how a pilot could change altitude, speed and heading settings at the touch of a finger. Such a feature also promises to reduce avionics size, weight, power and cost. The company expects the touch-sensitive PFD to appear first as a retrofit on the King Air.

Nevertheless, touch is not the only way of manipulating information on the avionics, Richardson says. “We still have traditional knobs, cursor control and button presses.”

There are many things to consider with touchscreens, Snodgrass says. Uncommanded operation is a key concern. It is possible that touchscreens could increase confusion because they are software-driven and easier to activate than fixed physical knobs and buttons.
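
A common mitigation for uncommanded operation is to ignore very brief contacts and to require an explicit second step before a critical action takes effect. The sketch below illustrates one such guard; the policy and thresholds are assumptions for the example, not any certified design.

```python
# Illustrative guard against uncommanded touchscreen operation: touches
# shorter than a dwell threshold are discarded as accidental brushes,
# and critical actions additionally require an explicit confirmation.
# The policy and the 0.3 s threshold are assumptions for the example.

MIN_DWELL_S = 0.3

def accept_touch(action_is_critical: bool, dwell_s: float, confirmed: bool) -> bool:
    if dwell_s < MIN_DWELL_S:
        return False                 # too brief: likely inadvertent contact
    if action_is_critical and not confirmed:
        return False                 # critical actions need a second step
    return True

print(accept_touch(False, dwell_s=0.05, confirmed=False))  # False: stray brush
print(accept_touch(False, dwell_s=0.40, confirmed=False))  # True: routine action
print(accept_touch(True,  dwell_s=0.40, confirmed=False))  # False: awaiting confirm
print(accept_touch(True,  dwell_s=0.40, confirmed=True))   # True
```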

Synthetic Vision

Synthetic vision is another example of a human factors-driven feature that makes sense from both the operational benefit and usability perspectives. Honeywell has been doing a lot of flight testing and data collection to try to prove to FAA that synthetic vision can provide better landing accuracy than a head-up display (HUD) would give you.

The company has done some 350 approaches and landings in a real airplane, alternating the use of the HUD and the synthetic vision display. The HUD is limited in that it gives you information that is right in front of the nose of the aircraft, Snodgrass explains. If you’re heading for the airport but you’re in a crosswind, the HUD may be looking off to the side, not at the runway. If you can get equivalent safety without another large and expensive addition to the airplane, and can get operational benefit similar to or better than a HUD’s, why not go with synthetic vision? Also, the synthetic vision display is so realistic that the head-down and head-up views are almost identical, Snodgrass says, so the transition between them is rapid. Honeywell is also engaged with FAA to obtain lower minimums for aircraft equipped with synthetic vision.
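
The crosswind limitation Snodgrass describes is simple geometry: the aircraft crabs into the wind, so the nose, and with it a boresighted HUD view, points away from the ground track by roughly arcsin(crosswind / true airspeed). The quick check below uses illustrative numbers.

```python
# Quick geometry behind the HUD crosswind limitation described above.
# The numbers are illustrative, not from Honeywell's flight tests.

from math import asin, degrees

def crab_angle_deg(crosswind_kt: float, tas_kt: float) -> float:
    """Approximate crab angle between heading and ground track."""
    return degrees(asin(crosswind_kt / tas_kt))

print(round(crab_angle_deg(crosswind_kt=20, tas_kt=130), 1))  # ~8.8 degrees
```

At nearly nine degrees of crab, the runway sits well off the center of the HUD’s view, while a synthetic vision display can keep the approach picture wherever the flight path actually leads.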
