Incorrect weight and speed data can be deadly, especially when the airplane’s computer accepts erroneous information without an error check–rather like a word processor without a spell checker.
What a difference a digit makes. In the case of a March 12, 2003, tailstrike on takeoff at Auckland, New Zealand, the first officer committed what is known as a transcription error. He entered 247 metric tons instead of 347 tons on the "bug card," a reference used to list takeoff and landing data. Using a weight that was 100 metric tons too low, he manually calculated a takeoff thrust setting and V-speeds excessively low for the airplane’s weight.
The captain then entered the incorrect manually calculated V-speeds into the flight management computer (FMC), ignoring the substantially higher V-speeds and the higher thrust setting it had computed and was displaying. These higher speeds were based on the correct takeoff weight the captain had entered into the FMC from the load sheet. Rather than reconciling why the V-speeds on the bug card and on the FMC display were so different, the crew inserted the bug card speeds into the FMC, which accepted them.
Result: a tail that dragged spectacularly in a cloud of smoke as the airplane, well below a safe flying speed, struggled to break contact with the ground and climb away. The crew of Singapore Airlines (SIA) Flight SQ286 was able to get airborne and, fearing a fire in the tailcone, completed an emergency return-to-field overweight landing–the 9-hour planned flight from Auckland to Singapore by this time all but forgotten.
The airplane was out of service, under repair, for weeks. The single mistranscribed digit led to an unsafe takeoff that could have ended with the airplane careening off the runway, greatly endangering the 369 passengers aboard.
The cockpit crew was faulted mightily in the post-event investigation for not catching the weight error. There is a larger issue: whether FMC designs have kept pace with the demands placed on them and, more specifically, whether their built-in defenses are adequate to catch erroneous data inputs.
A recently released report of New Zealand’s Transport Accident Investigation Commission (TAIC) reveals details of events leading up to the tailstrike at Auckland. The TAIC inquiry shows how crews under time pressure may be prone to distraction and mistakes, and how even multiple lines of defense can be breached.
The crew consisted of the captain and two first officers–the second of whom will be referred to as the relief pilot. As an example of how one safety problem can affect another, the captain noticed that only 4.5 tons of fuel were loaded into the center wing tank, although a minimum of 7.7 tons was required to ensure that the pumps were covered in fuel (a minimum fuel quantity designed to prevent an explosion of flammable vapors).
The time taken to add fuel delayed departure. Fuel weights had to be adjusted and the load sheet modified. In the process, though, the first officer did not catch that he’d marked down on the bug card a takeoff weight (TOW) that was 100 tons below that written on the load sheet.
Per procedures, the captain then entered the thrust and speed values calculated by the first officer into the FMC. The captain later told investigators that he checked the zero fuel weight (ZFW) and TOW displayed by the FMC and found them in agreement with the values on the load sheet, from which he’d entered them into the FMC. The FMC calculated its own takeoff speeds, and the captain did not notice that they were significantly higher than the ones provided by the first officer (based on a TOW 100 tons lower than it should have been). Rather, he input the thrust setting and speeds provided by the first officer, in effect overriding the FMC’s own computations. Despite the significant difference in speeds, the FMC accepted the lower values.
Enter related factors–the departure delay and the power of distraction. The relief pilot, who normally would cross-check the bug card and computations, did not do so, being occupied at the time explaining the departure delay to Singapore Airlines’ station manager at Auckland.
With the captain as pilot flying (PF), the first officer called "rotate" as the airplane reached 130 knots on its takeoff run. With insufficient speed to generate lift, the airplane remained on the ground, dragging its tail on the tarmac for a good 1,500 feet (457 meters) and veering to the right before finally becoming airborne in ground effect.
The stick shaker activated, warning of imminent stall as the airplane began its struggle into the sky. The captain ignored it, regarding it as a spurious activation. Three seconds after stick shaker activation, the pilots received an auxiliary power unit (APU) fire warning. The APU is located in the tailcone; when the tailcone dragged on the runway, damage to the fire-detection wire loop triggered the APU fire warning in the cockpit.
With an emergency declared, the airplane climbed slowly to about 1,000 feet. The correct V2 initial climb speed (172 knots) was not achieved until 64 seconds after liftoff, when the aircraft reached this altitude. Had an engine failed during this period–precisely the scenario for which V2 is calculated–the airplane probably would have been lost. Given the warning of a fire in the damaged tailcone, the crew prudently decided not to dump fuel and made an overweight landing.
After the damaged airplane landed, the cockpit voice recorder captured this telling exchange among the three pilots, dealing apparently with the TOW:
Captain: "Should be a three."
First officer: "Takeoff weight?"
Captain: "Yeah."
Relief pilot: "Gosh–I should have checked it."
The captain was relatively new to the B747-400 but had more than 5,600 hours in the A340, in which he would regularly have seen TOW figures closer to the first officer’s erroneous figure. Often, during times of stress (the center fuel tank minimum-quantity error, in this case), people revert to their original or predominant training, experience and understanding. It is also conceivable that, having dealt with one mistake, the crew was less alert to spotting another.
The simple transcription mistake, therefore, breached six levels of defense:
First, the captain did not verify the TOW on the bug card.
Second, FMC-computed V-speeds were discounted. As the TAIC report noted, "They knew they were on a direct flight to Singapore with a planned flight time in excess of nine hours with a planned fuel burn over 100 tons."
Third, "From simple cognitive reasoning and subtracting 100 from 247, the result gave a landing weight at Singapore significantly less than the empty weight of the airplane itself," The TAIC observed. "Even though the delay was only about 13 minutes, it could have been sufficient to pressure the pilots to unconsciously hurry through their procedures to minimize the time loss," the TAIC report surmised, regarding the absence of simple cognitive reasoning.
Fourth, the relief pilot did not verify the entries on the bug card. As the TAIC report said, "Time pressure could have contributed to the pilots’ non-detection of errors."
Fifth, the operator did not require bug card data to be reconciled against FMC-generated V-speeds, i.e., a potential fifth line of defense did not exist.
Sixth, here, too, a potential line of defense in the FMS itself was missing. The TAIC report said, "Had the FMS been programmed to challenge, or in certain cases not accept, erroneous or mismatched entries, then a valuable final defense against incorrect entries would have existed."
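Neither of these missing checks is computationally demanding. The sketch below, in Python, illustrates the two cross-checks the TAIC described: a landing-weight sanity check and a comparison of manually entered V-speeds against the FMC’s own computed values. The empty-weight figure, the speed tolerance and the FMC-computed speed used in the illustration are assumptions for this sketch, not data from the TAIC report or logic from any actual FMC.

```python
# Sketch of two simple data-entry cross-checks: a landing-weight sanity
# check and a comparison of manually entered V-speeds against the values
# the FMC itself computed. All thresholds are illustrative assumptions.

OPERATING_EMPTY_WEIGHT_T = 180.0   # assumed approximate B747-400 empty weight, metric tons
V_SPEED_TOLERANCE_KT = 5.0         # assumed allowable spread between entered and computed speeds


def landing_weight_plausible(takeoff_weight_t, planned_fuel_burn_t):
    """Flag any takeoff weight that implies landing below the empty weight."""
    return takeoff_weight_t - planned_fuel_burn_t > OPERATING_EMPTY_WEIGHT_T


def v_speed_mismatches(entered_kt, computed_kt):
    """Return the names of entered V-speeds that differ from the FMC-computed
    values by more than the tolerance."""
    return [name for name, value in entered_kt.items()
            if abs(value - computed_kt[name]) > V_SPEED_TOLERANCE_KT]


# Illustration with the Auckland figures: 247 tons entered instead of 347,
# with a planned fuel burn of just over 100 tons.
print(landing_weight_plausible(247.0, 100.0))   # False -> implies landing below empty weight
print(landing_weight_plausible(347.0, 100.0))   # True  -> plausible

# The rotate speed called at about 130 knots versus an FMC-computed value
# (here an assumed 163 knots) would likewise have been flagged.
print(v_speed_mismatches({"VR": 130.0}, {"VR": 163.0}))   # ['VR']
```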
Another tailstrike incident involving an Air Canada A330 on takeoff at Frankfurt in 2002 is an almost identical predecessor to the Singapore Airlines event at Auckland. Together, the two cases suggest that automation without discrepancy checking can be an insidious hazard.
This may be a situation where additional automation is in order. Spelling errors typed into one’s word processor appear with a squiggly red line underneath to capture the writer’s attention.
It would seem that an airplane’s FMC could be programmed to reject, highlight, or just "flash for a confirming second ENTER action" any data outside of a set boundary. Consider such a feature a final barrier against embarrassment. (The full TAIC report may be viewed at www.taic.org.nz/aviation/03-003.pdf.)
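For illustration, a minimal sketch of such a boundary-plus-confirmation check follows. The acceptance envelope, the data structure and the confirmation flow are assumptions of this sketch, not certified FMC behavior.

```python
# Sketch of a "reject, highlight, or flash for a confirming second ENTER"
# entry check: out-of-envelope values are held rather than silently accepted.
# The envelope limits and messages are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class EntryResult:
    accepted: bool
    needs_confirmation: bool
    message: str


def check_entry(name, value, low, high, confirmed=False):
    """Accept in-envelope values; hold out-of-envelope values for a second,
    confirming ENTER before they take effect."""
    if low <= value <= high:
        return EntryResult(True, False, f"{name} = {value} accepted")
    if confirmed:
        return EntryResult(True, False, f"{name} = {value} accepted after confirmation")
    return EntryResult(False, True,
                       f"{name} = {value} outside expected range {low}-{high}; "
                       f"press ENTER again to confirm")


# Illustration: a V1 entry far below the range implied by the computed
# takeoff weight is flagged instead of silently overwriting the FMC value.
print(check_entry("V1", 124.0, low=150.0, high=185.0).message)
print(check_entry("V1", 124.0, low=150.0, high=185.0, confirmed=True).message)
```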
David Evans can be reached by e-mail at [email protected].