A comparison of the A330neo and A350 provided by Airbus.
Editorial note: If you're interested in contributing an original opinion piece to be published by Avionics International, check out our guest submission guidelines.
Software can be found in every corner of the aviation industry, and it is a critical component of everything related to safety in the cockpit, from monitors and displays to navigation and communication systems. Any airborne software that is not part of the passenger entertainment system has to meet specific safety regulations and must be certified according to DO-178C, Software Considerations in Airborne Systems and Equipment Certification, before it goes aboard an aircraft. DO-178C is recognized by the U.S. Federal Aviation Administration and its European counterparts. The guideline also covers other aviation-related platforms, such as drones.
Attaining DO-178C certification involves four Stages of Involvement (SOIs). SOI1 centers on planning, with a large volume of documents and forms to fill out. In SOI2, companies must have their software code and architecture verified, and in SOI3, the software must be tested. SOI4 is the completion stage, in which a company must show evidence that its software has passed all the tests and that every earlier stage of the process was completed correctly.
The certification process can take 10 years or more, depending on the complexity of a system and how many subsystems it has. Along the path to meeting DO-178C, there are some best practices that manufacturers can keep in mind to ease the process.
Best Practice 1: Understand the safety-critical level of the software being certified
Of course, not all software has to meet the same level of safety requirements. DO-178C defines five Design Assurance Levels (DALs) that specify how critical a piece of software is to the safety of the aircraft, based on the severity of a failure: from DAL E (no safety effect, such as entertainment systems) up to the most critical, DAL A, where a failure could be catastrophic.
One example of DAL D would be a de-icing system for drones. When a drone flies at very high altitude, ice can form on its wings, so a system is fitted to remove it. This kind of system is not considered highly critical because the drone carries no passengers; if too much ice accumulates on the wings, the drone won't crash but will instead make an emergency landing.
Higher-level systems like maps and displays fall under DAL B, while DAL A software includes navigation systems. The higher you go up the safety-critical ladder, the more tests are required and the more development rules must be followed. All of this also affects how long certification takes and how much it costs.
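The ladder can be summarized in a short sketch. The failure-condition categories come from the standard; the objective counts are the commonly cited totals from DO-178C's Annex A tables and should always be confirmed against the standard itself:

```python
# Sketch of the DO-178C Design Assurance Levels (DALs).
# Failure-condition categories are from the standard; the objective
# counts are the commonly cited Annex A totals (confirm against DO-178C).
DAL_LEVELS = {
    "A": {"failure_condition": "Catastrophic",     "objectives": 71},
    "B": {"failure_condition": "Hazardous",        "objectives": 69},
    "C": {"failure_condition": "Major",            "objectives": 62},
    "D": {"failure_condition": "Minor",            "objectives": 26},
    "E": {"failure_condition": "No safety effect", "objectives": 0},
}

def objectives_for(dal: str) -> int:
    """Return the number of certification objectives for a given DAL."""
    return DAL_LEVELS[dal]["objectives"]
```

The jump in objectives between DAL D and DAL A is one concrete way to see why climbing the ladder multiplies both the schedule and the cost.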
Best Practice 2: Automation streamlines the testing process
As one would expect, the rules for achieving DO-178C certification are very strict, so it's difficult to find faster or easier ways through the process. That said, advances in test automation and automated tools have made testing faster and enabled continuous testing as well. Every tool used in the process must itself be qualified, per the companion standard DO-330, so cutting the necessary testing time is crucial to getting software or a system to market faster.
With manual testing, it could take several months to test just one version of software for a system. If one line of code is changed, testing has to start all over again. Automation has sped the testing process up so that it now takes only several hours to test one version.
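The core idea can be sketched in a few lines: tag every test with the requirement it verifies, and rerun the entire suite automatically whenever the code changes. The requirement IDs and the function under test here are purely hypothetical:

```python
# Minimal sketch of requirement-tagged automated regression testing.
# The REQ-* identifiers and clamp_altitude() are hypothetical examples.
REGISTRY = {}  # requirement ID -> list of test functions

def verifies(req_id):
    """Decorator: register a test against the requirement it verifies."""
    def wrap(fn):
        REGISTRY.setdefault(req_id, []).append(fn)
        return fn
    return wrap

def clamp_altitude(ft):
    """Hypothetical function under test: clamp altitude to a legal range."""
    return max(0, min(ft, 45_000))

@verifies("REQ-NAV-001")
def test_altitude_floor():
    assert clamp_altitude(-100) == 0

@verifies("REQ-NAV-002")
def test_altitude_ceiling():
    assert clamp_altitude(60_000) == 45_000

def _passes(test):
    try:
        test()
        return True
    except AssertionError:
        return False

def run_all():
    """Rerun every registered test; a code change triggers a full rerun."""
    return {req: all(_passes(t) for t in tests)
            for req, tests in REGISTRY.items()}
```

Because the whole suite reruns in hours rather than months, a one-line code change no longer stalls the program, and the requirement tags feed directly into the traceability evidence discussed below.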
Best Practice 3: Expect constant retesting
No matter how small a change might seem, it could affect 100 different requirements that need to be changed, and each of those changes will require the software to be retested. Requests for changes can come from anywhere.
Sometimes the pilots or the airlines themselves will request a change, such as altering the font in all the apps. If the font size is changed, even by a fraction of a point, all the maps must be retested, because the new font creates an entirely different image on the screens. Changing the size of the displays? Be prepared to retest the entire system.
Some changes are inevitable. Hardware gets old and has to be replaced. This happens a lot in the aviation industry. When it does, the software has to be updated, and this requires everything to be tested all over again. If software that has been tested for use in one aircraft is put on a different plane, or is being adapted from a smaller drone for use with a larger drone, this also means retesting the software in its new platform.
Best Practice 4: Ensure the traceability of the certification process
As with safety testing of anything, from pharmaceuticals to aviation software, the process must be thoroughly transparent and examinable. An auditor should be able to trace the testing of the system from end to end. A developer must therefore ensure the traceability of its software from requirement, to code, to verification, to test, all the way through the process. An auditor must be able to see exactly what was done, and how, at every stage of the certification process.
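In its simplest form, a traceability matrix is just a table linking each requirement to the code and test that cover it. This sketch, with entirely hypothetical identifiers, shows how a developer might flag any requirement an auditor could not trace end to end:

```python
# Sketch of a requirement-to-test traceability check.
# All requirement, module, and test identifiers are hypothetical.
TRACE = [
    # (requirement, implementing module, verifying test)
    ("REQ-DSP-010", "display/render.c", "test_render_font_size"),
    ("REQ-DSP-011", "display/layout.c", "test_layout_resize"),
    ("REQ-NAV-020", "nav/route.c",      None),  # no test yet -> audit gap
]

def untraced(trace):
    """Return requirements that lack an end-to-end trace to a test."""
    return [req for req, module, test in trace if test is None]
```

Real programs manage this with dedicated requirements-management tooling rather than a hand-built table, but the audit question is the same: every row must be complete before the software flies.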
A simple certification takes about two years; that's the minimum. Going through the process, developers need to follow the guidelines closely and keep traceability tight and manageable. Testing must be repeated until the system meets the DO-178C standard; otherwise, it won't go onto the airplane or the drone.
Whether or not your software is safety critical, changes to the code will eventually be necessary, driven by technological evolution, hardware upgrades, or customer change requests. These code changes will need to be tested, no matter how small or insignificant they seem. This level of deliberate, time-consuming testing is indispensable given the inherent life-or-death stakes of aviation and aircraft technology.
Eli Dvash is Senior Manager of the Safety & Aviation Software, Defense Division, for Qualitest Group. Eli has an extensive background in testing; prior to aviation software, he managed the hardware division at Qualitest Group, specializing in electromagnetic radiation testing, environmental testing, and materials testing.