Sebastien Giuliano, ATTOL Demonstrator Project Leader for Airbus, explained how the company achieved its first vision-based automatic takeoff in the modified A350-1000 pictured here. Photo: Airbus
Using a combination of image recognition technology and flight control computer modifications, Airbus successfully performed its first ever fully automatic vision-based takeoff with an A350-1000 in December. Avionics International recently caught up with Sebastien Giuliano, leader of the Autonomous Taxi, Take-Off & Landing (ATTOL) project at Airbus, to learn how the vision-based test flight was made possible.
The ATTOL project is a technological flight demonstrator initiative that began at Airbus in June 2018, as part of the European planemaker’s goal of understanding the impact of increased autonomy on aircraft. During a flight test on December 18, a crew of two pilots and three flight test engineers performed a total of eight take-offs and landings, achieving a major ATTOL milestone: using image recognition technology in place of an Instrument Landing System (ILS) to perform an automatic takeoff.
According to Giuliano, the crew used several system modifications and camera upgrades on the A350 to perform the test flight.
“Avionics upgrades were limited to the flight control computer, and additional modifications were linked to the installation of cameras and additional computing capabilities linked to those cameras. This was done on an on-boarded demonstration platform which was linked to the modified Flight Control Computer but was not an avionics grade platform,” Giuliano said.
Airbus has not released the name of the company that supplied the image recognition technology for the flight; however, Giuliano describes it as “state of the art computer vision and hardware techniques that can be found in different industries, including the automotive industry.” Engineers from several different divisions across Airbus participated in acquiring the image recognition technology and installing it on the aircraft.
“The cameras were adapted to our use case and environment. Those development and adaptations were developed by a team of Airbus engineers from different divisions – Airbus UpNext, Airbus Commercial, Airbus Defense and Space and A³ – and some support from ONERA as a subcontractor mostly for data fusion of existing parameters with vision-based parameters,” Giuliano said.
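Giuliano mentions fusing existing aircraft parameters with vision-based ones but gives no detail on the method. As a purely illustrative sketch of what such fusion can look like, the snippet below blends a hypothetical camera-derived lateral deviation from the runway centerline with an inertially propagated estimate using a simple complementary filter. All names, values, and the gain are invented for illustration; this is not Airbus’s or ONERA’s implementation.

```python
# Illustrative only: complementary-filter fusion of a hypothetical
# vision-based lateral-deviation measurement with an inertial estimate.
# Airbus has not published its data-fusion method; names and gains here
# are assumptions made for the sake of the example.

def fuse_lateral_deviation(inertial_estimate_m, vision_measurement_m, gain=0.2):
    """Blend an inertial prediction with a camera-derived measurement.

    gain: weight given to the vision measurement (between 0 and 1);
    a higher gain trusts the camera more than the drifting inertial path.
    """
    return (1.0 - gain) * inertial_estimate_m + gain * vision_measurement_m

# Example: dead-reckoning has drifted to 1.5 m off the centerline while
# the camera reports 0.5 m; the fused estimate is pulled toward the camera.
fused = fuse_lateral_deviation(1.5, 0.5, gain=0.2)
print(round(fused, 2))  # 1.3
```

The appeal of this kind of blend is that the camera corrects slow inertial drift while the inertial estimate smooths over frames where the vision system momentarily loses the centerline.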
Giuliano also emphasized that the flight test differs from a conventional automatic takeoff, in which the aircraft’s positioning relies solely on an ILS. An in-cockpit video of the flight also shows the pilots’ hands off the controls as the aircraft accelerates down the runway and prepares to take off.
Yann Beaufils, one of the two test pilots who participated in the flight test, described the process in a statement published by Airbus.
“While completing alignment on the runway, waiting for clearance from air traffic control, we engaged the auto-pilot. We moved the throttle levers to the take-off setting and we monitored the aircraft. It started to move and accelerate automatically maintaining the runway center line, at the exact rotation speed as entered in the system. The nose of the aircraft began to lift up automatically to take the expected take-off pitch value and a few seconds later we were airborne,” Beaufils said.
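The sequence Beaufils describes — hold the runway centerline during the roll, rotate at the speed entered in the system, then capture the expected takeoff pitch — can be sketched as a tiny phase classifier. The speed and pitch values below are invented placeholders, not real A350 parameters, and the logic is only a schematic of the described sequence, not Airbus flight-control code.

```python
# Schematic of the described takeoff sequence. ROTATION_SPEED_KT and
# TARGET_PITCH_DEG are hypothetical placeholder values, not A350 data.

ROTATION_SPEED_KT = 140.0   # assumed rotation speed "as entered in the system"
TARGET_PITCH_DEG = 12.5     # assumed "expected take-off pitch value"

def takeoff_phase(airspeed_kt, pitch_deg):
    """Classify the current phase of the automatic takeoff."""
    if airspeed_kt < ROTATION_SPEED_KT:
        return "ground-roll"   # accelerating while holding the centerline
    if pitch_deg < TARGET_PITCH_DEG:
        return "rotation"      # nose lifting toward the target pitch
    return "airborne"          # target pitch captured, aircraft flying

print(takeoff_phase(100.0, 0.0))    # ground-roll
print(takeoff_phase(141.0, 5.0))    # rotation
print(takeoff_phase(150.0, 12.5))   # airborne
```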
The vision-based automatic takeoff is the latest achievement in the Airbus A350 program, coming after Airbus delivered the first A350 featuring touchscreen displays to China Eastern Airlines. By mid-2020, the ATTOL project team hopes to achieve its next milestone: automatic vision-based taxi and landing sequences.