
Saturday, June 1, 2013

Synthetic Vision: Seeing What You Should Be Missing   

Testing out Honeywell’s synthetic vision on a company-owned AgustaWestland AW139 during a flight around New York City.

By Ernie Stephens, Editor-at-Large

 

Think back to the first time you saw a multi-function display (MFD) screen in a cockpit. For me, it was around 1987, when I saw a McDonnell Douglas F-15E Strike Eagle on static display at an air show. The high-tech fighter had a head-up display and thermal imaging system, too. That was all exceptionally impressive to me, a lowly student pilot with just a few hours in a Robinson R22. It might have been a couple of years after that before I saw a “TV screen” in anything that was not a military aircraft.

Most avionics historians, however, cite 1982 as the point in time when glass cockpits made the jump from military and space applications to commercial aircraft. And after relatively inexpensive microprocessors arrived on the scene in the late 1990s, even two-seater piston airplanes had GPS with colorful, digital images to help aviators find their way.

But have you ever noticed that cockpit technology – at least the really cutting-edge stuff – commonly found aboard corporate airplanes seems to take much longer to reach the rotorcraft world?

Many believed that helicopter users represented such a small slice of the aviation pie that glass avionics for rotorcraft didn’t warrant a lot of research and development. After all, the technology was designed for high-speed, high-altitude, cross-country travel in jets and turboprops. Helicopters, on the other hand, were these slow, utilitarian things used for short hops close to the ground.

Well, it’s a new day! The sophisticated avionics suites once reserved for airliners and top-of-the-line corporate jets are now standard fare aboard a wide range of helicopters. You can even get an after-market, digital display suite for a Robinson R22!

In an effort to prove that the newest and most exciting technology is making its way to the helicopter community, Morristown, N.J.-based Honeywell International invited Rotor & Wing to come fly one of its latest helicopter avionics prototypes. The engineers call it the “Smart View System” – SVS, for short.

SVS is what the tech world refers to as “synthetic” technology. It starts by taking data on the locations of streets, waterways, power lines, terrain and buildings, and uses the data to produce a colorful, digitized, moving-map depiction of that information. (That’s the “synthetic” part.) But Honeywell’s system designers went one step further by precisely superimposing an infrared (IR) image over that digital map. The result is an MFD image that displays the environment ahead of the aircraft based upon what the mapping data says ought to be there, and what the thermal imager actually finds there. For example, a data-driven rendering of a river will show a blue band weaving through a city. But since the information on its shape is just an average that cannot show how its dimensions change with the tides, the image may not accurately represent its true appearance at any given time. The IR camera mounted under the nose of the aircraft, however, delivers a thermal look-see that shows the waterway’s actual dimensions at that exact moment. That marrying of database and IR information is what Honeywell calls a “Combined Vision System,” or CVS.
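For readers who like to see the idea in code, here is a minimal sketch of that image fusion, assuming a synthetic render and an IR frame that have already been registered to the same viewpoint. The function name, array layout and fixed blend weight are my own illustration, not Honeywell’s implementation.

# Illustrative sketch only: fuse a synthetic terrain render with a
# registered infrared frame, roughly in the spirit of the CVS concept
# described above. Names and the blend weight are assumptions.
import numpy as np

def fuse_synthetic_and_ir(synthetic_rgb: np.ndarray,
                          ir_gray: np.ndarray,
                          ir_weight: float = 0.4) -> np.ndarray:
    """Blend an IR luminance image over a synthetic color render.

    synthetic_rgb : HxWx3 uint8 image rendered from the terrain/obstacle database
    ir_gray       : HxW uint8 thermal image, already registered to the same view
    ir_weight     : how strongly the IR detail shows through (0 = map only)
    """
    ir_rgb = np.repeat(ir_gray[:, :, None], 3, axis=2).astype(np.float32)
    base = synthetic_rgb.astype(np.float32)
    fused = (1.0 - ir_weight) * base + ir_weight * ir_rgb
    return fused.clip(0, 255).astype(np.uint8)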

CVS was actually certified for fixed-wing aircraft several years ago, and has since been hailed as a valuable tool, particularly at night when uncharted towers can pop up out of nowhere, and deer can be found strolling onto a runway just as a pilot is about to land.

“We’ve been doing research and development work with helicopter SVS for about seven years,” said Dr. Trish Ververs, Honeywell’s crew interface and platform systems research scientist. “But you can’t just take this technology, throw it into helicopters and expect it to work.”

The reason for the delay between the FAA’s acceptance of the equipment aboard airplanes and its approval for use in helicopters is a simple matter of aerodynamics: Helicopters don’t move like airplanes.

The data-generated and IR images presented to the pilot on the cockpit’s MFD must show what’s in the aircraft’s flight path. That’s a snap for an airplane, since with few exceptions its nose is pointed in the same direction that it’s going. But in order to meet the tough standards required for FAA Part 29 (airworthiness standards for transport category rotorcraft) approval, engineers have to prove that the data and thermal images provided by the CVS will show what’s in the direction of flight, even when the nose of the helicopter is, let’s say, pitched up 10 degrees while at a steep angle of descent. Honeywell has cracked the code for doing that in rotorcraft, and as of this writing, is completing the tests needed to get it certified.

I was taken to Honeywell’s corporate flight line at Morristown Municipal Airport (MMU), and over to the company-owned AgustaWestland AW139. From all outward appearances, N139H was like any other AW139, except for an Astronics Max-Viz 1500 IR camera in an enclosure mounted under its starboard chin.

Honeywell’s lead helicopter pilot, Marc Lajeunesse, parked me in the left seat and he took the right. After a routine startup, he set my left MFD to display the CVS system, and the right one to show a “god’s-eye view” integrated navigation screen. The three remaining screens – one center and two in front of him – were the standard flight instrument, system monitoring and navigation displays found on Honeywell’s Primus Epic system.

Once airborne and tracking through New Jersey towards the southern tip of Staten Island, the SVS – that’s the data-only map – provided me with a digital, out-the-front view of the roads, hills, power lines and towers in my path, along with the usual flight information, such as airspeed, altitude and heading. Located near the center of the screen was a small, green circle with one line coming out of the top and one on each side. Fighter pilots will recognize it as the flight path vector (FPV), which uses the aircraft’s pitch, roll and power to show where the aircraft will go if left in that profile. So, had I reduced power and begun a descent, the FPV would have slipped off the horizon, and gradually moved toward the bottom of the screen to indicate where we would end up. Had I banked left, it would have moved left, and so on.
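To make that behavior concrete, here is a rough sketch of the geometry, assuming the FPV sits vertically at the flight-path angle (computed from ground speed and vertical speed) and laterally at the drift angle between track and heading. The function and its inputs are illustrative only, not the Primus Epic logic.

# Hypothetical sketch of where an FPV symbol sits on the display.
import math

def fpv_position_deg(ground_speed_kt: float,
                     vertical_speed_fpm: float,
                     track_deg: float,
                     heading_deg: float) -> tuple[float, float]:
    """Return (vertical, lateral) placement of the FPV in degrees.

    Vertical: flight-path angle relative to the horizon (negative = descending),
    so reducing power and descending drags the symbol below the zero-pitch line.
    Lateral: the angle between where the aircraft is going (track) and where the
    nose points (heading), so a banked turn or drift slides the symbol sideways.
    """
    ground_speed_fps = ground_speed_kt * 1.68781      # knots -> ft/s
    vertical_speed_fps = vertical_speed_fpm / 60.0    # ft/min -> ft/s
    flight_path_angle = math.degrees(
        math.atan2(vertical_speed_fps, ground_speed_fps))
    drift = (track_deg - heading_deg + 180.0) % 360.0 - 180.0
    return flight_path_angle, drift

# Example: 120 kt with 500 fpm down puts the FPV about 2.4 degrees below the horizon.
print(fpv_position_deg(120.0, -500.0, 95.0, 90.0))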

Honeywell’s engineers also drew a white “zero pitch” line across the screen. Any feature that protrudes above this line is at an elevation that will eventually put you in harm’s way if you maintain that flight profile.

As we reached Lower Bay near Staten Island and turned north towards the Verrazano-Narrows Bridge, Lajeunesse asked me to compare what I saw of the western shoreline through my window to what I saw on the SVS. As with most systems, the digital image was just a rough calculation of where the water stopped and land began.

“Now, look at this,” said Lajeunesse, as he switched the screen to CVS mode – the one that uses the IR camera.

Instantly, a transparent IR overlay centered itself on the screen, and used the heat signatures of all that lay in front of us to show what was actually there. The moving, digital image was now enhanced with black, white and grey outlines of every feature, from private piers jutting out from the shore, to small boats headed for New York Harbor.

“Now, what if it was dark and you had to make an emergency landing?” Lajeunesse asked. I was way ahead of him. The shoreline was now crisp and precise, and not just a casual, featureless curve. If it had been nighttime, I would have been able to see exactly where I could have made an emergency landing without getting wet, or ditched without being split in half by a cargo ship.

Farther up the river, as Lower Manhattan was coming into view, all of the bells and whistles of CVS came into play. As the IR image gave me details of small boats, buildings, and even other aircraft, the enhanced ground proximity warning system (EGPWS) began identifying potential obstacles.

The EGPWS finds obstructions, and then does a split-second collision evaluation. If it decides that your helicopter is lined up to hit something – in our case, the skyscrapers near Battery Park – the screen will outline the structures in yellow to indicate that at your current speed and trajectory you are 30 seconds from hitting them. If you do not alter your course, altitude or speed to avoid them, the color will change to red, letting you know that you are now 20 seconds from an unfortunate encounter. That’s good news when visibility is poor.
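A toy version of that time-to-conflict test might look like the following, using the 30- and 20-second thresholds mentioned above. The range and closure inputs, units and function name are assumptions for illustration, not the EGPWS algorithm itself.

# Toy illustration of the alerting thresholds described in the text.
def obstacle_alert_color(range_to_obstacle_ft: float,
                         closure_rate_kt: float) -> str:
    """Return 'none', 'yellow', or 'red' based on time to impact."""
    if closure_rate_kt <= 0:
        return "none"                      # opening or holding distance
    closure_rate_fps = closure_rate_kt * 1.68781
    time_to_impact_s = range_to_obstacle_ft / closure_rate_fps
    if time_to_impact_s <= 20.0:
        return "red"                       # roughly 20 seconds out
    if time_to_impact_s <= 30.0:
        return "yellow"                    # roughly 30 seconds out
    return "none"

# Example: a building 6,000 ft ahead with 140 kt of closure is about 25 s away -> yellow.
print(obstacle_alert_color(6000.0, 140.0))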

After flying up the Hudson River to the George Washington Bridge, then reversing course to run the New Jersey shoreline towards Hoboken, it was time to see what SVS and CVS had to offer when flying approaches.

Lajeunesse received approval to shoot a visual to Runway 5 back at Morristown. Once lined up 4 nm out, I compared the view out the window with what the synthetic information was telling me.

Even from as far out as the final approach fix, the SVS drew a blue, diamond-shaped target on the end of Runway 5. Without the IR image on, I was still presented with a colorful, GPS-like display of the airport environment. With the CVS up, the screen showed me IR images of houses, roads, automobiles, and distant towers. Either way, all I had to do was adopt and hold an attitude and airspeed that would put that green FPV circle inside of the diamond, and that’s exactly where I would end up. It’s redundant when the sun is up and VFR conditions prevail. But what if it was dark, and the landing area was an unimproved hole in the trees, unlit heliport, or remote medevac scene that the pilot was not familiar with? The crew would be able to get an excellent look at almost every hazard before committing to a landing, plus be able to judge their descent path above featureless terrain.

Before calling it a day, Lajeunesse asked the tower if we could have Runway 13/31 for a little while. Once approved, we taxied to the approach end of 13 so I could put the CVS to one last test.

You see, synthetic vision has been aboard commercial planes for over 10 years. And while jets are quite fast machines, they’re only swift going forward. Helicopters, as we all know and love, move forward, backward, sideways, and can spin on their yaw axis at pretty impressive rates. Consequently, digital and IR images can be useless if they cannot refresh themselves quickly enough to keep up with the aircraft.
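Some quick arithmetic shows why refresh rate matters: the faster the yaw, the farther the scene slews between display updates. The numbers below are examples of my own, not Honeywell specifications.

# Back-of-the-envelope: degrees of heading change between consecutive display frames.
def yaw_per_frame_deg(yaw_rate_deg_s: float, refresh_hz: float) -> float:
    """How far the scene rotates between one frame and the next."""
    return yaw_rate_deg_s / refresh_hz

# A brisk 30 deg/s pedal turn on a 15 Hz display jumps 2 degrees per frame;
# at 60 Hz the step drops to 0.5 degrees and the image appears to track smoothly.
print(yaw_per_frame_deg(30.0, 15.0))   # 2.0
print(yaw_per_frame_deg(30.0, 60.0))   # 0.5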

The Astronics Max-Viz 1500 infrared camera is mounted in a fixed housing under the chin of the helicopter. Photo by Ernie Stephens

 

So, while on the closed runway, I asked Lajeunesse to put the AW139 in an in-ground-effect hover, and stomp in some pedal. As he did, my eyes were glued to the display. Even at a pretty robust yaw rate, both the digital data image and the IR view kept up with us. The results were the same when he executed an exaggerated quick stop.

That was it for me, so we parked the ship back at the company hangar.

Make no mistake about it; Honeywell’s SVS/CVS synthetic avionics gear seems to have made the transition from the fixed-wing world to the whirly-world. It manages to keep its camera pointed where it ought to be, gives the pilot a ton of useful information, and can be turned off and on as needed. It takes the color, database-driven map many of us have grown to rely on, and makes it even more useful by adding a sharp, IR overlay to the screen. Once the FAA certifies it – and I suspect that it will – it should find lots of fans.

Ververs still calls the helicopter version of all of this a “work in progress.” Many of these features, “we would expect to transition into the eventual product,” she reported. “But some of them may not.”

