
Saturday, May 1, 2010

Pushing Boundaries With New Technology

We took a look at new developments for dealing with brownout scenarios, sound reduction, data management and more.

By Charlotte Adams

It’s not good when a pilot loses situational awareness in a dust cloud near the ground. Brownout is a major threat to helicopters in Iraq and Afghanistan. The U.S. Department of Defense has estimated that 37 percent of helicopter losses there between 2001 and 2008 involved brownout and obstacle strikes. This article looks at pilot brownout aids and other emerging technologies.

Combat unit training scenarios must include a continuum of threat levels designed to train students to avoid, degrade, defeat or destroy threat systems in order to survive. Man-portable air defense systems (MANPADS) present one of the most lethal threats to helicopters since Vietnam. Training to deal with them is impossible.

Pilot Brownout Aids

The CH-47F’s hover display, in combination with the aircraft’s digital automatic flight control system, helps to correct for uncommanded lateral and vertical movement in low-visibility conditions. Rockwell Collins’ Common Avionics Architecture System (CAAS) display provides real-time cuing to increase situational awareness and help pilots to visualize the direction and rate of drift. In brownout, the display helps them to correct for drift and “beep” the aircraft down to the ground in controlled one-foot increments.

The “god’s eye view” display presents position, velocity and acceleration information relative to the desired hover point. Position information is depicted by two scalable concentric circles and radar altitude. Velocity and acceleration, depicted with a velocity vector line and an acceleration cue in the center of the display, help the pilot visualize the rate and direction of drift. The pilot uses the cyclic to keep the velocity vector and the acceleration cue inside the hover box.
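To make the geometry of those cues concrete, here is a minimal Python sketch; the variable names, units and hover-box limits are illustrative assumptions, not Rockwell Collins’ implementation.

    import math

    def hover_cues(velocity_mps, accel_mps2, box_limit_mps=0.5, box_limit_mps2=0.2):
        """Report drift rate/direction and whether the velocity and acceleration
        cues would sit inside the hover box (limits here are assumed values)."""
        drift_rate = math.hypot(*velocity_mps)                                    # speed of drift
        drift_bearing = math.degrees(math.atan2(velocity_mps[0], velocity_mps[1])) % 360.0
        inside_box = (drift_rate <= box_limit_mps
                      and math.hypot(*accel_mps2) <= box_limit_mps2)
        return drift_rate, drift_bearing, inside_box

    # Example: drifting east at 0.4 m/s with a small eastward acceleration
    print(hover_cues((0.4, 0.0), (0.05, 0.0)))   # -> (0.4, 90.0, True)

In the actual display the pilot closes this loop with the cyclic, keeping the cues centered as the aircraft is "beeped" down.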

The system does not feature an active sensor to paint the landing zone. “The pilots need to see where they are landing as they approach the landing zone before it is obscured by dust,” to verify that the area is clear of obstacles, said Doug Schoen, principal marketing manager with Rockwell Collins Government Systems. But the display helps the pilot to precisely hover over a spot and land without drifting, thanks to real-time inputs from the aircraft’s embedded GPS/INS. Schoen was not aware of any system in operational use by U.S. forces—other than FLIRs—that detects and displays obstacles.

Rockwell Collins is also developing “synthetic-enhanced vision” (SEV) for rotorcraft, combining synthetic vision (SV) technology with real-time sensor inputs. It has assembled an SEV testbed embedded in the CAAS architecture and demonstrated its functions. The system provides an out-the-window view of the terrain ahead, including obstacles, with a forward field-of-regard of 56 degrees. The company expects to have the equipment installed on an actual aircraft for internal flight test some time this year.

Active Sensor Input

One solution for brownout available today is Elbit’s Dust-Off, which integrates a number of off-the-shelf components already in use:

• ANVIS/HUD helmet-mounted display (HMD) with day/night modules and head tracker;

• Digital moving map display;

• SWORD (Surveillance and Warning Obstacle Ranging and Display) ladar (laser detection and ranging), which provides a real-time update to the digital terrain elevation database (DTED) throughout the flight, an 8-second warning of flight path obstacles in sharp turns and 12-second warning in straight flight; and

• Digital video recorder and mission data loader.

Introduced in 2008, Dust-Off has undergone two simulator evaluations and one flight evaluation, according to Benjamin Weiser, senior director for U.S. and UK business development for Elbit’s Helicopter Upgrade business line. The ladar was flight tested last year as part of an agreement with the U.S. Army’s Aviation Applied Technology Directorate.

The “god’s eye view” of the CH-47F’s hover display presents position, velocity and acceleration information relative to the desired hover point. Rockwell Collins
It is not clear whether any customers are using the whole Dust-Off system operationally at this time, but the company in late March was on the verge of signing its first production contract for the HMD component with 3D reference cues to allow drift detection and correction in low-visibility conditions. The symbology for drift detection and correction is derived from the embedded GPS/INS and other onboard sensors. In the full configuration, the symbols also can be derived from the ladar data, as it is tied to the geo coordinates through GPS/INS. These virtual reference symbols, or icons, which do not correspond to objects in the real world, also provide cues to altitude, attitude, airspeed and angle of approach.

“We don’t use ‘highway-in-the-sky’ type symbology,” Weiser said. “We’ve found that the pilots are so intent on keeping in those square boxes that they disregard everything else in their surrounding environment.” The company also noted that SWORD data can generate a “pure, lean and mean” symbolic presentation of categorized obstacles on the Israeli version of the ANVIS/HUD HMD, “as per the IAF [Israeli Air Force] requirements.” The IAF was preparing for SWORD acceptance test procedure flights in late March.

Dust-Off’s ladar scans the area in a forward-cone field-of-regard of about 100 degrees. In brownout landing mode the beam scans down to the ground to clear the landing zone and identify obstacles. The sensor scans millions of times per minute and each ping is measured in lat/long and elevation. Key information created from the three-dimensional ladar data and entered into the DTED is extracted from the DTED and presented as symbols on the helmet-mounted display, properly located within the pilot’s field-of-view. These include icons representing the location of hills, mountains, towers, and high and low wires. The system can identify 5mm wires at one kilometer and a high-power line at two kilometers, Weiser said.
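A rough Python sketch of that pipeline follows; the cell size, height threshold and data layout are invented for illustration and are not Elbit’s implementation. Geo-referenced returns raise the stored elevation for their grid cell, and cells that stand well above the surrounding terrain come back out as display symbols.

    def update_dted(grid, returns, cell_deg=0.0001):
        """grid maps (lat_index, lon_index) -> highest elevation yet seen there."""
        for lat, lon, elev_m in returns:                       # one entry per ladar ping
            cell = (round(lat / cell_deg), round(lon / cell_deg))
            grid[cell] = max(grid.get(cell, float("-inf")), elev_m)
        return grid

    def extract_symbols(grid, terrain_elev_m, min_height_m=2.0):
        """Cells rising at least min_height_m above the base terrain become symbols."""
        return [cell for cell, elev in grid.items() if elev - terrain_elev_m >= min_height_m]

    grid = update_dted({}, [(34.12345, 69.54321, 1853.0), (34.12345, 69.54321, 1849.2)])
    print(extract_symbols(grid, terrain_elev_m=1849.0))        # the 4-m return survives as a symbol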

Key to Dust-Off is its ability to add real-time terrain and obstacle data to the pre-stored terrain elevation database. Approaching the landing zone, a helicopter may encounter “all sorts of hustle and bustle—trucks coming in, refueling tanks,” Weiser said. The laser radar scanning and presentation capability gives pilots confidence that they are seeing the real-time scene, not a database rendering that is a month old. Elbit is also providing an additional, very precise radar altimeter to “ensure, as the pilot gets close to the ground, that the [obstacle] symbols are attached to the ground,” at the exact location, not floating above or beyond it. Dust-Off’s monocular HUD—positioned over the pilot’s right eye—provides a 32-degree circular field-of-view. Elbit has experimentally projected map data on the HMD, but “the jury is still out whether it’s too much of a soda straw vision,” too narrow a field-of-view for so much data. Besides the daytime HUD, Elbit provides a night solution that attaches to one of the objective ends of the night vision goggles.
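One way to picture the radar-altimeter correction, as a sketch under assumed names and numbers rather than Elbit’s design: the measured ground level under the aircraft replaces the database ground level near the landing zone, so obstacle symbols sit on the real surface.

    def anchor_symbol(db_ground_elev_m, aircraft_alt_msl_m, radalt_agl_m, obstacle_height_m):
        """Re-base an obstacle symbol on the radar-altimeter-derived ground level."""
        measured_ground_m = aircraft_alt_msl_m - radalt_agl_m   # where the ground actually is
        db_error_m = measured_ground_m - db_ground_elev_m       # local database error
        return db_ground_elev_m + db_error_m + obstacle_height_m

    # Database says 1,848 m; the radar altimeter implies the ground is 1.5 m higher,
    # so a 3-m obstacle symbol tops out at 1,852.5 m instead of floating at 1,851 m.
    print(anchor_symbol(1848.0, 1900.0, 50.5, 3.0))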

Sandblaster integrates four key technologies: fly-by-wire flight controls with point-in-space approach capability, millimeter-wave radar, detailed digital terrain knowledge, and advanced cockpit symbology. Rockwell Collins
Honeywell

Honeywell is deeply involved in military and civil R&D concerning low-visibility operations. The company took part in the now-complete Defense Advanced Research Projects Agency (DARPA) Sandblaster program, which focused on the desert brownout landing problem. Team leader Sikorsky provided the automatic flight controls, Honeywell contributed synthetic vision and a “sensor-driven, localized external evidence knowledge grid,” and Sierra Nevada supplied a 94-GHz millimeter-wave radar. The evidence grid is a 3D virtual model that overlays the sensor returns on top of the terrain database.
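A toy version of the evidence-grid idea, assuming a simple log-odds occupancy model (the probabilities and cell indexing are invented, not Honeywell’s):

    import math

    def add_evidence(grid, cell, hit, p_hit=0.7, p_miss=0.4):
        """Accumulate log-odds evidence that a 3-D cell is occupied."""
        prior = grid.get(cell, 0.0)                          # 0.0 means unknown (even odds)
        p = p_hit if hit else p_miss
        grid[cell] = prior + math.log(p / (1.0 - p))
        return grid

    grid = {}
    for _ in range(3):                                       # three radar returns from the same cell
        add_evidence(grid, cell=(10, 4, 1), hit=True)
    print(grid[(10, 4, 1)] > 0.0)                            # positive log-odds: treat as occupied

In the real system the terrain database supplies the prior and the sensor returns supply the evidence; the sketch shows only the accumulation step.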

A representative Sandblaster display shows a perspective view the pilot would see, as if out the window, approaching an obscured landing zone. The landing zone is depicted as a cyan circle whose size is the same as the helicopter’s blade space. The triangle in the center represents the tricycle gear landing wheels of the Black Hawk (the JUH-60A Sandblaster demonstration vehicle). Using the Sikorsky flight control system and a control on the cyclic, the pilot moves the cyan circle around so that it’s not touching any obstacles that are painted yellow or red, according to their threat level. (Yellow indicates that the object is two feet high; red indicates that the object is four feet high.) The dark green circle below the cyan circle represents the sweep of the blades at the aircraft’s current position. A boulder on the right edge of this area is colored green because it is less than two feet high. The millimeter-wave radar provides data accurate from one to three feet.
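The two- and four-foot height thresholds come straight from that description; the rest of this sketch (names, rotor radius, coordinates) is illustrative only.

    import math

    def obstacle_color(height_ft):
        """Color an obstacle by the threat thresholds described above."""
        if height_ft >= 4.0:
            return "red"
        if height_ft >= 2.0:
            return "yellow"
        return "green"

    def landing_circle_clear(center_ft, rotor_radius_ft, obstacles):
        """True if no yellow or red obstacle lies inside the cyan landing circle."""
        cx, cy = center_ft
        return all(obstacle_color(h) == "green"
                   for (x, y, h) in obstacles
                   if math.hypot(x - cx, y - cy) <= rotor_radius_ft)

    # One 1.5-ft boulder inside the circle, one 6-ft obstacle well outside it:
    print(landing_circle_clear((0.0, 0.0), 27.0, [(10.0, 5.0, 1.5), (60.0, 0.0, 6.0)]))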

Honeywell now is focusing on a Sandblaster follow-on program being planned by DARPA and the Air Force Research Lab (AFRL) to further develop the technology for forward-flight applications, beyond a landing-only solution.

Separately, the company expects to field a dedicated helicopter Cable Warning and Obstacle Avoidance (CW/OA) system in late 2012, built around its certified SV software, proven Enhanced Ground Proximity Warning System (EGPWS) terrain database and evidence grid.

It is also developing a less than 20-pound millimeter-wave radar, slated to roll out before the end of 2012, which will be offered as a CW/OA option, said Gregory Walters, marketing manager for crew interface. The product is designed to detect 3/8-inch cables at 2,000 feet with an 8-second response window, assuming speeds of 80-90 knots.
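As a quick sanity check of those numbers (the unit conversion is standard; the range and speeds are the ones quoted), a 2,000-foot detection range at 80-90 knots leaves roughly 13-15 seconds before reaching the cable, comfortably more than the 8-second response window.

    KNOTS_TO_FT_PER_S = 1.68781          # 1 knot in feet per second

    def warning_time_s(detection_range_ft, speed_kt):
        """Time from detection to reaching the obstacle at constant speed."""
        return detection_range_ft / (speed_kt * KNOTS_TO_FT_PER_S)

    print(round(warning_time_s(2000.0, 85.0), 1))    # ~13.9 s at the middle of the quoted speed band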

Honeywell is also working with a European partner to potentially integrate a lidar (light detection and ranging) sensor with the CW/OA.

CAE is developing a somewhat analogous real-time sensor-aided, augmented visionics system (AVS) as a brownout aid. Built on the company’s SV system, which sits atop CAE’s common database (CDB), AVS allows real-time updates of the synthetic world by sensors such as lidar, IR or cameras. Although AVS is sensor-agnostic, “lidar today is the most dust-penetrating sensor we know of,” said Adolfo Klassen, CAE’s chief technology officer. The company has demonstrated AVS with the Neptec Design Group’s lidar and also with a FLIR.

Passive Solution

TerraMetrics has been working with AFRL, using two visible-light cameras for stereoscopic ranging. The technique builds a real-time model of the landing zone as the aircraft approaches it, said Greg Baxes, company president. The data can be displayed to the pilot on a HUD or LCD. Most importantly for this project, the cameras give off no emissions. IR cameras could also be used.

Stereoscopic ranging uses pictures of the ground to compute the distances of objects in the image. That data is then projected into a terrain model, which is used in an SV system to display a real-time, 3D view to the pilot. The company is halfway through the second phase of its AFRL work and is conducting aerial testing on an early prototype using non-aircraft means.
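At the heart of stereoscopic ranging is the standard pinhole-stereo relation Z = f·B/d; the sketch below uses invented focal length, baseline and disparity values purely for illustration.

    def stereo_depth_m(focal_px, baseline_m, disparity_px):
        """Depth of a matched feature from its pixel disparity between the two cameras."""
        if disparity_px <= 0.0:
            return float("inf")                      # no measurable parallax
        return focal_px * baseline_m / disparity_px

    # A feature shifted 12 pixels between cameras 0.5 m apart, with a 1,500-px focal length:
    print(stereo_depth_m(1500.0, 0.5, 12.0))         # -> 62.5 m to that point on the ground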

The “store and remember” system starts taking images 1,000 feet out. “Unlike a laser that is working throughout the landing, we’ve done our hard work” before brownout starts, at about 100 feet or more off the ground, Baxes said. At that point you have the model and know the location and the attitude of the aircraft.
