
HUD/EVS: Beyond Infrared

By Charlotte Adams | August 1, 2004

The Sensors Directorate has planned data collection exercises and analyses to help decide which modes to develop under AALC’s sensors option.

The directorate’s approach to 3D would estimate an object’s height by combining two techniques: monopulse processing to locate the height of the center of an object and extent measurement to estimate the elevation of the top and bottom of an object. For a 100-foot (30.5-meter) cell phone tower on a 100-foot hill, an ideal monopulse measurement would place the "centroid" (center) of the tower at 150 feet (46 meters) above airfield elevation. Extent measurement processing would estimate the top and bottom of the tower to be 200 feet (61 meters) and 100 feet above the airfield, respectively.
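
To make the arithmetic concrete, here is a minimal sketch of how the two estimates might combine; the function and variable names are illustrative, not AFRL’s:

```python
# Hypothetical sketch: combine a monopulse centroid with a measured
# vertical extent to bound an object's top and bottom.

def object_vertical_profile(centroid_ft, extent_ft):
    """Return (top, bottom) heights above airfield elevation, in feet."""
    top = centroid_ft + extent_ft / 2.0
    bottom = centroid_ft - extent_ft / 2.0
    return top, bottom

# The article's example: a 100-ft tower on a 100-ft hill.
top, bottom = object_vertical_profile(centroid_ft=150.0, extent_ft=100.0)
print(top, bottom)  # 200.0 100.0 -- top and bottom above the airfield
```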

A 2D radar, assuming all returns at airfield level, would show the cell phone tower on the HUD as being below the ground. But a 3D radar would show its "entire vertical extent." The image displayed on the HUD would be accurate, notionally, to about 3 meters (10 feet) at 1 kilometer (0.6 mile).

The presentation of the 3D radar image on the HUD probably would look like a line, but it would be processed radar data, not a synthetic icon, Harrington says. And the display would take into account the level of power in the radar return. (On a monochromatic HUD, the variation in intensity of the radar returns creates the display.)

Monopulse processing, a technique that has been applied to target-tracking and terrain-following applications, requires two antennas and two receiver channels from which to measure the amplitude or phase difference of arrival of two radar signals.
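
A hedged illustration of the phase-comparison variant follows; the geometry is textbook, but the parameter values (baseline, wavelength) are assumptions, not details from the program:

```python
import math

# Illustrative phase-comparison monopulse sketch (not AFRL's implementation).
# Two antennas separated by a vertical baseline d receive the same return;
# the phase difference of arrival encodes the elevation angle.

def monopulse_elevation(phase_diff_rad, baseline_m, wavelength_m):
    """Estimate elevation angle (radians) from the inter-channel phase difference."""
    # phase_diff = 2*pi * d * sin(theta) / lambda  ->  solve for theta
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * baseline_m)
    return math.asin(sin_theta)

# Example: 94 GHz MMW radar (~3.2 mm wavelength), 10 cm vertical baseline
theta = monopulse_elevation(phase_diff_rad=0.5, baseline_m=0.10, wavelength_m=0.0032)
height_at_1km = 1000.0 * math.tan(theta)  # height above boresight at 1 km range
```

At these assumed values, a half-radian phase difference maps to roughly 2.5 meters of height at 1 kilometer of range.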

The Sensors Directorate also plans experiments using extent measurement techniques. AFRL will use a frequency-modulated, continuous-wave MMW radar, taking advantage of the sensor’s frequency diversity to take multiple measurements off the same objects in a scan.
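
One plausible reading of how frequency diversity helps, sketched under assumptions (a real system’s processing would be more sophisticated): each frequency step in the FMCW sweep gives an independent look at the same object, and combining the looks tightens the extent estimate.

```python
import statistics

# Hedged sketch: treat each frequency step's return as an independent
# measurement of the same object's vertical extent within one scan.

def fused_extent(extent_measurements_ft):
    """Combine per-frequency extent estimates of the same object."""
    n = len(extent_measurements_ft)
    mean = statistics.mean(extent_measurements_ft)
    # Standard error shrinks roughly as 1/sqrt(n) for independent looks.
    stderr = statistics.stdev(extent_measurements_ft) / n ** 0.5 if n > 1 else None
    return mean, stderr

# Example: five looks at the cell-tower extent across the FMCW sweep
mean_ft, err_ft = fused_extent([98.0, 103.0, 101.0, 97.0, 102.0])
```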

Further Experimentation

Another experiment this year will apply SAR techniques to a MMW radar, an area where little has been done. The interferometric SAR would create a high-resolution 3D map of the airfield, which could be presented head-down to the copilot for visual assessment of potential obstructions — although this would be only a partial solution.
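
For readers unfamiliar with interferometric SAR, the core height calculation can be sketched as follows; the formula is the standard first-order approximation, and the example numbers are assumptions rather than program parameters:

```python
import math

# Illustrative InSAR height sketch. Two SAR images taken from slightly
# different positions yield an interferometric phase that maps to
# terrain/obstruction height above a reference surface.

def insar_height(phase_rad, wavelength_m, range_m, baseline_m, look_angle_rad):
    """Convert interferometric phase to height (first-order approximation)."""
    # h ~ (lambda * R * sin(look_angle) / (4*pi * B)) * phase
    return (wavelength_m * range_m * math.sin(look_angle_rad) * phase_rad
            / (4 * math.pi * baseline_m))

# Example: 94 GHz (3.2 mm), 1 km slant range, 30 cm baseline, 45-deg look angle
h = insar_height(phase_rad=1.0, wavelength_m=0.0032, range_m=1000.0,
                 baseline_m=0.30, look_angle_rad=math.radians(45))
# ~0.6 m of height per radian of interferometric phase at these values
```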

Ultimately, the Air Force wants the processing in the background and a simple presentation of the results on the HUD, Harrington says. If the SAR approach can characterize objects with some level of fidelity, it may be possible to automatically provide the pilot cues on the HUD about "significant obstructions in his field of view at a particular location."

AFRL will explore automating obstruction detection and assessment to provide head-up cues. Potential techniques include measuring the height, length and width of objects and roughly classifying them. (SAR in effect generates a large antenna — with higher resolution — by integrating data during flight in an arc around the target.)
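
A minimal sketch of the rough-classification idea, with hypothetical thresholds and class labels (AFRL has not published such values):

```python
# Hypothetical dimension-based binning of a detected obstruction.
# Thresholds and class names are illustrative assumptions only.

def classify_obstruction(height_m, length_m, width_m):
    """Crudely bin a detected object by its measured dimensions."""
    if height_m > 15 and length_m < 5 and width_m < 5:
        return "tower/pole"        # tall and thin
    if height_m < 3 and length_m > 3:
        return "vehicle"           # low, elongated
    if height_m > 5 and length_m > 20:
        return "building/hangar"   # tall and wide
    return "unknown"

print(classify_obstruction(height_m=30.5, length_m=2.0, width_m=2.0))  # tower/pole
```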

These experiments will use MMW radar equipment provided by BAE Systems under a support contract. In actual use, an aircraft would not fly directly to the descent point but to an offset point about 3 kilometers (2 miles) away. The pilot would then roll toward the descent point, collecting data for about a kilometer on the way into it, Harrington says. The GMTI mode — which would be part of the landing mode — would be activated at that point, establishing tracks for the location and direction of movement of each object on the runway.
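
The article does not describe the tracking filter itself; as a hedged sketch, a simple alpha-beta filter illustrates how a GMTI track might maintain the position and velocity of a mover on the runway (a real system would use gated association and Kalman filtering):

```python
# Illustrative alpha-beta track update: blend the predicted position
# with each new GMTI detection; velocity is corrected from the residual.

def update_track(track, detection_xy, dt, alpha=0.5, beta=0.3):
    """One predict/correct cycle for a single runway mover."""
    px = track["x"] + track["vx"] * dt          # predict position
    py = track["y"] + track["vy"] * dt
    rx, ry = detection_xy[0] - px, detection_xy[1] - py   # residuals
    track["x"], track["y"] = px + alpha * rx, py + alpha * ry
    track["vx"] += beta * rx / dt
    track["vy"] += beta * ry / dt
    return track

track = {"x": 0.0, "y": 0.0, "vx": 0.0, "vy": 0.0}
for det in [(5.0, 0.1), (10.2, 0.0), (14.9, -0.1)]:
    track = update_track(track, det, dt=1.0)    # ~5 m/s mover along the runway
```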

Beyond AALC

Beyond the scope of AALC, the directorate plans to investigate "short-pulse gated ladar" (laser radar) technology. The needs of the targeting and surveillance communities are driving this area of research; however, it ultimately could benefit all-weather approach and landing efforts, given its potential to see through heavy rainfall. This technique requires the "camera" (as the receiver is called) to be turned on and off at precisely controlled intervals in order to exclude the reflections from rain yet include returns from the target.
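
The timing arithmetic behind that gating is simple round-trip propagation; this sketch, with assumed ranges, shows why the camera’s on/off intervals must be controlled so precisely:

```python
# Illustrative range-gate timing for a short-pulse gated ladar. The
# receiver ("camera") opens only for the window in which target returns
# arrive, so reflections from intervening rain are excluded.
# All range values here are assumptions.

C = 299_792_458.0  # speed of light, m/s

def gate_window(range_min_m, range_max_m):
    """Return (open, close) times in seconds after the laser pulse fires."""
    t_open = 2.0 * range_min_m / C     # round trip to near edge of gate
    t_close = 2.0 * range_max_m / C    # round trip to far edge of gate
    return t_open, t_close

# Gate on a runway scene 900-1,100 m out; rain closer than 900 m is rejected.
t_open, t_close = gate_window(900.0, 1100.0)
# t_open ~ 6.0 microseconds, t_close ~ 7.3 microseconds
```

The gate here is only about 1.3 microseconds wide; shifting its edges by tens of nanoseconds moves the gated volume by several meters.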

A nearer-term, partial approach to MMW’s rain problem is circular polarization of the MMW radar signal. This can be approximated on a fan-beam radar with cross-polarization, where the radar transmits in one axis and receives in the other.

Database Monitoring

Database monitoring technology could increase the certifiability of synthetic vision system (SVS) presentations on civil air transport aircraft, where SVS may have to meet a 10⁻⁹ requirement for data integrity and reliability.

NASA planned to demonstrate such technology in flight tests beginning last month on a Gulfstream V. The approach takes raw aircraft sensor data and uses it in real time to determine the correctness of onboard database-generated SVS terrain information displayed in the cockpit.

It’s assumed that the sensor information and navigation solution are correct. The big question is, does the SVS terrain match the terrain in the real world?

Radar altimeter data, for example, is compared with SVS data about the height above the terrain, providing a "look-down" integrity monitor. Similarly, clutter information from the weather radar is compared with what the database says about the terrain in front of the aircraft, providing a "forward-looking" integrity monitor. Eventually, the software, now hosted on a PC, could be part of an SVS system.
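
A minimal sketch of the look-down comparison, assuming a GPS altitude source and a simple disagreement threshold (both are illustrative assumptions; NASA’s monitor logic is not described in that detail here):

```python
# Hypothetical look-down integrity monitor: compare the radar altimeter's
# height above terrain with the height implied by the SVS database.

def look_down_monitor(gps_alt_msl_ft, radar_alt_agl_ft,
                      db_terrain_elev_ft, threshold_ft=100.0):
    """Flag a mismatch between sensed and database-derived terrain height."""
    db_agl = gps_alt_msl_ft - db_terrain_elev_ft   # height AGL per the database
    disagreement = abs(radar_alt_agl_ft - db_agl)
    return disagreement <= threshold_ft, disagreement

ok, err = look_down_monitor(gps_alt_msl_ft=3500.0, radar_alt_agl_ft=1480.0,
                            db_terrain_elev_ft=2000.0)
# ok=True, err=20 ft: the SVS terrain agrees with the radar altimeter here
```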

In the flight tests, SVS data will be integrity-checked and displayed simultaneously. Part of the effort will be to consider whether to take possibly misleading data off the screen after checking it, or to withhold it from the display in the first place.

Sensor Fusion

Sensor fusion is an important aspect of the work on all-weather head-up displays. In its new Autonomous Approach and Landing Capability (AALC) program, the Air Force Research Lab expects to use contrast-based algorithms, for example, which provide a way to handle dramatically different spectral bands. The technique looks across several sensors — active or passive — imaging a local area and leverages the sensors providing the greatest contrast or variation. This maximizes processing efficiency and extracts the most usable information.

All of the sensors are used all of the time. But fusion algorithms choose the best information out of all the sensor sources for further processing and display. Thus, as the weather becomes less severe, the sensor information selected would shift from radio frequency to infrared or electro-optical. The contrast technique can be further refined to attain a "pixel-to-pixel" algorithm, which basically looks at every pixel from every sensor input for pixel-level fusion.
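
As an illustration of the pixel-level idea, here is a hedged sketch that scores each sensor’s image by local variance (one possible contrast measure) and keeps the winning sensor’s value at each pixel; it is not the AALC algorithm itself:

```python
import numpy as np

# Illustrative pixel-level, contrast-based fusion: per pixel, keep the
# value from whichever sensor shows the most local contrast.

def local_variance(img, k=3):
    """Variance in a k x k neighborhood around each pixel."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(p, (k, k))
    return windows.var(axis=(-2, -1))

def fuse(images):
    """Pick, per pixel, the sensor with the greatest local contrast."""
    stack = np.stack(images)                       # (n_sensors, H, W)
    contrast = np.stack([local_variance(i) for i in images])
    best = contrast.argmax(axis=0)                 # winning sensor per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]

# e.g. fuse([mmw_image, ir_image, eo_image]) -> single image for the HUD
```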
