Dan Kilfoyle, technical director for electronic warfare systems with Raytheon Space and Airborne Systems. Photo courtesy of Raytheon
Understanding, managing and, if necessary, controlling and denying the electromagnetic spectrum is as critical for national defense as an army, navy, air force, or space force. After decades of inattention, that fact has finally hit home, fueling efforts to improve electronic warfare (EW) technologies through means such as machine learning.
While cognitive EW is a work in progress, one trend seems clear: if the miniaturization and density of electronic components continue to increase, and if cooling technology keeps pace, radio frequency (RF) system functions will consolidate and sensor performance will improve.
The future will see multispectral, multimode and multifunction capability, said Chris Rappa, product line director for RF, EW and advanced electronics with BAE Systems’ FAST Labs research and development organization.
Active electronically scanned arrays (AESAs) are already multimode, but only over a narrow band, he said. BAE aims to build fully digital arrays, large or small, in which the electronics behind every element are digital and every aspect of the array can be controlled at the element level.
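Element-level control is what lets a digital array steer, split, and reshape beams entirely in software. As a minimal sketch of the underlying math (not any vendor's implementation), the classic phased-array relation assigns each element a phase shift proportional to its position and the sine of the steering angle; the parameters below are hypothetical:

```python
import math

def steering_phases(n_elements, spacing_m, freq_hz, steer_deg, c=3e8):
    """Per-element phase shifts (radians) that steer a uniform linear
    array toward steer_deg off boresight.
    Classic relation: phase_k = 2*pi * k * d * sin(theta) / lambda."""
    lam = c / freq_hz                      # wavelength in meters
    theta = math.radians(steer_deg)
    return [(2 * math.pi * k * spacing_m * math.sin(theta) / lam) % (2 * math.pi)
            for k in range(n_elements)]

# Hypothetical example: 8 elements at half-wavelength spacing for 10 GHz
# (lambda = 3 cm, so d = 1.5 cm), steered 30 degrees off boresight.
phases = steering_phases(8, 0.015, 10e9, 30.0)
```

In an all-digital array these per-element phases (and amplitudes) become just numbers applied in the digital back end, which is why the same aperture can host radar, EW, and communications functions simultaneously.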
Ten years from now Rappa expects to see very large, all-digital, precisely controlled arrays that are multifunction, multimode and capable of learning on the fly to be cooperative or disruptive, whenever they need to be. He expects they’ll be highly flexible — able to do signals intelligence, electronic support measures (ESM), electronic attack (EA), radar, positioning, navigation and timing (PNT), and communications, all from one array and one box, and all cognitively and adaptively controlled.
Much depends on continuing advances in semiconductors and cooling. Board designer Abaco Systems envisions an RF processing power and bandwidth “escalation race” becoming faster paced. Cognitive RF and EW, for example, call for reconfigurable multiprocessor architectures featuring components such as low-latency field programmable gate arrays (FPGAs) and graphics processing units (GPUs), as well as general-purpose processors.
If these micro-level trends continue, we may see large increases in performance. Once you have an array with thousands of elements pumping out digital data, Rappa predicted, the instantaneous bandwidth of the system and the data volumes produced increase exponentially. Such a system would be able to look a lot wider and deeper into the spectrum, with a lot more sensitivity, he added.
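The scale of the data problem is easy to see with back-of-envelope arithmetic. A rough sketch, using hypothetical converter parameters rather than any fielded system's specs:

```python
def array_data_rate_gbps(n_elements, sample_rate_hz, bits_per_sample, channels=2):
    """Raw digital output of an all-digital array, assuming every element
    streams I/Q samples (channels=2) at the full converter rate."""
    return n_elements * sample_rate_hz * bits_per_sample * channels / 1e9

# Hypothetical case: 1,000 elements, 1 GS/s converters, 12-bit I/Q samples.
rate = array_data_rate_gbps(1000, 1e9, 12)  # 24,000 Gb/s of raw samples
```

Even this modest example yields tens of terabits per second before any filtering or beamforming, which is why element counts in the thousands force most processing into the array itself.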
“Systems will have to become much more ISR-like,” said John Thompson, naval aviation campaign director with Northrop Grumman’s Mission Systems Sector, referring to the exquisite fidelity of intelligence, surveillance and reconnaissance sensors that currently require huge apertures and massive processing resources. “But how do you get that really deep knowledge of RF signals inside tactical fighters?”
We could also see multispectral fusion, combining data from the RF and optical spectra, Rappa said. The more data inputs, the better for identification purposes, just as multiple human senses — eyes plus ears — complement each other.
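One simple way such fusion is often done is at the score level: independent RF and electro-optical classifiers each emit per-class confidences, which are then combined. The sketch below is a generic illustration with invented class names and scores, not a description of any program's actual fusion logic:

```python
def fuse_scores(rf_scores, eo_scores, w_rf=0.5):
    """Late (score-level) fusion: weighted average of per-class confidence
    scores from independent RF and electro-optical classifiers."""
    return {cls: w_rf * rf_scores[cls] + (1 - w_rf) * eo_scores[cls]
            for cls in rf_scores}

rf = {"fighter": 0.6, "airliner": 0.4}   # hypothetical RF emitter scores
eo = {"fighter": 0.9, "airliner": 0.1}   # hypothetical optical scores
fused = fuse_scores(rf, eo)
best = max(fused, key=fused.get)         # the class both sensors lean toward
```

An ambiguous RF track plus a confident optical look yields a firmer combined call than either sensor alone, which is the "eyes plus ears" effect Rappa describes.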
EW systems also will become a lot smarter. Dan Kilfoyle, technical director for electronic warfare systems with Raytheon Space and Airborne Systems, expects future systems will be looking at more complex data sets, including the context of signals. In addition to measuring the usual parameters, systems will ask: What else is going on in the theater right now? What's the normal behavior in an area? What does a system do when it thinks it saw me? Over time, AI reasoning will become more complex, just as a person progresses from making sounds to saying words and eventually to having more and more complex thoughts.
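The "normal behavior in an area" question maps naturally onto baseline-and-deviation analysis. As a toy sketch of the idea (the parameter, numbers, and threshold are all invented for illustration), a system could model a quiet period's pulse repetition intervals statistically and flag departures from that baseline:

```python
import statistics

def build_baseline(observations):
    """Model 'normal' behavior as the mean and standard deviation of an
    observed signal parameter, e.g. pulse repetition interval (PRI)."""
    return statistics.mean(observations), statistics.stdev(observations)

def is_anomalous(value, baseline, n_sigma=3.0):
    """Flag observations more than n_sigma standard deviations from normal."""
    mean, stdev = baseline
    return abs(value - mean) > n_sigma * stdev

# Hypothetical PRI observations (microseconds) from a quiet period.
normal_pri = [1000.1, 999.8, 1000.3, 999.9, 1000.0, 1000.2]
baseline = build_baseline(normal_pri)

routine = is_anomalous(1000.1, baseline)  # within the learned baseline
flagged = is_anomalous(850.0, baseline)   # abrupt mode change worth attention
```

Real cognitive EW work replaces this single-parameter statistic with learned models over many features, but the contrast is the same: what an emitter does when it thinks it has been seen only stands out against a model of what it normally does.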