
How Test Equipment Gains Intelligence

By Walter Shawlee 2 | August 1, 2001

Almost unnoticed, a significant revolution has been going on in the world of test equipment over the last several years. The battle seems to have been largely decided in favor of intelligent instruments over the simpler "stand-alone" digital or analog multimeter of yesterday.

Originally, adding some "intelligence" was only a differentiating technique to garner market share for a specific brand. But the drive to make instruments smarter and more automated has relentlessly pushed the entire test equipment industry along with it. These changes have altered many of the fundamental ways measurements are made, and certainly what is possible with the data after it’s collected.

Meters, scopes and other instruments outside the high-end laboratory market began to get smarter in the mid-1980s, as low-cost microprocessor and foundry-based, custom integrated-circuit (IC) technology began to percolate down to all reaches of the instrument world. It was no longer necessary to be an IC manufacturer (as both Hewlett-Packard [HP] and Tektronix were) to have parts made specifically for you; it was merely necessary to solve a problem economically with the technology semiconductor foundries offered.

Many custom display/conversion ICs and instrument subsystems were soon created to take over tasks that previously took a square foot of board space in digital multimeters (DMMs) and counters. Single chips soon appeared that contained the heart of a counter, a time-base divider, display driver, and other distinct functions. This dramatically reduced costs, size and maintenance of instruments. Also, with some internal computing power, instruments began to do things automatically.
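To make the counter example concrete, here is a hedged C sketch of the gated-counting scheme such single-chip cores implement. The hardware hooks and the one-second gate are hypothetical stand-ins, not any particular chip’s design:

```c
/* Sketch of the gated counting scheme a single-chip counter core
 * implements; the hardware hooks here are hypothetical stand-ins.
 * A time-base divider derives a fixed gate from the reference
 * crystal; input edges are counted only while the gate is open. */
#include <stdint.h>

extern void wait_one_gate_interval(void); /* hypothetical: time-base divider
                                             counts out exactly one second  */
static volatile uint32_t edge_count;
static volatile int      gate_open;

void input_edge_isr(void)                 /* fires on each input signal edge */
{
    if (gate_open)
        edge_count++;
}

uint32_t measure_frequency_hz(void)
{
    edge_count = 0;
    gate_open  = 1;                       /* open the gate...               */
    wait_one_gate_interval();             /* ...for one time-base second... */
    gate_open  = 0;                       /* ...then close it               */
    return edge_count;                    /* edges per second = frequency   */
}
```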

One key change in the design of test equipment was a seemingly trivial feature Fluke introduced in its hand-held meters: automatic power-off. It proved to be a powerful marketing decision. The company noted that the biggest complaint people had was accidentally leaving the meter on and draining the battery, so that the next time the meter was needed (usually desperately), the user found a featureless gray display and no spare battery.

Correcting this common problem was not a trivial design task, but Fluke’s quite ingenious technique works transparently to the user. The meter simply shuts off most internal system power via field-effect transistor (FET) switches controlled by internal timing logic; making any control change turns it back on. Painless and effective, if the right circuitry exists inside. No doubt the battery industry is bitterly disappointed. But users love it.
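In outline, the scheme can be sketched in a few lines of firmware-style C. Everything here, from the timeout value to the function names, is a hypothetical stand-in rather than Fluke’s actual design:

```c
/* A minimal sketch of the auto-power-off idea. An inactivity counter
 * ticks once per second, any control change resets it, and on timeout
 * a FET switch drops power to most of the meter. */
#include <stdbool.h>
#include <stdint.h>

#define TIMEOUT_S (30u * 60u)          /* assumed 30-minute shutoff */

extern void fet_power_enable(bool on); /* hypothetical gate driver for the
                                          internal FET power switches     */
static uint32_t idle_seconds;

void on_control_change(void)           /* called on any knob or button event */
{
    idle_seconds = 0;
    fet_power_enable(true);            /* wake the meter back up */
}

void one_second_tick(void)             /* called by the low-power timing logic */
{
    if (++idle_seconds >= TIMEOUT_S)
        fet_power_enable(false);       /* cut most internal power; only
                                          the timing logic keeps running */
}
```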

Another key step in smarter instruments was "autoranging," where the meter or scope figures out for itself the best range and settings to display the measured signal. This was a landmark intelligence feature in DMMs: it eliminated a host of mechanical switches and their high costs, and it prevented the accidental destruction of instruments, to the delight of many users.
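The underlying logic is simple to sketch, even if it was hard to build economically at the time. A minimal version, assuming a hypothetical meter with four decade ranges and an ADC that reports readings as a fraction of full scale (values above 1.0 meaning overrange):

```c
/* Minimal autoranging sketch for a hypothetical four-range meter. */
#define NUM_RANGES 4   /* e.g., 400 V, 40 V, 4 V, 400 mV full scale */

extern double read_fraction_of_full_scale(int range); /* hypothetical ADC hook */

int autorange(void)
{
    int range = 0;                       /* start on the highest, safest range */
    for (;;) {
        double f = read_fraction_of_full_scale(range);
        if (f > 1.0 && range > 0)
            range--;                     /* overrange: step less sensitive */
        else if (f < 0.09 && range < NUM_RANGES - 1)
            range++;                     /* wasting the scale: step more sensitive */
        else
            return range;                /* signal sits comfortably on scale */
    }
}
```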

This desirable feature was thought for a long time to be impossible in hand-held instruments. But once it was realized, it became a virtual requirement. This got many designers thinking about how to give their instruments some real intelligence. They soon pursued ways to make instruments even more useful to the person at the end of the test leads, and incidentally kill off more of their competition in the process. Functions like built-in math, data storage and communication, impedance/dB scaling, signal averaging, and all kinds of triggering and filtering began to appear in small and low-cost test equipment.
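Most of that "built-in math" amounts to small, fixed formulas. For example, dB scaling against a reference impedance works out as follows; the 600-ohm default here is the classic audio/telecom assumption, not any particular meter’s spec:

```c
/* Converting a measured AC voltage to dBm against a reference
 * impedance, the kind of scaling math these meters built in. */
#include <math.h>
#include <stdio.h>

double volts_to_dbm(double v_rms, double ref_ohms)
{
    double p_mw = (v_rms * v_rms / ref_ohms) * 1000.0; /* power in milliwatts */
    return 10.0 * log10(p_mw);                         /* dBm = 10*log10(P/1mW) */
}

int main(void)
{
    /* 0.775 V rms into 600 ohms is 1 mW, i.e., 0 dBm */
    printf("%.2f dBm\n", volts_to_dbm(0.775, 600.0));
    return 0;
}
```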

Large, power-hungry lab instruments from HP (now Agilent Technologies) could do these tricks, and even communicate externally to other systems with an HP interface bus (HPIB). However, the high prices and large sizes were not ideal, and hand-held/portable instrument makers started to enjoy some serious market acceptance.

Fluke benefited handsomely from the change to hand-held digital instruments. So did Asian manufacturers, which flooded the market with low-cost instruments. Their low production costs, easy access to custom liquid crystal displays (LCDs), and good foundry relationships allowed them to move aggressively into the test equipment world.

Meters began to acquire many other features in their basic architecture, once a microcontroller core and some flexible logic were present inside. Features like minimum/maximum capture and readings stored over time began to appear. This transformed hand-held meters into effective data loggers or trend monitors, a powerful weapon for finding elusive intermittent faults.
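Min/max capture and trend storage are almost free once a microcontroller sits in the signal path. A sketch, with the sampling source and the storage depth assumed:

```c
/* Min/max capture plus simple trend logging in a ring buffer,
 * the combination that turns a meter into an intermittent-fault
 * hunter; LOG_DEPTH and the sample source are assumptions. */
#include <float.h>

#define LOG_DEPTH 512           /* assumed on-board storage depth */

static double log_buf[LOG_DEPTH];
static int    log_idx;
static double min_seen = DBL_MAX, max_seen = -DBL_MAX;

void record_sample(double reading)
{
    if (reading < min_seen) min_seen = reading;  /* catch downward glitches */
    if (reading > max_seen) max_seen = reading;  /* catch upward glitches   */
    log_buf[log_idx] = reading;                  /* keep the trend history  */
    log_idx = (log_idx + 1) % LOG_DEPTH;         /* wrap as a ring buffer   */
}
```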

These capabilities, in turn, required some way to offload data, and that led to a low-cost, optically isolated serial interface. This allowed the innocent-looking hand-held meter to suddenly become a full-blown data-acquisition system, with all the computing power behind it you cared to attach. Software running on the host computer allowed data capture and analysis every bit as sophisticated as systems built from high-end lab instruments, HPIB/IEEE-488 interfaces, and hard-to-program controllers. The intelligence revolution was really picking up steam; now you could toss this capability into your toolbox for a few hundred dollars.
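On the host side, capturing those readings can be as simple as listening on a serial port. A hedged POSIX C sketch, where the device path, baud rate, and one-reading-per-line format are assumptions, since every meter family has its own protocol:

```c
/* Read ASCII readings from a meter's optically isolated serial link
 * on a POSIX host; port, speed, and line format are assumed. */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDONLY | O_NOCTTY); /* assumed port */
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfsetispeed(&tio, B9600);            /* assumed 9600 baud            */
    tio.c_cflag |= (CLOCAL | CREAD);     /* local line, receiver enabled */
    tio.c_lflag |= ICANON;               /* assume one reading per line  */
    tcsetattr(fd, TCSANOW, &tio);

    char line[128];
    ssize_t n;
    while ((n = read(fd, line, sizeof line - 1)) > 0) {
        line[n] = '\0';
        printf("reading: %s", line);     /* hand off to logging/analysis */
    }
    close(fd);
    return 0;
}
```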

Some people may remember that European electronics giant Philips was once big in test equipment. It teamed with Fluke in the United States in the mid-’80s to jointly develop and market new instruments. One creation was a rather revolutionary gizmo called a scopemeter.

Other high-end test equipment makers, like Tektronix, have since dabbled in this area, trying to make a digital hand-held scope. But Fluke had a long history of making portable instruments and knew more about what was required. And while the early Fluke 90-series scopemeters were not ideal instruments, especially in terms of their blue-on-green displays, the functionality was astonishing. The high-voltage UL/CSA safety ratings, "auto-set" function, and instrument isolation made them a hit in the industrial/process-control markets. Schools, in particular, loved the Philips/Fluke scopes with "auto-set," as new students had real trouble using scopes with complex triggering and a forest of controls.

The Graphical Multimeter

These early scopemeters soon spawned a new kind of instrument, the graphical multimeter, which pairs a basic waveform display with a large, multiple-parameter data display, a boon to many in the electrical, industrial and automotive businesses. Second-generation instruments in this family, like the Fluke 123 scopemeter, roll all the features of a scope, datalogger, digital multimeter, and data-collection system into one, with a great multimode display. Plus, you can print the display results easily through your computer, a leap forward from that old analog volt-ohmmeter!

While the hand-held world was busy transforming itself, the lab equipment makers noted that they were rapidly losing market share. Wavetek (recently acquired by Fluke) and Tektronix responded by marketing an Asian line of DMMs and similar gear under their labels. Meanwhile, HP created its own products.

It is worth noting that digital instruments project an implied accuracy by their displays. A two-digit instrument implies 1% accuracy, three digits, 0.1%, and so on. Interestingly, many three-digit instruments have much poorer real accuracy, often no better than 0.5% or even worse on AC, making the last digit useless.

Few analog meter-movement instruments can achieve accuracy beyond 0.5%, except in differential/nulling mode. Generally, they sit in the 1%-to-3% range in overall accuracy, but this limitation is well understood.

Unfortunately, many digital instrument users produce lots of data of questionable quality. This is because they do not understand the true accuracy of their equipment and may not realize that their measurements may have far fewer significant figures than are displayed. I recently saw an imported DMM with an AC voltage spec of 0.7% plus several digits, a DC spec of 0.5% plus some digits, and yet it was a 4½-digit meter! An amazing study in measurement misdirection.
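Running the numbers shows why such a spec is misdirection. The 0.7% AC figure comes from the meter above; the 5-count term and the 199.99-volt range are assumed purely for illustration:

```c
/* Worked example of what a "percent of reading + digits" spec means. */
#include <stdio.h>

int main(void)
{
    double reading = 100.00;  /* volts AC, on an assumed 199.99 V range */
    double pct     = 0.7;     /* spec: 0.7% of reading                  */
    double counts  = 5.0;     /* spec: plus 5 digits (counts), assumed  */
    double lsd     = 0.01;    /* one count = 0.01 V on this range       */

    double u = reading * pct / 100.0 + counts * lsd;
    /* prints 100.00 V +/- 0.75 V: the display resolves 0.01 V, yet at
       least the last two displayed digits carry no real information */
    printf("%.2f V +/- %.2f V\n", reading, u);
    return 0;
}
```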

The price erosion in test equipment for a specific level of performance has been significant, even if accuracy is sometimes uncertain. The product changes have been equally telling, with many analog families of equipment disappearing for good, as instruments shift into the digital area to provide the now "required" intelligence and interface capability.

Higher-end instruments have mainly moved to GPIB/IEEE-488 digital interfaces. This allows fully automated test systems to be configured from building-block parts and then integrated and displayed using high-end software tools like National Instruments’ LabVIEW.
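At the programming level, such systems typically talk through a vendor-neutral I/O layer such as the VISA C API, which tools like LabVIEW sit on top of. A minimal sketch, with the GPIB address and SCPI command chosen purely for illustration:

```c
/* Query a GPIB instrument through the VISA C API; the address
 * "GPIB0::22::INSTR" and the SCPI command are assumptions. */
#include <stdio.h>
#include <visa.h>

int main(void)
{
    ViSession rm, dmm;
    double volts;

    if (viOpenDefaultRM(&rm) < VI_SUCCESS) return 1;
    /* assumed: instrument at GPIB primary address 22 */
    if (viOpen(rm, "GPIB0::22::INSTR", VI_NULL, VI_NULL, &dmm) < VI_SUCCESS)
        return 1;

    viPrintf(dmm, "MEAS:VOLT:DC?\n");  /* standard SCPI measurement query */
    viScanf(dmm, "%lf", &volts);       /* parse the ASCII reply           */
    printf("%f V\n", volts);

    viClose(dmm);
    viClose(rm);
    return 0;
}
```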

Basic instrument architectures have changed, and many new instruments exist only "virtually" in software, appearing as whatever display you select. In this family of instruments, VXI and CompactPCI cards allow almost anything to be built up from modules and spliced together in software. And the final-test/instrumentation world is moving steadily toward a more "connected" model and farther away from stand-alone instruments.

Most recently, this trend has extended out of the lab and onto the Internet, allowing construction of systems that can be observed or controlled remotely, from half the planet away. New Internet-protocol-aware measurement cores from companies like IOTech allow all kinds of measurements to be displayed on the computer screens of thousands of users around the world. Such technology can take everything from weather data to production test results and make them appear live as Web pages, accessible to anyone.
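One minimal way to get a reading onto the Web is simply to rewrite an HTML file on a fixed interval and let an ordinary Web server publish it. The file path, interval, and data source in this sketch are all assumptions, not IOTech’s actual mechanism:

```c
/* Periodically rewrite an HTML snapshot of the latest reading for
 * an ordinary Web server to publish; everything here is assumed. */
#include <stdio.h>
#include <unistd.h>

static double latest_reading(void)     /* stand-in for a real acquisition hook */
{
    return 13.8;                       /* pretend volts */
}

int main(void)
{
    for (;;) {
        FILE *f = fopen("reading.html", "w");  /* assumed server docroot file */
        if (f) {
            fprintf(f, "<html><body><h1>Latest reading: %.3f V</h1></body></html>\n",
                    latest_reading());
            fclose(f);
        }
        sleep(10);                     /* refresh every 10 seconds */
    }
}
```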

This is a long way from that analog meter or Wheatstone bridge your dad used, and possibly far from what is on your desk right now. But the change we see today is mainly in how the resulting measurement data is delivered. The concepts of sound metrology, interconnect, calibration and correct interpretation are still with us, and are even more important when the added problems of digital data extracted from an analog universe are considered. Amazingly, moving to the digital domain and making instruments smarter and more precise often shows just how poorly we actually do the task we set out to do. A sobering thought.

But a smart instrument is not a substitute for a smart user. It’s just there to make your task a little easier.

For additional information, visit www.fluke.com, www.tektronix.com, www.ni.com (National Instruments), www.iotech.com, and www.agilent.com.
