Welcome to the new Instron Community Blog. It brings together the freshest, brightest, and most talented minds that Instron has to offer. The world of materials science is so vast, and encompasses such a broad range of industries, materials, and challenges, that no one person can possibly possess all the knowledge required to be the resident expert – or master of materials science. It takes a small army behind the scenes, collaborating and sharing technical know-how, experiences, and ideas, to present the most accurate, relevant, and timely information to you – our readers.

We invite you to tell us who you are, share your stories and talk about your experiences. Join the Instron Community.

Thursday, July 26, 2012

Data Rate and Bandwidth

Performing tests with the appropriate data rate and bandwidth is critical to obtaining accurate and meaningful data.

The data acquisition rate of a computer or data acquisition system is the speed at which raw data points are captured. That rate should be based on the speed at which the incoming signal is changing: if the incoming signal is not changing quickly, high data rates simply capture excessive data, producing large files and wasting disk space.
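
To put rough numbers on that (the figures below are arbitrary and only meant to illustrate the scaling, not to suggest settings), a quick back-of-the-envelope calculation in Python might look like this:

    # Illustrative only: how raw data volume grows with the acquisition rate.
    test_duration_s = 600        # a 10-minute test (assumed)
    channels = 3                 # e.g. load, extension, strain (assumed)
    bytes_per_sample = 8         # one 64-bit value per channel per point (assumed)

    for rate_hz in (10, 100, 1_000, 10_000):
        points = rate_hz * test_duration_s * channels
        size_mb = points * bytes_per_sample / 1e6
        print(f"{rate_hz:>6} Hz -> {points:>10,} points (~{size_mb:.1f} MB)")

A slow-changing signal sampled at 10,000 Hz produces a thousand times the file size of the same signal sampled at 10 Hz, with essentially no additional information.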

The following variables should be considered in the data rate discussion:

  1. Actual signal being measured
  2. Bandwidth of the signal conditioner (filtering)
  3. Data acquisition rate

Actual Signal

One of the most critical aspects of proper measurement is to understand the rate at which things occur during a test. For example, when testing composites, the signal contains short, sharp peaks, indicating that the data is changing quickly, whereas tensile tests on plastics typically do not show such high-frequency changes.


Bandwidth

In order to capture these actual events properly, the correct frequency bandwidth for the signal conditioning needs to be determined.

Bandwidth can be loosely defined as the frequency above which signal changes are not measured; those changes are filtered out and flattened. For example, it is not possible to measure 100 Hz peaks with a 10 Hz signal conditioner; the peaks will be invisible.
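
One way to convince yourself of this is with a quick simulation. The sketch below (using SciPy, with an assumed 5 kHz raw sampling rate and arbitrary amplitudes) passes a short, roughly 100 Hz peak through a simulated 10 Hz low-pass filter, and most of the peak is flattened away:

    # Illustrative sketch: a 10 Hz filter flattens a ~100 Hz peak (arbitrary signal).
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 5000                                # raw sampling rate, Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)
    signal = np.ones_like(t)                 # steady baseline (arbitrary units)
    burst = (t > 0.50) & (t < 0.51)          # a ~10 ms event, i.e. a ~100 Hz feature
    signal[burst] += 1.0                     # sharp peak riding on the baseline

    b, a = butter(2, 10 / (fs / 2))          # 2nd-order low-pass, 10 Hz cutoff
    filtered = filtfilt(b, a, signal)

    print("raw peak height above baseline:     ", round(signal.max() - 1.0, 2))    # 1.0
    print("filtered peak height above baseline:", round(filtered.max() - 1.0, 2))  # small fraction of the original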


Data Acquisition Rate

The ideal data acquisition rate is a function of the signal conditioning bandwidth, which should be matched to the rate of change of the actual event. A rule of thumb is that a data rate more than 10 times the signal conditioning unit bandwidth produces little more than wasted disk space, because the same data is being sampled over and over.
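
As a rough illustration of that rule of thumb (the bandwidth and rates below are assumed values, not recommendations):

    # Illustrative: data rates far beyond ~10x the conditioner bandwidth mostly
    # re-sample the same filtered value over and over.
    bandwidth_hz = 100                         # signal conditioner bandwidth (assumed)
    suggested_max_rate_hz = 10 * bandwidth_hz  # rule-of-thumb ceiling: ~1000 Hz here

    for data_rate_hz in (500, 1_000, 5_000, 50_000):
        ratio = data_rate_hz / bandwidth_hz
        note = "largely redundant samples" if ratio > 10 else "within the guideline"
        print(f"{data_rate_hz:>6} Hz ({ratio:>4.0f}x bandwidth): {note}")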

For a complete review of data rate, signal conditioning, noise filtering and how they affect mechanical testing results, consult ASTM Standard Guide E 1942.

4 comments:

Lorenzo M. said...

Ah yes, great subject! Perhaps another way to approach it is to consider the Actual Signal described above as the "Event" that one wishes to measure. E1942 goes into some detail about defining the event (usually a peak of some sort) and how to accurately measure it. Some simple guidelines might go like this: think of the event one wants to capture (e.g. a peak). Try to come up with a duration of this event (how fast, what frequency, etc.). Then decide the accuracy to which one wants to measure that event - this will determine the bandwidth of the signal conditioning required. Then, 10x the bandwidth is a good guideline for the maximum necessary data rate. In most cases, the events in tensile tests happen slowly and general settings are fine. For tests lasting only a few seconds or less, taking care to consider these items is good practice!
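
A back-of-the-envelope version of that chain, with purely illustrative numbers (the 2x margin on bandwidth is an assumption, not a figure from E1942), might look like this:

    # Illustrative: from event duration to a suggested data rate.
    event_duration_s = 0.02                  # say the peak of interest lasts ~20 ms (assumed)
    event_freq_hz = 1 / event_duration_s     # treat it as a ~50 Hz feature
    bandwidth_hz = 2 * event_freq_hz         # conditioner bandwidth comfortably above it (assumed margin)
    max_useful_rate_hz = 10 * bandwidth_hz   # 10x-bandwidth guideline from the post

    print(f"event ~{event_freq_hz:.0f} Hz -> bandwidth ~{bandwidth_hz:.0f} Hz "
          f"-> data rate up to ~{max_useful_rate_hz:.0f} Hz")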

Frank Lio said...

Beware that more does not necessarily mean better. Too high a bandwidth buys you nothing. In fact, you may be recording noise!

Sergio said...

In the electronics field, where data rate and bandwidth are always considered, there is a technique called "oversampling". If data is processed correctly using this technique, one can analyze signals below the noise floor of the system, or signals that are not evident in the time domain but are evident in the frequency domain by means of an FFT. In this case, capturing more points does not necessarily mean wasted space on your hard drive, but very valuable data.
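
As a minimal numerical sketch of that idea (purely illustrative, using NumPy): a small 50 Hz tone buried in noise roughly ten times larger is invisible point by point, but stands out clearly in the frequency domain of a long, oversampled record.

    # Illustrative: a tone below the noise floor shows up in an FFT.
    import numpy as np

    fs = 10_000                                   # oversampled rate, Hz (assumed)
    t = np.arange(0, 4.0, 1 / fs)
    rng = np.random.default_rng(0)
    tone = 0.1 * np.sin(2 * np.pi * 50 * t)       # small 50 Hz signal
    noise = 1.0 * rng.standard_normal(t.size)     # noise roughly 10x larger than the tone
    x = tone + noise

    spectrum = np.abs(np.fft.rfft(x)) / t.size
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    peak_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

    print(f"strongest non-DC component near {peak_freq:.1f} Hz")  # expect ~50 Hz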

Frank L said...

Good point, Sergio. Oversampling should be used to some degree for anti-aliasing, improved resolution, noise reduction/cancellation, etc. The "Garbage In, Garbage Out" saying still applies. It's about finding the right balance between real-time data acquisition, CPU processing time, storage, etc. Quoting E 1856 – 97 (2002), Standard Guide for Evaluating Computerized Data Acquisition Systems Used to Acquire Data from Universal Testing Machines, section X2.4, Required Bandwidth: "an excessively high bandwidth can be detrimental to the system's performance as accuracy may degrade and noise may increase… For testing under conditions similar to those found in Test Methods E 8, a bandwidth of at least 0.2 Hz is usually sufficient." Section X2.7, Determining Required Data Acquisition Rate, mentions that "The minimum required data acquisition rate can be estimated from the required bandwidth. It can be shown that for less than 1 % error, a minimum data acquisition rate of about 31 times the required bandwidth (samples/s) is sufficient."

One problem is that some instrument suppliers simply switch to a very high bandwidth for fast tests and select the highest recorded transducer point as the peak – even though that can be an extraneous noise point. Meanwhile, the actual test curve is so noisy that a modulus cannot be determined.