Calibration is the Key to Successful Leak Detection

Posted by Portable Leak Detectors on Sep 13th 2021

A common leak detection method uses a sniffer leak detector to locate leaks in a part or system pressurized with a tracer gas (typically helium). While very useful for pinpointing the location of a leak, sniffer detectors are less reliable at quantifying leaks than other methods. Furthermore, their sensitivity is much lower than that of hard vacuum helium leak detection, and it varies among different types of detectors. However, with proper calibration, the sniffer leak detector can be an effective tool for both production and maintenance leak detection applications.

Sniffer leak detectors draw a small amount of air through the sniffer probe by means of a pump and then measure the concentration of the tracer gas in the drawn sample. Calibration of the sniffer gives the operator confidence by providing a relative comparison of the displayed leak rate to an actual leak value, and by proving that the detector is sensitive enough to detect a particular leak rate. Calibration can also help illustrate correct techniques for maximizing sensitivity and success in locating existing leaks (i.e., how close the probe needs to be to the leak, at what speed the probe should pass over a leak in order to detect it, or the detector response time). Calibration of a sniffer leak detector is facilitated by a calibrated leak standard, a device that produces a measured flow of gas simulating an actual leak.

Normally the leak rate value of the calibrated leak standard is at or near the minimum desired sensitivity or reject level. To perform a sniffer calibration, the background tracer gas concentration should be measured and verified to be below the leak rate value of the calibrated leak. This is done by taking a reading of the ambient air. If ambient tracer gas contamination is suspected, samples should be taken in another location. If it is determined that the test area is contaminated with tracer gas, the area should be purged prior to calibration. Following verification of the tracer gas background, the sniffer probe is inserted into the outlet of the leak standard and left there until the reading from the sniffer stabilizes. The displayed reading is compared to the leak rate value of the leak standard, from which a quantifying relationship can be established for future readings (the leak detector may or may not display a value close to the actual leak value it is reading).
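As a rough illustration of that sequence, the sketch below checks the tracer gas background and waits for the reading at the leak standard outlet to stabilize. All of the numbers, tolerances, and names are hypothetical placeholders, not values from any particular instrument.

```python
# Hypothetical values for illustration, in the same units the detector displays
leak_standard_value = 5.0      # known value of the calibrated leak standard
background_reading = 0.3       # reading taken in ambient air, away from the standard

# The tracer gas background must be below the leak standard value;
# otherwise purge the test area and re-measure before calibrating.
if background_reading >= leak_standard_value:
    raise RuntimeError("Tracer gas background too high -- purge the area and retry")

# With the probe at the leak standard outlet, wait until readings settle.
# Here "stable" means the last three samples agree within a small tolerance.
samples = [3.1, 4.4, 4.9, 5.1, 5.2, 5.2]   # successive sniffer readings
tolerance = 0.1
if max(samples[-3:]) - min(samples[-3:]) > tolerance:
    raise RuntimeError("Reading has not stabilized -- keep the probe in place")

displayed_reading = samples[-1]
print(f"Displayed {displayed_reading} vs. leak standard value {leak_standard_value}")
```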

Some instruments allow the user to enter a calibration factor that will adjust the display to match the reading from the calibrated leak standard. Additionally, calibrating will determine whether the sniffer is capable of detecting the size of leak simulated by the leak standard. Many sniffer leak detectors display readings in units of concentration (parts per million, for example), whereas most leak standards are specified by leak or flow rate (typically atmcc/sec). Concentration can be correlated to leak/flow rate if the pumping speed of the sniffer line of the detector is known.
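If the instrument accepts such a correction factor, it can be derived from the displayed reading and the known leak value. A minimal sketch, assuming the factor is simply the ratio of the leak standard value to the displayed value; the example numbers are hypothetical:

```python
def calibration_factor(leak_standard_value: float, displayed_reading: float) -> float:
    """Ratio that scales displayed readings to estimated actual leak values."""
    return leak_standard_value / displayed_reading

# Example: the detector displays 5.2 while sniffing a 5.0 leak standard
factor = calibration_factor(5.0, 5.2)

# A later reading on a test part can then be scaled to an estimated leak value
estimated_leak = 3.8 * factor
print(f"Calibration factor {factor:.3f}; estimated leak {estimated_leak:.2f}")
```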

TO CORRELATE PARTS PER MILLION (PPM) TO ATMCC/SEC:

atmcc/sec = ppm × 10⁻⁶ × S
where S is the pumping speed of the detector in cc/sec
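In code, the correlation above is a one-line conversion. The pumping speed used below is only a placeholder; the real value must come from the detector's specifications.

```python
def ppm_to_atmcc_per_sec(concentration_ppm: float, pumping_speed_cc_per_sec: float) -> float:
    """Convert a displayed tracer gas concentration (ppm) to a leak rate (atmcc/sec)."""
    return concentration_ppm * 1e-6 * pumping_speed_cc_per_sec

# Example: 50 ppm displayed, with a sniffer line pumping speed of 60 cc/sec
leak_rate = ppm_to_atmcc_per_sec(50, 60)   # 3.0e-3 atmcc/sec
print(f"Equivalent leak rate: {leak_rate:.1e} atmcc/sec")
```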

Calibration will be most successful when the amount of gas from the leak standard that enters the probe is maximized without impeding normal air flow into the probe. For this reason, an appropriate outlet on the leak standard is important. An outlet that allows leaked gas to escape beyond the probe, or that does not allow proper air flow, will result in false readings and make the calibration less effective. Also, the sniffer probe should be very near the actual gas exit point of the leak standard so that the tracer gas does not become too diluted by the surrounding air. Other factors that could negatively influence a sniffer calibration are turbulent air surroundings (such as in front of a fan) or surrounding air that is heavily contaminated with the tracer gas. Manufacturer-recommended practices specified in the instrument user manual (such as zeroing the baseline reading) should also be observed.

Calibration should be performed regularly during any leak detection process to verify the integrity of the process. A recommended practice is to calibrate the instrument prior to testing a system or at the beginning of the production shift, and then to verify calibration when testing is complete.