DEPARTMENT OF HEALTH, EDUCATION, AND WELFARE
PUBLIC HEALTH SERVICE
FOOD AND DRUG ADMINISTRATION
Related Program Areas:
ITG SUBJECT: HERMETICALLY SEALED ELECTRONIC COMPONENT LEAK DETECTION
The leak detection system discussed in this ITG is a mass spectrometer leak detector tuned to detect small quantities of helium. It is utilized by pacemaker and pacemaker electronic component manufacturers to test electronic components (integrated circuits, transistors, capacitors, etc.) for hermeticity. The system is typically portable (on casters) or bench mounted and operates from a 115 volt, 60 Hertz power source. The system contains one or more vacuum pumps, a magnetic mass spectrometer and auxiliary components necessary for proper operation.
Industrial users employ mass spectrometer leak detection on objects of all sizes, from miniature components to large systems, and various detection methods are used. The most popular leak detection method used by pacemaker manufacturers to leak test electronic components is the bell jar or hood method using helium as the tracer gas. In this method, the test object is placed in a pressure chamber and the chamber is filled with commercially pure helium at a specified pressure. The test object is held in the pressurized chamber (soaked) for a specified time (bomb time). Pressure and bomb time vary according to the test specification used. Typically, bomb pressure is four atmospheres minimum and bomb time is one to four hours.[2,3,4] If there is an opening in the test object, the pressurized helium will be forced through the opening into the test object. The chamber pressure is then released and the test object is transferred to the leak detector hood or bell jar. The bell jar interior is connected through a valve to the leak detector mass spectrometer tube. Transfer time from chamber to bell jar should be kept to a minimum to prevent loss of helium from the test object. Military and industrial specifications typically allow a maximum of 30 minutes between removal from the pressure chamber and detection.[2,3,4]
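As a rough illustration, the typical bomb-test parameters quoted above (four atmospheres minimum bomb pressure, one to four hours bomb time, and at most 30 minutes from chamber removal to detection) can be sketched as a simple parameter check. The function name and the use of these particular thresholds are illustrative only; actual limits come from the governing test specification.

```python
# Illustrative sketch: check helium bomb-test parameters against the
# typical values quoted in the text. Thresholds are not from any
# specific standard clause.

def bomb_test_within_typical_limits(pressure_atm, bomb_time_h, transfer_min):
    """Return True if all parameters fall inside the typical ranges."""
    return (
        pressure_atm >= 4.0            # minimum bomb pressure, atmospheres
        and 1.0 <= bomb_time_h <= 4.0  # soak (bomb) time, hours
        and transfer_min <= 30.0       # chamber-to-detector transfer, minutes
    )

print(bomb_test_within_typical_limits(4.0, 2.0, 15.0))  # True
print(bomb_test_within_typical_limits(3.0, 2.0, 15.0))  # False: pressure too low
```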
When the detector test cycle is initiated, the test station automatically evacuates the free volume under the test dome to a vacuum level compatible with the interior of the mass spectrometer. The valve then opens allowing any tracer gas leaking from the test object to enter the detector spectrometer tube.
Operation of the system's mass spectrometer is similar to standard mass spectrometer operation. The gases entering the spectrometer tube, such as nitrogen, oxygen, carbon dioxide and helium (if a leak exists), are ionized by an electron beam. The spectrometer magnetic field separates the resulting ions according to mass. The helium leak detector magnetic field is arranged so that only helium ions have the right mass to reach the detector. As the helium ions strike the detector, a minute current is generated. This current is amplified, and the amplified signal (which is proportional to the amount of helium in the tube) appears as a visual leak rate indication on the leak indicator meter.[6] The leak rate indicated on the detector meter is the equivalent air leak rate.
The measured leak rate is the quantity of gas in cubic centimeters that flows through an aperture or porous wall in one second, as determined under specified conditions.[5] It is assumed that the gas is air at room temperature and at one atmosphere of pressure on the high-pressure side of the leak, and that the low-pressure side (vacuum) has a negligible effect on the flow rate. Leak rate is commonly given in units of atmosphere-cc/sec. The air leak rate can be converted roughly to a helium leak rate by multiplying: air leak rate x 2.8 = helium leak rate. The helium leak detector discussed here is used to detect leak rates in the 10^-4 to 10^-10 atm-cc/sec range (fine leaks), although units are now being developed to detect gross leaks, 10^-4 atm-cc/sec or greater. There is presently no technical basis for maximum allowable leak rate specifications, because no data are presently available that relate leak rate to component life.[1] Acceptable leak rates may vary as the internal free volume of the test object varies.
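The rough air-to-helium conversion above can be expressed directly; the function name is illustrative, and the factor of 2.8 is the approximate value given in the text.

```python
# Rough conversion from equivalent-air leak rate to helium leak rate,
# using the approximate factor of 2.8 given in the text.

AIR_TO_HELIUM_FACTOR = 2.8  # approximate conversion factor

def helium_leak_rate(air_leak_rate_atm_cc_s):
    """Convert an equivalent-air leak rate (atm-cc/sec) to a helium leak rate."""
    return air_leak_rate_atm_cc_s * AIR_TO_HELIUM_FACTOR

# An equivalent-air leak rate of 1e-8 atm-cc/sec corresponds to roughly
# 2.8e-8 atm-cc/sec of helium.
rate = helium_leak_rate(1e-8)
```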
The sensitivity of the described system is such that gross leaks, i.e., leaks with flow rates of 10^-4 atm-cc/sec or greater, cannot be accurately detected.[6] Also, the tracer gas would be removed from test objects with gross leaks when the system evacuated the bell jar, and little or no helium would reach the spectrometer tube. Some helium is removed from objects with fine leaks during evacuation, but the evacuation time is short (typically three seconds) and detector manufacturers state that enough helium remains for detection purposes. This claim is questionable; bomb time and pressure clearly become important here. It is also apparent that test dome or bell jar volume should be kept small to decrease evacuation time.
For leaks of 10^-4 atm-cc/sec and larger (gross leaks), a bubble test is commonly utilized. The test object is soaked in a pressurized chamber filled with helium or another gas (as done for fine leak testing) and is then immersed in silicone oil, mineral oil, or a fluorocarbon liquid and observed for bubbles emanating from the object.[3] Gross leak testing should not be performed before the fine leak test, as there is a possibility that the test liquid could temporarily plug a fine leak.
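The fine/gross boundary described above (gross at 10^-4 atm-cc/sec and larger; fine from 10^-4 down to 10^-10, the helium detector's stated range) can be sketched as a simple classifier. The function and category labels are illustrative, not terminology from the cited standards.

```python
# Illustrative sketch: classify a measured leak rate using the ranges
# given in the text. Labels are for illustration only.

def classify_leak(rate_atm_cc_s):
    """Classify a leak rate (atm-cc/sec) as gross, fine, or below detection."""
    if rate_atm_cc_s >= 1e-4:
        return "gross"            # bubble-test territory, not the helium detector
    elif rate_atm_cc_s >= 1e-10:
        return "fine"             # within the helium leak detector's range
    return "below detection"      # smaller than the detector's stated range

print(classify_leak(1e-3))   # gross
print(classify_leak(1e-7))   # fine
print(classify_leak(1e-12))  # below detection
```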
Calibration of the helium leak detector is presently accomplished using a calibrated helium leak. The calibrated leak is typically in the form of a cylinder charged with helium at atmospheric pressure. The cylinder contains a filter through which helium exits at a fixed calibrated rate when the cylinder valve is opened. The temperature at which the leak was calibrated is marked on the cylinder (typically 22-23 °C); the calibrated helium leak cylinder should be at this temperature when calibrating the system, or a temperature compensation factor should be provided and used in calculating the test object leak rate. The actual accuracy of calibrated leaks is questionable due to the lack of standardization in calibration methods and disagreement between different calibration laboratories as to the accuracy of the calibrated leak rate. When using the calibrated leak to set the sensitivity of the helium leak detector, the detector meter is set for direct readout at the air leak rate figure marked on the calibrated leak cylinder.
Radioisotope and weighing are two other methods used for leak detection. Radioisotope leak testing is generally felt to be a better leak testing method than helium leak detection. In this method, the test object is soaked in a pressurized chamber of radioactive gas. The object is then removed, and the gamma ray emissions penetrating the walls of the object are counted, thereby measuring the amount of radioactive tracer gas trapped within the leaking object. An Atomic Energy Commission license is necessary for possession and use of radioisotope test equipment, and manufacturers are reluctant to use this method. Radioisotope leak testing will be covered in more detail in a future ITG.
A weight test method is also used in which the test object is weighed before and after being pressurized in a test liquid, or before and after an extended time period.
The detection of loss of component package integrity is important because the entrance of damaging contaminants will reduce the component's effective life. Water vapor, both sealed in during manufacture and that which leaks into the package, is a contaminant of major concern. Sealed-in moisture may result from improper or inadequate handling or processing of materials. For example, the walls of ceramic packages are a sink for moisture, which may later serve as a moisture source after sealing.[1] Glass, epoxies, shellacs and polyimides are also sinks. A proper bake-out period and subsequent sealing in a moisture-free environment can minimize moisture in these areas.
Hermetically sealed components are typically evacuated or sealed in a dry nitrogen atmosphere. Some component manufacturers are now including helium as part of the component's internal atmosphere to facilitate leak testing. The tubing through which the sealing gas passes may emit moisture and contaminate the sealing gas, and sufficient moisture may penetrate the tubing to contaminate the package. Dynamic flow conditions should exist to minimize moisture. Sealed-in moisture may be sufficient to block fine leaks so that they are not detected.[1]
To minimize leaks in the package hermetic seals, care should be taken in controlling sealing materials. For example, sealing material additives designed to adjust the thermal expansion coefficient between metal and glass may be contaminated or not in proper balance.[1] Inadequate control over the mechanical handling of packages can also result in degradation of the package seal.
There are presently a number of test specifications in use, both military and commercial, for performing helium fine leak detection and gross leak detection. The most popular appear to be Military Standards 883, 750, and 202 (MIL-STD-883, MIL-STD-750 and MIL-STD-202) and the American Society for Testing and Materials (ASTM) F/34-72T specification. Results obtained from the individual test procedures have been shown to vary in some areas.[1] In the absence of a standardized, widely accepted test method, it is important that device manufacturers a) know the scientific capabilities of the methods being used; b) conform to their own stated procedures and specifications; and c) properly calibrate and maintain their equipment.
References:
1. RPA/NBS Workshop II, Hermeticity Testing for Integrated Circuits.
2. Military Standard 883 (MIL-STD-883).
3. Military Standard 750 (MIL-STD-750).
4. Military Standard 202 (MIL-STD-202).
5. ASTM F/34-72T.
6. Veeco Instruments, Inc., Model MS-17 Mass Spectrometer Leak Test Stations Operation and Maintenance Manual.