The following discussion provides definitions, explanations, limitations, and examples of metrology terms as they relate to DeFelsko's coating thickness gauges. The resources used to develop this literature mainly include technical articles and certification standards published by international organizations such as ISO, ANSI, and ASTM. The aim is to develop a common vocabulary across DeFelsko documentation, including literature, manuals, technical articles, communications, and web materials.

Type 1: Pull-off gauge
A Type 1 pull-off gauge (such as the PosiTest or PosiPen) uses a permanent magnet that is brought into direct contact with the coated surface. The force required to pull the magnet off the surface is measured and interpreted as a coating thickness value on a scale or display on the gauge. The magnetic force holding the magnet to the surface varies as a nonlinear function of the magnet's distance from the steel, i.e., of the dry coating thickness. Less force is required to pull the magnet off a thicker coating.
Type 2: Electronic gauge
A Type 2 electronic gauge (such as the PosiTector) uses electronic circuitry to convert a reference signal into coating thickness. Electronic ferrous gauges use one of two magnetic principles. Some use a permanent magnet whose magnetic flux density at the pole face increases as it is brought near steel. Coating thickness is determined by measuring this change in flux density, which varies inversely with the distance between the magnet and the steel substrate. Hall elements and magnetoresistive elements placed at the pole face are common means of measuring changes in flux density. Because the response of these elements is temperature dependent, temperature compensation is required.
Other ferrous electronic gauges operate on the principle of electromagnetic induction. A coil containing a soft iron rod is energized with alternating current, producing a changing magnetic field at the probe. As with a permanent magnet, the magnetic flux density within the rod increases as the probe approaches the steel substrate. This change is detected by a second coil, whose output is related to the coating thickness. Because the coil parameters are also temperature dependent, these gauges likewise require temperature compensation.
Characterization
Characterization is the process of associating the signal received by an instrument from its probe tip with the actual coating thickness measurement. The result of the characterization process is a calibration curve built into the instrument. Depending on the complexity of the curve, it can also include other effects, such as allowances for ambient temperature.
Each DeFelsko instrument is individually characterized using calibration standards covering the instrument's full measuring range. It is this process that allows a DeFelsko instrument to take meaningful measurements straight out of the box for most applications.
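As a rough sketch of how a stored calibration curve might map a probe signal onto thickness, the following example interpolates between characterization points. The signal values, thickness points, and the piecewise-linear form are all invented for illustration; actual instrument curves (and any temperature compensation) are proprietary and more sophisticated.

```python
# Hypothetical characterization data: probe signal (arbitrary units)
# measured on standards of known thickness (µm). All values invented.
SIGNAL = [0.95, 0.71, 0.52, 0.38, 0.27]      # signal falls as coating thickens
THICKNESS = [0.0, 25.0, 50.0, 100.0, 200.0]  # µm

def signal_to_thickness(s: float) -> float:
    """Piecewise-linear interpolation of the stored calibration curve."""
    pts = sorted(zip(SIGNAL, THICKNESS))     # sort by ascending signal
    if s <= pts[0][0]:
        return pts[0][1]
    if s >= pts[-1][0]:
        return pts[-1][1]
    for (s0, t0), (s1, t1) in zip(pts, pts[1:]):
        if s0 <= s <= s1:
            return t0 + (t1 - t0) * (s - s0) / (s1 - s0)

print(signal_to_thickness(0.615))  # midway between the 50 µm and 25 µm points
```

In a real instrument the curve is built from many more points per probe, which is why each gauge is characterized individually.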
Reference standards
A reference standard is a specimen of known thickness that the user measures to verify the accuracy of a gauge. Common reference standards are coating thickness standards and shims. If the parties agree, a sample part of known (or agreed-upon) thickness may serve as the thickness reference for a particular job.
Coating thickness standard
For most instruments, a coating thickness standard is typically a smooth metal substrate with a non-magnetic coating (epoxy) of thickness traceable to a national standard (NIST). The substrate is ferrous (steel) for magnetic gauges or non-ferrous (aluminum) for eddy-current gauges. High-tolerance coating thickness standards are used to characterize and calibrate gages as part of the manufacturing process. Customers can purchase these same standards for use as calibration standards in a calibration laboratory or as check standards in the field or shop.
The coating thickness standards used with ultrasonic gauges are solid plastic (polystyrene) blocks machined to a flat, smooth surface. In addition to a known thickness traceable to national standards, these standards also have a known speed of sound.
Calibration standard sets are available as accessories to help meet the growing number of customer requirements under ISO/QS-9000 and internal quality programs. Many customers find it more practical to verify gauge calibration in-house than to use DeFelsko's calibration services. For their convenience, calibration standard sets are available with a selection of nominal values covering the full range of each DeFelsko gauge. All standards are supplied with a Certificate of Calibration showing traceability to NIST.
Shim
A shim is a flat, non-magnetic (plastic) piece of known thickness. While shims can typically conform to the shape of the substrate being measured, their tolerances are wider than those of coating thickness standards. Therefore, when using a shim to make a calibration adjustment to a Type 2 (electronic) gage, it is important to combine the tolerance of the shim with the tolerance of the gage before determining the expected measurement accuracy.
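One common way to combine two independent tolerances is in quadrature (root-sum-square). RSS is a general metrology convention assumed here, not a method this text specifies, and the numeric tolerances below are invented:

```python
import math

def combined_tolerance(gage_tol_um: float, shim_tol_um: float) -> float:
    """Combine two independent tolerances by root-sum-square."""
    return math.sqrt(gage_tol_um ** 2 + shim_tol_um ** 2)

# Example: a gage rated ±2 µm adjusted with a shim rated ±5 µm
print(round(combined_tolerance(2.0, 5.0), 2))  # ≈ ±5.39 µm overall
```

Note that the combined figure is dominated by the wider shim tolerance, which is why shims limit achievable accuracy more than coating thickness standards do.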
Shims are not recommended for use with Type 1 (mechanical pull-off) gauges. Shims are usually somewhat rigid and curved, and do not lie completely flat even on a smooth steel test surface. As the pull-off point is approached with a mechanical gauge, the shim often springs back from the steel surface, causing the magnet to lift prematurely and producing false readings.
Calibration
Calibration is the controlled and documented process of measuring traceable calibration standards and verifying that the results are within the gauge's stated accuracy. Calibration is typically performed by the gage manufacturer or by a qualified laboratory in a controlled environment using a documented process. The coating thickness standards used in calibration must be chosen so that the combined uncertainty of the resulting measurement is less than the gage's stated accuracy.
Calibration interval
A calibration interval is the established period between recalibrations of an instrument. As permitted by ISO 17025, DeFelsko does not assign calibration intervals on the certificates issued with its PosiPen, PosiTest, PosiTector 6000, and PosiTector 100 coating thickness gauges.
For customers seeking assistance in developing their own calibration intervals, we share the following experience. Factors unrelated to shelf life are the most critical when determining calibration intervals. These factors are primarily the frequency of use, the application involved, and the level of care taken during use, handling, and storage. For example, a customer who frequently measures on abrasive surfaces, or who handles the gauge roughly (for example, dropping it, not replacing the probe cover, or leaving it out of its storage case), may require a relatively short calibration interval. Both theoretical analysis and practical experience show that the influence of temperature and humidity on the gauge is very small. In addition, the manufacturing process is designed to minimize changes in measurement performance after calibration. Even where drift occurs, it is usually linear and can therefore be compensated for by adjustment before use.
While DeFelsko recommends that customers establish calibration intervals based on their own experience and working environment, customer feedback suggests that one year serves as a typical starting point. In addition, our experience has shown that customers who purchase a new instrument can safely use the instrument purchase date as the beginning of their first calibration interval. The minimal impact of shelf life minimizes the importance of the actual calibration certificate date.
Calibration certificate
A calibration certificate is a document that records the actual measurement results and all other relevant information from a successful instrument calibration. DeFelsko includes a calibration certificate indicating traceability to national standards with each new, recalibrated, or repaired instrument.
Traceability
Traceability is the ability to relate a measurement result, through an unbroken chain of comparisons, back to a stated international or national standard that is generally accepted as correct. The chain typically consists of several intermediate measurement standards, each with higher accuracy and lower uncertainty than the standards that follow it.
Recalibration (Certification)
Recalibration, also known as certification, is the calibration of an instrument already in service. Periodic recalibration is required over the life of the instrument because the probe surface is subject to wear, which may affect the linearity of its measurements.
In theory, with reference standards of known thickness and the calibration procedures available from DeFelsko's website, customers could recalibrate their own gauges. In practice, customers are limited by their own quality system requirements and by their ability to control the calibration conditions.
Verification (Calibration Verification)
Calibration verification is an accuracy check performed by the instrument user against known reference standards spanning the expected range of coating thicknesses. The purpose of this process is to verify that the gage is still operating as expected.
Verification is usually performed at the beginning or end of a work shift, before taking critical measurements, after an instrument has been dropped or damaged, or whenever erroneous readings are suspected, to guard against measuring with an inaccurate gage. If the parties deem it appropriate, they may agree in advance on the details and frequency of accuracy verification. If the readings do not agree with the reference standard, all measurements taken since the last successful accuracy check are suspect. In the event of physical damage, wear, heavy use, or after the established calibration interval has elapsed, the gage should be removed from service and returned to the manufacturer for repair or calibration. Verification against check standards is not a substitute for regular calibration, but it can catch an instrument that has gone out of tolerance between two formal certifications.
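A verification check reduces to a simple comparison: does the gage reading on the reference standard fall within the allowed deviation? The sketch below assumes a simple sum of the gage and standard tolerances as the acceptance band (a conservative assumption; a quality plan may specify a different combination), with invented numbers:

```python
def passes_verification(reading_um: float, nominal_um: float,
                        gage_tol_um: float, standard_tol_um: float) -> bool:
    """Accept if the reading deviates from the standard's nominal value
    by no more than the sum of the two stated tolerances."""
    return abs(reading_um - nominal_um) <= gage_tol_um + standard_tol_um

# Gage (±2 µm) reads 126 µm on a 125 µm ±1 µm reference standard: pass.
print(passes_verification(126.0, 125.0, 2.0, 1.0))  # True
# A 130 µm reading on the same standard would fail the check.
print(passes_verification(130.0, 125.0, 2.0, 1.0))  # False
```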
Calibration Adjustment (Adjustment, Optimization)
Calibration adjustment is the aligning of a gage's thickness readings (removal of bias) to match those of a known sample, in order to improve the gage's accuracy on a specific surface or within a specific portion of its measurement range.
In most cases, it is sufficient to check zero on the uncoated substrate and begin measuring. However, substrate properties (composition, magnetism, shape, roughness, edge effects), coating properties (composition, surface roughness), and ambient and surface temperatures may require the instrument to be adjusted.
Most Type 2 gauges can be adjusted against known reference standards such as coated parts or shims. Type 1 gauges such as the PosiPen and PosiTest, however, have nonlinear scales, while any adjustment to them is linear in nature, so no adjustment should be made. Instead, the user should take a base metal reading (BMR).
With a Type 2 gauge, when no calibration adjustment method is specified, a 1-pt calibration adjustment is usually tried first. If the results are unsatisfactory, a 2-pt calibration adjustment should be performed.
1-pt calibration adjustment
A 1-pt calibration adjustment fixes the instrument's calibration curve at a single point, established by taking a few readings on a known sample or reference standard. If desired, shims can be placed on the bare substrate to create the known thickness. The adjustment point can be anywhere within the instrument's measurement range, but for best results it should be chosen close to the expected coating thickness.
Zeroing is the simplest form of 1-pt adjustment. It involves measuring an uncoated sample or plate. In a simple zero adjustment, a single measurement is taken and the reading is adjusted to zero. In an average-zero adjustment, several measurements are taken, and their average is computed and adjusted to zero.
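The average-zero adjustment amounts to subtracting the mean bare-substrate reading from subsequent measurements. A minimal sketch, assuming the gage applies a simple constant offset (real instruments adjust their internal calibration curve; all readings below are invented):

```python
def average_zero_offset(bare_readings_um: list[float]) -> float:
    """Mean of several readings taken on the uncoated substrate."""
    return sum(bare_readings_um) / len(bare_readings_um)

def adjusted_reading(raw_um: float, offset_um: float) -> float:
    """Apply the zero adjustment to a subsequent raw reading."""
    return raw_um - offset_um

offset = average_zero_offset([3.0, 2.0, 4.0])  # gage reads ~3 µm on bare steel
print(adjusted_reading(53.0, offset))          # 50.0
```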
2-pt calibration adjustment
A 2-pt calibration adjustment is similar to a 1-pt adjustment, except that the instrument's calibration curve is fixed at two known points, each established by taking a few readings on a known sample or reference standard. Both thicknesses must lie within the instrument's measurement range, and the points are usually chosen on either side of the expected coating thickness. Because the PosiTector 6000 is accurate over its entire measuring range, zero (uncoated) is often used as one of the two points in a 2-pt adjustment.
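A 2-pt adjustment can be pictured as fitting a straight-line correction through the two known points. The sketch below makes that linear assumption (actual gage calibration curves are nonlinear, and the readings are invented for illustration):

```python
from typing import Callable

def two_point_correction(raw1: float, true1: float,
                         raw2: float, true2: float) -> Callable[[float], float]:
    """Return a function mapping raw readings onto the straight line
    through (raw1, true1) and (raw2, true2)."""
    slope = (true2 - true1) / (raw2 - raw1)
    return lambda raw: true1 + slope * (raw - raw1)

# Gage reads 4 µm on bare steel (true 0) and 104 µm on a 100 µm standard
correct = two_point_correction(4.0, 0.0, 104.0, 100.0)
print(correct(54.0))  # 50.0
```

Choosing the two points on either side of the expected thickness keeps the correction an interpolation rather than an extrapolation.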
Base metal readings
A Base Metal Reading (BMR) is a zeroing technique for use with Type 1 (mechanical pull-off) gauges on rough surfaces. Any adjustment to a Type 1 gauge is linear in nature, while the gage scale is nonlinear, so the reading should not simply be adjusted to zero on the bare substrate. Instead, a BMR is taken on the uncoated part and subtracted from readings taken on the coated part to obtain the coating thickness. A representative BMR value is obtained by averaging measurements from several locations on the bare substrate.
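The BMR technique above can be sketched as simple averaging and subtraction (all readings are invented for illustration):

```python
def base_metal_reading(bare_readings_um: list[float]) -> float:
    """Representative BMR: the average of several bare-substrate readings."""
    return sum(bare_readings_um) / len(bare_readings_um)

def coating_thickness(coated_reading_um: float, bmr_um: float) -> float:
    """Subtract the BMR from a reading taken over the coating."""
    return coated_reading_um - bmr_um

# Surface roughness makes the bare substrate read ~14 µm on average
bmr = base_metal_reading([12.0, 15.0, 14.0, 13.0, 16.0])
print(coating_thickness(89.0, bmr))  # 75.0
```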
Roughness
If the surface of the steel is smooth and flat, its actual surface serves as the effective magnetic surface. If the steel has been roughened by a cleaning process such as abrasive blasting, the effective magnetic surface lies somewhere between the peaks and valleys of the surface profile.
