A calibration lab is all about the details, from the equipment to the temperature. Dimensional calibration labs need to be kept at 68°F to get accurate results. But why 68°F? And according to whom? Read on to learn the significance of temperature in calibration, and why 68°F is the winning temperature.

Standards in the art of calibration

Calibration is a necessary part of any trade involving measurement instruments. Done correctly, the process verifies the accuracy of a measurement instrument's performance. For it to be done correctly, there can be no room for variance within the calibration lab. Every element of the process needs to be stable, consistent, and held to a specific, controlled standard.

Specialists like Alliance respect the careful process of calibration, which demands thorough training and extraordinary attention to detail. The rules of calibration cannot be arbitrary, because the process itself controls the quality of measurements, which in turn affects the quality and functionality of all sorts of machinery and equipment. Precision is essential in every detail at every step of the way. It's fascinating to think how the smallest inconsistency, such as a degree or two of temperature, could end up ruining a machine by way of an incorrectly calibrated tool.

In order to maintain consistent measurements

There are both national and international standards for the different elements of the calibration process. For instance, the National Institute of Standards and Technology (NIST) sets standards for many areas of science and technology, including calibration. These standards establish a baseline that everyone can agree on and follow. When everyone follows the industry standards, it helps ensure that an instrument will perform correctly as conditions change. No two labs will have exactly the same outside weather, temperature, and humidity. Standards such as those set forth by NIST tell people how to control the conditions of their lab space to achieve consistency across the industry.

The impossibility of quick turnarounds on dimensional calibrations

Temperature doesn't have an effect on every type of calibration, but for a dimensional calibration it's an essential element. A dimensional calibration is performed on an instrument that takes physical measurements of an object, such as its width, flatness, or diameter. Tools that need dimensional calibration include calipers, micrometers, gage blocks, and measuring rods. Dimensional calibration of these instruments can never be rushed because of the role temperature plays in their measurement accuracy. Temperature is one of the controlled standards in a calibration lab, and a dimensional gage instrument needs time to sit and reach that temperature before it can be calibrated.

At this point, you may be wondering: what exactly is the big difference between a micrometer that's been sitting in a cold car and a micrometer that's been sitting in a calibration lab? As with calibration itself, the answer has to do with the tiny details, which are governed by something called thermal inertia. Thermal inertia describes how quickly an object's temperature rises or falls to match the temperature of the space it's in, and it varies greatly depending on the material the object is made of. Most dimensional gage instruments are made of hardened steel, which has enough thermal inertia to affect the calibration process.
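To get a feel for the timescales involved, here is a minimal sketch that models the warm-up with Newton's law of cooling, under which an instrument's temperature approaches the lab's exponentially. The 30-minute time constant is an assumed, illustrative value, not a measured one; real soak times depend on the instrument's mass, geometry, and the airflow around it.

```python
import math

LAB_TEMP_F = 68.0   # standard dimensional-lab temperature
TAU_MIN = 30.0      # assumed time constant, in minutes (illustrative only)

def instrument_temp(minutes_in_lab: float, start_temp_f: float) -> float:
    """Instrument temperature after soaking in the lab, per Newton's law of cooling."""
    return LAB_TEMP_F + (start_temp_f - LAB_TEMP_F) * math.exp(-minutes_in_lab / TAU_MIN)

# A micrometer brought in from a 40 °F car:
for minutes in (0, 30, 60, 120):
    print(f"after {minutes:3d} min: {instrument_temp(minutes, 40.0):.1f} °F")
# after   0 min: 40.0 °F
# after  30 min: 57.7 °F
# after  60 min: 64.2 °F
# after 120 min: 67.5 °F
```

Even in this toy model, the gage is still several degrees short of 68°F after a full hour in the lab, which is why instruments are routinely left to soak for hours before calibration.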
Metal type          Coefficient of thermal expansion
Galvanized steel    6.7 µin./(in.·°F)
Steel               6.7 µin./(in.·°F)
Copper              9.4 µin./(in.·°F)
Stainless steel     9.6 µin./(in.·°F)
Bronze              10.1 µin./(in.·°F)
Aluminum            12.9 µin./(in.·°F)
Lead                15.1 µin./(in.·°F)
Zinc                17.4 µin./(in.·°F)

Each material has a coefficient of thermal expansion, which describes how much an object made of it changes size with temperature. A metal instrument expands as it warms and contracts as it cools. The change is much too small to notice with the naked eye, but you can see how it would matter in precision calibration: a 10 in. steel gage warming by 28°F grows by roughly 6.7 × 10 × 28 ≈ 1,900 µin., nearly two thousandths of an inch (see the short sketch at the end of this article). Projects and equipment that involve larger measurements and looser tolerances, like building a shed, can get away with ignoring an instrument's thermal expansion. But for something like mechanical engineering, it's a different story. These settings require temperature-critical measurements to ensure precision every step of the way.

This is the long answer to why you can't rush a dimensional calibration. Our labs are kept at 68°F, the standard temperature for dimensional calibration laboratories, and dimensional gages are ready for proper calibration only after they've had time to acclimate to the lab's conditions.

But why 68°F?

Sixty-eight degrees Fahrenheit sounds like just the kind of weather you'd want for a nice day in the park, and it also happens to be the perfect temperature for dimensional calibrations. The standard is far from arbitrary, however: it took two decades of discussion and consideration before the industry settled on 68°F/20°C. Much correspondence passed between the Bureau of Standards' first director, Dr. S. W. Stratton, and his peers in Europe before they could settle on a number. Each side took input from its own industry professionals and argued to keep its region's existing standards.

The industry could not reach a final decision until it sought the help of C. E. Johansson, the Swedish inventor who designed the first gage block. In his letters, Johansson offered his experience in dealing with varying reference temperatures. He had conducted several experiments testing the accuracy of measurements at different temperatures, and he found that most labs using his gage blocks kept their internal temperature at about 20°C, so he adopted that as his own standard. The wide use of the temperature was a big convenience, but not the only reason Johansson favored the 68°F/20°C standard. For one thing, when he made gages at 62°F, another popular calibration temperature at the time, the body heat of the operator would affect the gage's temperature before the work could be completed. Johansson also appreciated that both 68 and 20 are integers, which made calculations more convenient for everyone. Finally, on April 15, 1931, the International Committee for Weights and Measures adopted the 68°F/20°C standard.

The duration and complications of this debate go to show the long-standing commitment to accuracy and consistency in metrology.
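As promised earlier, here is a minimal sketch of the thermal-expansion arithmetic, using the relation ΔL = α × L × ΔT with coefficients taken from the table above. The material names and helper function are illustrative, not part of any standard or library.

```python
# Back-of-the-envelope thermal expansion: delta_L = alpha * L * delta_T
ALPHA_UIN = {            # coefficients from the table, in µin./(in.·°F)
    "steel": 6.7,
    "stainless steel": 9.6,
    "aluminum": 12.9,
}

def length_change_uin(material: str, length_in: float, delta_t_f: float) -> float:
    """Length change, in microinches, for a given temperature change in °F."""
    return ALPHA_UIN[material] * length_in * delta_t_f

# A 10 in. steel gage moved from a 40 °F car into the 68 °F lab:
growth = length_change_uin("steel", 10.0, 68.0 - 40.0)
print(f"{growth:.0f} µin. (~{growth / 1e6:.4f} in.)")
# 1876 µin. (~0.0019 in.)
```

Two thousandths of an inch dwarfs the millionths-of-an-inch tolerances typical of precision gage blocks, which is exactly why a gage must finish soaking at 68°F before it is calibrated.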