What is Thermal Inertia?
"the degree of slowness with which the temperature of a body approaches that of its surroundings and which is dependent upon its absorptivity, its specific heat, its thermal conductivity, its dimensions, and other factors"
Source: Merriam-Webster Dictionary
Think of an ice cube. When placed in direct sunlight, it does not instantly evaporate. It changes from solid to liquid and then evaporates. It takes time.
So what does Thermal Inertia have to do with Calibration?
Let's talk about measurement & test equipment made from steel. The coefficient of linear expansion for steel is 0.00000645 in/in/°F.
This seems to be a rather small number.
We need some context.
Picture a railroad track that is one mile long.
There are 5,280 feet in a mile, and 5,280 ft × 12 in/ft = 63,360 inches.
63,360 in × 0.00000645 in/in/°F = 0.408672 inches per degree.
That means the railroad track will change by 0.408672 inches for every 1 °F change in temperature. If there were a 20 °F change over the day, the one-mile track would change by 8.17344 inches.
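The rail calculation above can be sketched in a few lines of Python; the numbers are exactly those used in the text:

```python
# Thermal expansion of a one-mile steel rail: delta_L = L * alpha * delta_T

ALPHA_STEEL = 0.00000645  # in/in/°F, coefficient of linear expansion for steel

length_in = 5280 * 12                 # one mile = 63,360 inches
per_degree = length_in * ALPHA_STEEL  # growth per 1 °F change

print(per_degree)       # 0.408672 inches per °F
print(per_degree * 20)  # 8.17344 inches over a 20 °F daily swing
```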
Now let's go to the calibration laboratory. It is common practice to maintain temperature as near to 68 °F as possible. Why 68?
Here is a little historical background.
The debate had been going on for quite some time, and in 1927 a recommendation was made by the Bureau of Standards:
"Recommend that the Conference adopt 20 °C (68 °F) as the standard temperature at which industrial standards of length shall have their correct nominal length. A committee of five members was set up to study this question and report before the first of March, 1929. The following were the members: Bureau of Standards, Washington; Laboratoire d'Essais du Conservatoire des Arts et Métiers, Paris; National Physical Laboratory, Teddington; Physikalisch-Technische Reichsanstalt, Charlottenburg; Director Guillaume of the International Bureau"
Consensus had been reached in the United States:
"In reply to your letter of April 13th, we wish to be placed on record as favoring 68 °F. (20 °C.) as the standard temperature for intercomparison of standards of length. This temperature is one which most nearly approaches the average shop condition, and is a temperature in which production can be maintained as efficient standards."
1928-04-19 Pratt & Whitney response
"Our practice is invariably to give 68 degrees Fahrenheit as standard, and we have no doubt that this practice is well nigh universal in the engineering and inspection departments of American Industry."
1928-04-16 Taft-Pierce response
"Replying to yours of the 13th, we certainly favor the use of 68 degrees Fahrenheit as a standard temperature for intercomparison of gauges, etc."
1928-04-17 Brown & Sharpe response
ANSI/ASME B89.6.2, Temperature and Humidity Environment for Dimensional Measurement, explains why an object exhibits its true size and form at 68 °F.
So where is the connection between calibration and thermal inertia?
When measurement & test equipment is prepared for calibration, there is an acclimation period so the Unit Under Test will reach the same temperature as the standard. The theory is that minor changes in temperature (0.5 °F - 1.0 °F) over a short time frame will not dramatically change the measurement.
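Why acclimation takes real time can be sketched with Newton's law of cooling, where a body approaches ambient temperature exponentially. The starting temperature and the 30-minute time constant below are hypothetical illustration values, not figures from the article; a real gage's time constant depends on its mass, material, and airflow:

```python
import math

# Newton's law of cooling: T(t) = T_lab + (T_start - T_lab) * exp(-t / tau)
T_lab = 68.0    # °F, laboratory reference temperature
T_start = 75.0  # °F, assumed plant-floor temperature of the gage
tau = 30.0      # minutes, assumed thermal time constant (hypothetical)

def minutes_to_within(tolerance_f):
    """Time for the gage to come within tolerance_f of lab temperature."""
    return tau * math.log((T_start - T_lab) / tolerance_f)

print(round(minutes_to_within(0.5), 1))  # roughly 79 minutes to within 0.5 °F
```

Even under these mild assumptions, a gage needs well over an hour before it is within the 0.5 °F band the text mentions, which is why a RUSH calibration cannot skip the soak.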
Conclusion: When you ask for a RUSH calibration or bring gages in from the plant floor to a climate controlled room, you need TIME for the temperature of the unit being calibrated to reach the temperature of the standard.
Don't forget about Thermal Inertia and Calibration.