Calibration Services
A thermometer is an instrument that measures temperature. Clocks measure time. Hygrometers measure humidity. An odometer measures distance. An ohmmeter measures electrical resistance. These common instruments are just a tiny fraction of the wide range of measuring tools used in every corner of human activity to measure a vast array of subject matter.
Meters, gauges, sensors, and other testing methods are calibrated to scales that place a quantitative value on the media being measured. All of these devices are used throughout industry to provide readings of standardized units of measurement.
Calibration or recalibration of instruments is done to assure consistently accurate readings for the specified applications. Environmental conditions and general use over time may cause instruments to fall out of the scope of appropriate operation. Calibration services are available to test the instruments and make adjustments to ensure their continued proper function.
The History Of Calibration
The practice of measurement goes back nearly as far as mankind, with the two original objectives to quantify weight and distance. Standardized lengths and weights provided the basis for trade, and eventually commerce.
The first recognized distance measurement was the cubit, the distance from a man's shoulder to the tip of his finger. This was, obviously, not an accurate measurement as it depended solely on the size of the man.
During his reign from 1100-1135, King Henry I of England declared that the distance from the tip of his nose to the tip of his outstretched thumb would be called a yard, thus standardizing the measure.
The Assize of Measures was established in 1196, in England, to standardize measurement of length.
In 1215, the Magna Carta stated standards for measuring wine and beer.
The mercury barometer, used to measure atmospheric pressure, was invented in 1643 by Torricelli. It was followed by the manometer, a water-filled tube working on the same principle, used to express air pressure in inches.
In 1791, the metre was established in France, giving rise to the metric system, adopted in 1795.
The word "calibrate" entered the English language around the time of the American Civil War. Its root, "caliber," is generally traced through the French "calibre" to the Arabic "qalib," meaning mold, as in casting. It referred to the amount of firepower, per shot, that a gun could provide: its caliber. The caliber of a gun is the measurement of the inside diameter of its barrel, or bore, as well as the outside diameter of the bullet that passes through the bore.
Calibration, then, was the act of measuring the bore and the bullets during manufacture to achieve consistent ammunition delivery from each gun being made. During the 1860s, this was a process handled by only the most skilled forges and metalsmiths.
With the Industrial Revolution came the need for consistency of materials, products, and delivery. More accurate scaling of quantity was needed in every aspect of measurement. As industry boomed, so did the development of more effective ways to quantify it, and to standardize those quantifications into usable units.
1960 saw a modernization of the metric system with the creation of the International System of Units (SI), a worldwide standard of measurement that is widely accepted for global use today.
The oil crisis of the 1970s created a push for more economical transportation and environmentally sound practices, promoting the development of ever more sophisticated testing equipment for an increasingly vast array of subject matter.
The International Organization for Standardization (ISO), founded in 1947, and the International Electrotechnical Commission (IEC), founded in 1906, provide international standards regarding nearly any quantifiable subject; their joint standards for calibration and testing laboratories, such as ISO/IEC 17025, emerged in the late 1990s.
The advent of the Computer Age gave rise to CAD technology. Geometric dimensioning and tolerancing (GD&T) uses symbols on engineering drawings and models to communicate precise, accurate dimensions to computerized manufacturing machinery, which produces faithful three-dimensional copies. Dimensioning refers to defining the geometry of a theoretically perfect object. Tolerancing defines the allowable variations in the form or size of the actual finished product.
The continued push for advances in environmental and manufacturing technology, and for travel beyond the planet, will ensure the need for further advancements in measurement and calibration technology.
What is Calibration?
Metrology, or the science of measurement, facilitates the ability to have fair trade and international commerce, as it provides agreed upon standards. Its societal impacts affect utilities, the environment, economics, healthcare, and manufacturing, as well as consumer confidence.
Metrology is based on three interconnected principles: definition of a unit to be measured, realization of the measurement, and traceability to a quantitative rating.
A unit of measure can be a degree of temperature, time, or longitude. It can be a unit of distance, from micrometers to light years. It can measure mass, weight, volume, flow, and strength. Once the unit is defined, it can be quantified on a scale, and a rating can be applied. The rating is compared to a baseline of similar testing, creating traceability of the measurement.
Standardized instruments to measure the units must be invented to ensure correct specifications and to detect defects, anomalies, or wear patterns. These instruments are calibrated to a set standard and can "read" the unit to be measured to check for any irregularities.
The testing equipment that is used must be tested periodically to make sure that it is still providing the correct specifications. The scheduled time for this testing is known as the calibration interval. In addition to regular maintenance, if the calibration device experiences an uncommon environmental condition, such as extreme temperature, humidity, or shock, it should be recalibrated.
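The interval-and-event rule above can be sketched in a few lines of code. This is a minimal illustration, not a standard: the 12-month interval and the list of qualifying environmental events are assumptions chosen for the example.

```python
from datetime import date, timedelta

# Illustrative assumptions: a 12-month calibration interval and a small
# set of uncommon environmental conditions that trigger recalibration.
CALIBRATION_INTERVAL = timedelta(days=365)
RECAL_EVENTS = {"extreme_temperature", "high_humidity", "mechanical_shock"}

def needs_recalibration(last_calibrated: date, today: date, events=()):
    """True if the calibration interval has elapsed or an uncommon
    environmental condition has occurred since the last calibration."""
    if today - last_calibrated >= CALIBRATION_INTERVAL:
        return True
    return any(e in RECAL_EVENTS for e in events)

print(needs_recalibration(date(2023, 1, 10), date(2024, 2, 1)))  # -> True (overdue)
print(needs_recalibration(date(2024, 1, 10), date(2024, 2, 1),
                          events=["mechanical_shock"]))          # -> True (shock event)
```

In practice the interval and triggering conditions come from the instrument manufacturer's specifications, not fixed constants.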
Calibration Devices
- Load Cells
- Transducers that convert the acting force into an analog electrical signal that provides a reading of deformation. The load cell is calibrated by attaching a pre-standardized device that provides a separate reading from the transducer when the cell is loaded. If both readings are not the same, the load cell is adjusted and another reading is taken. This process may be repeated until the load cell is in precise calibration with the testing device.
- Because the load cell is typically an integrated part of a larger system, downstream steps in the process may be compounded by a miscalibrated piece of equipment, causing catastrophic failure of the product or the line. Accurate readings provide traceable calibration to assure proper function of equipment and machinery.
- Instrument Calibration
- Used to adjust and maintain accurate readings from electronic measuring devices. The electronic signals are measured with calibration tools that are set to manufacturer's specifications. Electronic measuring devices that require periodic calibration include weighing scales, acoustic and vibration test equipment, lasers, industrial ovens, and speedometers.
- Strain Gauges
- A sensing element in a sensor. The most common consists of a resistive foil pattern on a backing material. When stress is applied to the foil, it will deform in a predetermined way. It is the main sensing element for a variety of sensors, including torque sensors, position sensors, pressure sensors, and force sensors.
- Data Acquisition
- Sensors convert physical parameters into electronic signals, which are assigned digital values. This process allows information to be converted into an electronic stream that provides fast, uninterrupted communication. The flow of the data, and the electricity that powers it, must be carefully regulated to avoid overloading the system.
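The load-cell comparison loop and the data-acquisition step described above can be sketched together: digitize the cell's output, compare it against a pre-standardized reference at two known loads, and derive correction factors. The function names, the 16-bit ADC scale, and the sample voltages are illustrative assumptions.

```python
# Assumed ADC characteristics for the data-acquisition step.
ADC_BITS = 16
V_REF = 5.0  # full-scale ADC voltage (assumption)

def counts_to_volts(counts: int) -> float:
    """Data acquisition: convert raw digital ADC counts to a voltage."""
    return counts * V_REF / (2 ** ADC_BITS - 1)

def two_point_calibration(raw_zero, raw_span, ref_zero, ref_span):
    """Return (gain, offset) such that gain * raw + offset reproduces
    the pre-standardized reference reading at both calibration points."""
    gain = (ref_span - ref_zero) / (raw_span - raw_zero)
    offset = ref_zero - gain * raw_zero
    return gain, offset

# Example: at 0 kg the cell reads 0.10 V; at a 100 kg reference it reads 4.90 V.
gain, offset = two_point_calibration(0.10, 4.90, 0.0, 100.0)
print(round(gain * 2.50 + offset, 2))  # -> 50.0 (a 2.50 V reading maps to 50 kg)
```

If the corrected reading still disagrees with the reference, the comparison is repeated, mirroring the iterative adjustment described for load cells above.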
Calibrating Through Sensors
Sensors can detect and record measurements of sound, vibration, and acoustics through the use of geophones, microphones, hydrophones, and seismometers.
- Chemical Sensors
- Can detect the presence or absence of substances in a liquid or gas. Carbon monoxide detectors and breathalyzers are common chemical sensors.
- Automotive Sensors
- Measure everything from oil, water, and air pressure, to camshaft and crankshaft positions, wheel speed, engine coolant temperature, fuel level and pressure, and oil level and pressure. There are airbag sensors, mass air flow sensors, light sensors, and temperature sensors to ensure passenger comfort and safety.
- Proximity Sensors
- Can detect an undesired presence and set off an alarm if the presence crosses specified boundaries. These are most commonly found as motion detecting lights and car alarms, but can apply to excessive moisture or sound levels.
Instruments that Calibrate
Instruments that measure elements of the environment provide information on moisture and humidity levels, temperature, air flow and quality, soil content, environmental conditions for flora and fauna, and tidal conditions.
There are instruments that read levels of radiation, ionization, and subatomic particles. These include Geiger counters, radon detectors, dosimeters, and ionization chambers.
Navigational instruments such as compasses, gyroscopes, and altimeters are delicate tools that require frequent calibration to maintain accurate readings.
Gauges That Calibrate
- Optical Gauges
- And measuring instruments may be sensors that detect, translate, and transmit quantifiable levels of light, color, or heat into an image that can be read. Some optical gauges are scintillators, fiber-optic sensors, infrared sensors, LED sensors, photo switches, and photon counters.
- Temperature Gauges
- Not limited to a simple mercury or alcohol bulb thermometer. Temperature measuring devices may be bimetal strips, calorimeters, flame detectors, thermocouples, or pyrometers.
Types of Calibration
Calibration instruments may be handheld, portable, or fixed testing equipment. Hand held units are manually operated, compact units that can be used in house or for on-site calibration. Portable units are equipped with carrying handles or wheels, and are made to be moved to the piece of equipment being calibrated. Fixed calibration stations are permanently mounted and provide the most accurate readings.
ISO/IEC standards provide a baseline for calibration of all measuring devices. The type of calibration process that is needed varies with the type of testing equipment that requires calibration.
Calibration services are available to provide test facilities and equipment, or to maintain and calibrate systems already in use. A good service provider will be able to offer advice and information on ISO/IEC standards that may apply.
- Electrical Calibration
- Provides measurement of time, voltage, current, resistance, inductance, capacitance, radio frequency (RF), and power.
- Dimensional Calibration
- Uses a wide variety of tools and gauges, such as dial indicators, calipers, and micrometers, to measure physical dimensions.
- Mechanical Calibration
- Measures weight, tension, compression, and torque.
- Physical Calibration
- Uses state-of-the-art equipment to measure temperature, humidity, vacuum, and pressure in a controlled environment, using air, hydraulic, dial, and digital gauges and high accuracy pressure calibration.
- Equipment Calibration
- The process through which pieces of equipment are adjusted for precision.
- Hardness Tests
- Evaluate the hardness and/or tensile strength of a material.
- Machine Calibration
- The process of adjusting machinery to a set of known standards to increase the accuracy and precision of the operation.
- Pipette Calibration
- Tests pipettes to ensure that they are able to contain and dispense precise volumes of fluid.
- Torque Wrench Calibration
- The process of adjusting torque wrenches so that the amount of force applied is displayed correctly on the tool.
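Pipette calibration from the list above is commonly performed gravimetrically: distilled water is dispensed, weighed, and the mass converted to volume with a Z-factor (approximately 1.0029 µL/mg for water at 20 °C and standard pressure, as in ISO 8655-style procedures). The 1% tolerance and the sample weighings below are illustrative assumptions.

```python
# Z-factor: microlitres per milligram for distilled water at 20 C and
# standard atmospheric pressure (approximate value).
Z_FACTOR = 1.0029

def dispensed_volume_ul(mass_mg: float) -> float:
    """Convert a weighed mass of water (mg) to dispensed volume (uL)."""
    return mass_mg * Z_FACTOR

def within_tolerance(nominal_ul: float, masses_mg, tol_pct: float = 1.0):
    """True if the mean dispensed volume falls within tol_pct of nominal.
    The 1% default tolerance is an assumption for this sketch."""
    volumes = [dispensed_volume_ul(m) for m in masses_mg]
    mean = sum(volumes) / len(volumes)
    return abs(mean - nominal_ul) / nominal_ul * 100 <= tol_pct

# Ten weighings (mg) from a pipette with a nominal 100 uL setting:
print(within_tolerance(100.0, [99.6, 99.8, 99.7, 99.5, 99.9,
                               99.6, 99.7, 99.8, 99.6, 99.7]))  # -> True
```

A real procedure would also check repeatability (the spread of the individual volumes), not just the mean.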
Industries that Use Calibration Services
The automotive industry is among the most diverse consumers of calibration instruments. The design and manufacture of automobiles requires the measurement of aerodynamics, weight, pressure, stress, strain, shear, torque, force, speed, electricity, comfort, economy, ecology, load capacities, intake, and exhaust, among other things. The tools that measure these values range from pencil sized gauges to wind tunnels and multi-mile test tracks.
Other industries that require calibrated measuring instruments include electronics, aerospace, aeronautics, meteorology, construction, manufacturing, food service, medical, energy, and entertainment. The measuring devices used for these industries vary widely. They can include instruments that quantify time, distance, light, sound, air movement, temperature, electronic impedance, data acquisition, strength of components, ingredients, and pressure.
Calibration Services Terms
- Accuracy
- The closeness of a device's measured value to the true value, often expressed as a tolerance limit on the allowable deviation.
- Alignment
- Adjustments that bring a device to proper operation.
- Analog Measurement
- A measurement device that produces a continuous output reading proportional to its input signal.
- Axial Strain
- A strain on the same axis as the applied load or in the same direction of the load applied.
- Calibration Curve
- A record of the comparison of a device’s output to the result of standard tests.
- Calibration Laboratories
- Companies that provide calibration services.
- Capacitor
- A device that stores electrical energy.
- Compensation
- Using various devices, materials and processes to reduce known errors of a source.
- Equilibrium
- A state of balance or a steady state not undergoing change.
- Fatigue Limit
- The stress level below which a material can withstand an indefinite number of load cycles without failure.
- Hertz (Hz)
- The measurement of frequency in cycles per second.
- K-Factor
- The harmonic content of load current, which determines the safe maximum load on a power source.
- Mean Stress
- The average of the maximum and minimum stress in a loading cycle.
- Metrology
- The science of weight and measurement, or a system of weights and measurement.
- Nonlinearity
- The maximum deviation on a calibration curve from a straight line that is drawn among various outputs of a device, expressed as a percentage.
- Output
- The signal or measurement that is produced by a device.
- Range
- The span of values at which a meter or device will read accurately without overloading.
- Resistor
- A device that opposes the flow of electric current, used as an electrical load or impedance.
- Resolution
- The minimal change of output in a device that is detectable.
- Torque
- The measure of force applied that causes rotational motion.
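The "Calibration Curve" and "Nonlinearity" entries above can be illustrated numerically: record the device's output at known standard inputs, draw the straight line through the endpoints, and report the maximum deviation as a percentage of full scale. The endpoint-line method and the sample sensor data are assumptions for this sketch; some procedures use a best-fit line instead.

```python
def nonlinearity_pct(inputs, outputs):
    """Maximum deviation of a calibration curve from the straight line
    through its endpoints, expressed as a percentage of full scale."""
    x0, x1 = inputs[0], inputs[-1]
    y0, y1 = outputs[0], outputs[-1]
    slope = (y1 - y0) / (x1 - x0)
    deviations = [abs(y - (y0 + slope * (x - x0)))
                  for x, y in zip(inputs, outputs)]
    full_scale = y1 - y0
    return max(deviations) / full_scale * 100

# A hypothetical pressure sensor read at five standard test points:
print(round(nonlinearity_pct([0, 25, 50, 75, 100],
                             [0.0, 1.30, 2.55, 3.78, 5.00]), 2))  # -> 1.0
```

A perfectly linear device would score 0%; the comparison record itself is the calibration curve defined in the glossary.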