
Calibration of Measuring and Testing Equipment

Development – To define detail, scope and purpose.

Training participants will gain a basic understanding of Calibration of Measuring and Testing Equipment and its applications within food safety and quality systems. Basic knowledge competency will be verified through successful completion of the accompanying Calibration of Measuring and Testing Equipment assessment activity. Basic skill competency can be verified through the Calibration of Measuring and Testing Equipment competency checklist available as a resource for this training activity.

Key Definitions For Calibration of Measuring and Testing Equipment
- Calibration Adjustment: A calibration adjustment is a physical adjustment made to a piece of equipment to ensure it is accurate. This is often conducted after a device has been calibrated.
- Calibration: Calibration is the validation of specific measurement techniques and equipment. At the simplest level, calibration is a comparison between two measurements: one of known magnitude or correctness, made or set with a reference device, and another made in as similar a way as possible with the device being checked.
- Infrared or IR: Infrared radiation is electromagnetic radiation with a frequency below that of visible light; it cannot be seen with the unaided human eye.
- pH: From potential of Hydrogen: the logarithm of the reciprocal of hydrogen-ion concentration in gram atoms per litre. It provides a measure, on a scale from 0 to 14, of the acidity or alkalinity of a solution, where 7 is neutral, greater than 7 is more alkaline, and less than 7 is more acidic.
- Standard Weight Reference: A Standard Weight Reference, commonly also known as a test weight, is a standardized, verified and validated example against which you can confirm the accuracy of a scale or other weighing device.

Calibration of Measuring and Testing Equipment Development 
When considering the development, documentation and implementation of Calibration of Measuring and Testing Equipment within food safety and quality management systems, the following information should be considered to ensure effective outcomes:

Inaccurate equipment used to measure or test products or processes can compromise food safety regulatory requirements and the safety and quality of food products. Calibration programs are implemented to ensure measuring and testing equipment are accurate and capable of fulfilling their intended requirements.

Calibration is a technique used to define the appropriate operational parameters of measuring and testing equipment. This is usually done in two steps:
- Testing: measuring and comparing the "actual" operational parameters of the equipment or device against a defined standard or reference device. A procedure for the methods used should be documented.
- Adjustment: the result of the test provides a variation value, which can be used to adjust the operational parameters of the equipment or device being tested. A re-test is usually initiated to confirm the accuracy of the adjustment. The accuracy of the calibrated equipment or device must be relevant to the risk-based context in which it is used, and should be within the accuracy nominated within relevant regulatory or industry standards.
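The two steps above can be sketched in Python. This is a minimal illustration only; the example readings and the 1-degree tolerance are assumptions for demonstration, not values drawn from any standard:

```python
# Sketch of the test-then-adjust calibration cycle described above.

def calibration_variation(reference_reading, device_reading):
    """Variation of the device under test from the reference standard."""
    return device_reading - reference_reading

def within_tolerance(variation, tolerance):
    """Is the measured variation acceptable for the risk-based context
    in which the device is used?"""
    return abs(variation) <= tolerance

# Example: a probe reads 101.3 against a 100.0 reference standard.
variation = calibration_variation(100.0, 101.3)
needs_adjustment = not within_tolerance(variation, 1.0)  # outside 1.0 tolerance
```

After an adjustment is made, the same comparison is simply repeated as the re-test.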

What does Calibration Involve?
Calibration of Measuring and Testing Equipment involves comparing the accuracy of such devices against a reference device or a specially constructed and certified piece of calibration equipment. Calibration involves making a number of observations over the defined operational range to which the measuring or testing equipment will be subjected, and comparing the device's readings to the true values. The outcomes of calibration allow assessment and verification of specific measuring equipment as part of your Food Safety System. For example, calibration of temperature measuring equipment is essential to ensure that a device is performing accurately.

Depending on the type of equipment or device to be calibrated, it may be calibrated internally by the food business, or externally by an appropriate service provider. Calibration is commonly considered a component of the maintenance program.

Measurement and testing equipment devices that may require scheduled calibration include:
- Ovens;
- Smokers;
- Temperature probes including thermometers, thermocouples and thermistors;
- Infrared temperature devices including radiation pyrometers;
- Chiller and freezer temperature gauges;
- Refrigerated vehicle temperature gauges;
- Hot holding equipment including bain maries and hot boxes;
- Scales;
- pH meters;
- Refractometers;
- Retorts;
- Chemical dispensing units, including spraying and dosing units for chemicals used in primary production;
- Chlorine measuring equipment;
- Optical and colour measuring equipment;
- Pressure sensors;
- Heat sensors and thermostats;
- Penetrometers;
- Laboratory incubators;
- Labelling devices;
- Metal detectors;
- Magnets.

Thermometer Calibration 
Thermometer Calibration has the obvious benefit of determining whether or not your temperature measuring devices are registering a true temperature reading. For a food safety system to remain verified, it must use temperature measuring devices that can accurately measure the temperature of potentially hazardous foods to plus or minus 1 Degree Celsius or 1 Degree Fahrenheit. If you calibrate your thermometers internally within your food business, it is also important to ensure that the reference thermometer to which you compare other temperature measuring devices is accurate and frequently calibrated to recognised standards.

Refrigeration Maintenance and Calibration 
Whether monitored by external or internal parties, refrigeration plays a crucial part in any food safety program. The following protocols should be considered regarding the maintenance of refrigerated systems:
- Any required repairs or related incidents must be reported to appropriate staff so that objective corrective actions can be initiated;
- Refrigeration service schedules should be carried out in accordance with legislation and industry specific guidelines;
- If your business uses temperature control as a major critical control point, you may need to pay more attention to refrigeration maintenance than other businesses that don’t rely on refrigerated equipment as much;
- Fan, extraction and condenser units should be regularly cleaned to exclude dust and grease build-up;
- Door seals must be maintained so as not to reduce the capacity of the equipment to retain appropriate temperatures nor to provide a breeding ground for bacterial pathogens;
- The calibration of all fitted temperature measuring devices should be carried out annually;
- Refrigerated units should not be overfilled with product. Avoiding overfilling reduces the chance of a breakdown and allows airflow between the products, enabling the unit to work to capacity when bringing product temperatures down, or keeping them at optimum levels;
- Preventative maintenance and required repairs should be undertaken as scheduled, and all dealings must be documented as part of the food safety program.

Types of Calibration 
Calibration activities can generally be classed as either absolute or local, according to their methods:

Absolute Calibration 
Absolute calibration is when readings are systematically measured against an electronic instrument that has been calibrated according to controlled methods and against recognised standards. Governmental authorities around the world maintain the scientific standards relating to the calibrating of the calibration equipment itself. Absolute calibration is usually completed in a manner which is traceable to scientific standards, and is indicative of a true result.

Local Calibration 
In some circumstances where it may not be possible to perform an absolute calibration, a local calibration, also known as a comparative test, can be performed to ascertain the working accuracy of a measuring or testing device. The feasibility of such testing relies on the calibration of the equipment against which measurements are compared, and on the control of variables which may alter readings.

Laboratory Testing Equipment and Materials 
Requirements for laboratory testing equipment and materials should include:
- Suitable size, construction, condition and location of equipment used for testing and for controlling environmental conditions under which testing is conducted;
- Inspection, cleaning, sanitation and maintenance requirements for testing equipment;
- Calibration schedules, procedures and records for relevant equipment;
- Provisions to ensure equipment and materials used in testing do not interfere with the testing procedures or outcomes;
-  Identification, traceability, description and intended use information for reagents used for testing conducted at the site.

Local Calibration of Thermometers 
Thermometers are perhaps the most commonly calibrated of food industry measuring and testing devices. The following are examples of how to complete local calibrations for temperature measuring devices.

For ice point calibration, mix crushed ice with enough water to make a slushy ice-and-water mixture that maintains the ice point temperature. Stir the mix continuously and do not let the thermometer stem or sensing element touch the bottom or sides of the container during immersion. Allow the thermometer display to remain stable for a minute, and then read the temperature. The temperature should read 0 Degrees Celsius or 32 Degrees Fahrenheit.
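The ice point check yields a simple correction offset for the device. A minimal sketch, with an assumed example reading of 0.4 degrees in the ice slurry:

```python
# Deriving a correction offset from an ice-point reading.
ICE_POINT_C = 0.0

def ice_point_offset(observed_c):
    """Offset to subtract from this thermometer's future readings."""
    return observed_c - ICE_POINT_C

offset = ice_point_offset(0.4)   # thermometer showed 0.4 C in the ice slurry
corrected = 74.6 - offset        # applying the correction to a later reading
```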

For boiling point calibration, make sure the water is at a rolling boil, and the vessel containing the boiling water is sufficiently deep to remain unaffected by air cooling. Place the stem of the thermometer probe into the centre of the boiling water, ensuring it does not touch the bottom or sides of the container. Allow the thermometer display to remain stable for a minute, and then read the temperature. The temperature should read 100 Degrees Celsius or 212 Degrees Fahrenheit at sea level. Ensure that all necessary safety precautions are taken when adopting this method to prevent scalding.
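The ice point and boiling point checks can be combined into a two-point linear correction. This sketch assumes boiling occurs at exactly 100 degrees Celsius (i.e. at sea level); the example readings are illustrative:

```python
# Two-point linear correction from ice-point and boiling-point readings.

def two_point_correction(read_at_ice, read_at_boil):
    """Return a function mapping a raw reading to a corrected temperature."""
    true_ice, true_boil = 0.0, 100.0
    slope = (true_boil - true_ice) / (read_at_boil - read_at_ice)
    return lambda raw: true_ice + slope * (raw - read_at_ice)

# Device read 0.5 C in the ice slurry and 99.0 C in boiling water.
correct = two_point_correction(0.5, 99.0)
mid_range = correct(49.75)  # corrected mid-range reading
```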

Contact Thermometers 
For maximum accuracy, temperature measurements have historically been made with devices that directly contact food. Today, contact thermometers come in a variety of shapes and sizes, and employ various technologies. One thing they all have in common is a contact surface, or probe, which actively senses the heat of food it touches. The sensor may be a liquid that expands and contracts; a bimetal, or combination of two different metals that expand or contract at different rates; a “thermocouple” that generates a small voltage; or a thermistor that varies its electrical resistance, all in proportion to the applied heat. The sensor reacts and the temperature is indicated on a dial, gauge or display.

The compact size of most of the devices enables them to be carried in users’ pockets, readily available to check the temperature of a variety of food items. Contact thermometers have several drawbacks, the largest of which is a delay in reaction time. For example, anyone who has waited for a final temperature reading from a meat thermometer knows that its response time can be slow.

Another problem with contact thermometers is the need to avoid cross contamination. When taking multiple temperature measurements, users must clean and sanitize probes that come into contact with food, hands and other surfaces that may transmit unwanted micro-organisms to other food being checked. Use of infrared thermometers, however, avoids these problems.

Infrared Thermometers
Infrared or IR is the name given to a range of electromagnetic wavelengths longer than visible light but shorter than microwaves. Infrared technology is used extensively today, most notably in television remote controls. Infrared devices do everything from keeping French fries warm at fast food restaurants to helping airborne rescuers locate crash survivors. The technology is well developed and, with advances in optics and electronics, it’s a technology that’s becoming more accurate and cost-effective for users.

An infrared thermometer measures temperature by sensing the magnitude of radiated energy at infrared frequencies. To measure a surface temperature, the user aims the infrared thermometer at the target food, presses a button and reads the temperature display. The device has an optical lens that collects the radiated infrared energy from the object and focuses it on the detector. The detector converts the energy into an electrical signal that’s amplified and displayed as a temperature reading. Using this data and the actual temperature of the detector, the thermometer calculates the temperature of the surface that emitted the energy.

Since air is essentially invisible at infrared frequencies, the infrared thermometer is able to measure food surface temperature without contacting food. This reduces the risk of cross-contamination. It also saves time by eliminating the thermal lag-time necessary when a contact thermometer’s probe must heat up or cool down after contacting food. As prices for quality infrared thermometers have dropped and they have become affordable for most users, an increasing number of manufacturers are producing and selling thermometers with varying capabilities and limitations.

Non-contact Infrared thermometers are quickly becoming an integral part of Food Safety and Quality routines. Infrared thermometers quickly register a surface temperature, which facilitates general food safety system surveillance by allowing the scanning of numerous food temperatures over a short period of time.

Quality Infrared thermometers assure the greatest accuracy in the food critical zone called the Danger Zone, where harmful bacteria grow most rapidly. The term Danger Zone describes temperatures above 5 Degrees Celsius or 40 Degrees Fahrenheit and below 60 Degrees Celsius or 140 Degrees Fahrenheit. If potentially hazardous foods such as dairy products, seafood, meats and eggs remain within this range for more than a short time, food borne bacteria can multiply and create toxins that may cause food borne illness if consumed. While food safety inspectors enforce adequate precautions, the owners, operators, and staff of a business bear the everyday responsibility of providing healthy quality food, which means an awareness and plan of action to ensure foods are kept out of the temperature danger zone.
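The Danger Zone range described above can be expressed as a simple screening check. The strict inequalities follow the "above 5 / below 60" wording in the text:

```python
# Flag readings that fall inside the temperature Danger Zone (5 C to 60 C).
DANGER_LOW_C, DANGER_HIGH_C = 5.0, 60.0

def in_danger_zone(temp_c):
    """True if a food temperature is above 5 C and below 60 C."""
    return DANGER_LOW_C < temp_c < DANGER_HIGH_C

in_danger_zone(4.0)   # cold holding: not in the danger zone
in_danger_zone(35.0)  # in the danger zone
```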

Simple precautions and continuous monitoring will help ensure safe and healthy working and eating environments. A non-contact thermometer can assist you in taking food temperatures quickly and accurately. With an average half a second response time, you can take multiple readings in rapid succession. For example, you can scan a food service buffet to ensure hot and cold holding areas are at appropriate serving temperatures. Optimised for use with organic products and equipped with a better filter detector to look through water vapour more effectively, specialised Infrared thermometers cater to the requirements of the Food Service Professional. Quick, simple scans with an Infrared Thermometer save time and money by instantly locating potential problems.

Infrared Thermometer Limitations
As mentioned earlier, infrared energy is emitted from the surface of food. However, the surface temperature of food is not always the same as its core temperature. Anyone defrosting or cooking a dense meal in a microwave oven is familiar with this property of food. There is a real danger in placing too much reliance on a surface temperature reading when food at the core can remain in the danger zone.

Accuracy problems can also be caused when infrared thermometers are used in uncontrolled environments to measure low temperatures. When food approaches the freezing point of water, the low infrared energy emitted from its cold surface may be quite difficult for the optical detector to distinguish from the background environment. Coupled with thermometer optics that are more likely to be at a higher, room ambient temperature, there is a significant chance that inaccurate readings will be obtained unless the device is used precisely as intended by the manufacturer.

Steam or other vapour that comes between food and an infrared thermometer’s optics can also cause erroneous readings, as can frost and food packaging materials. An inaccurate infrared thermometer reading can also occur when a user takes a thermometer from the ambient temperature of one room and uses it in another, without allowing the device to stabilize at the new ambient temperature. Known as thermal shock, this change in ambient temperature confuses the thermometer as it tries to produce a result while the temperature of the individual components of the optics is changing.

These and other potential limitations of use keep infrared thermometers from being accepted as enforcement tools. However, when used as intended, the devices provide quick, accurate temperature readings, making them sufficient and effective screening tools. If necessary, a reading by an Infrared Thermometer can be followed by a measurement using a contact thermometer to check a food’s internal temperature. Typical applications for Infrared Thermometers include measuring surface temperatures of food in hot or cold food service buffets, where multiple consecutive readings of unpackaged, accessible food can be taken quickly. With this screening completed, the user can follow up with a contact thermometer whenever questionable readings are obtained.

Thermometer User Information 
Since users of infrared thermometers play a critical role in the proper and effective operation of the device, they must be adequately informed about the capabilities and limitations of the thermometer. Information is usually included with each thermometer, to help educate users about the device, including:
- Proper application and operation;
- Screening and surface measurement;
- How to switch between temperature scales such as Celsius or Fahrenheit;
- Operating temperature range;
- Ambient temperature range for use;
- Field of view, minimum “spot area” and operating distance;
- Angle of measurement;
- Thermal shock and minimum stabilization time;
- Adjustability of emissivity;
- Effects of packaging on readings;
- Use on surfaces other than food;
- Accuracy specifications, resolution and repeatability;
- Cleaning instructions;
- Storage temperature and other storage conditions;
- Response time;
- Battery life;
- Battery voltage and type.

Coolroom and Freezer Calibration 
The calibration of fixed devices located in coolrooms and freezers may be completed using the following method in the nominated sequence:
- Using an accurate reference device or reference thermometer, measure the temperature near the internal probe located inside the chiller or freezer.
- Compare the recorded temperature to that which is shown on the display for the fitted device. This display is usually found on the outside of the chiller or freezer to which it is fitted.
- If the readings are not within plus or minus 1 Degree Celsius or 1 Degree Fahrenheit of each other, there may be a requirement to have the units adjusted or serviced as per the manufacturer’s specifications.
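The comparison in the steps above can be sketched as a simple tolerance check; the example temperatures are illustrative assumptions:

```python
# Compare a fitted coolroom/freezer display against a reference thermometer.

def display_needs_service(reference_temp, display_temp, tolerance=1.0):
    """True if the fitted display disagrees with the reference by more
    than the tolerance (1 degree, per the method above)."""
    return abs(display_temp - reference_temp) > tolerance

display_needs_service(3.2, 3.8)  # within 1 degree: no action needed
display_needs_service(3.2, 5.0)  # adjustment or service may be required
```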

Though this is a simple method for rating the accuracy of temperature measuring devices, science-based principles regarding the correct use of thermometers and other temperature measuring devices must be adhered to in order to gain the most accurate results.

Calibration of Heating Units 
Heating units may include:
- Hotboxes;
- Ovens;
- Hot Smokers;
- Bain Maries;
- Fryers;
- Kettles;
- Belt Grills;
- Steamers;
- Microwave Units.

Calibration of these devices is important, particularly where the devices are used for hot holding. To provide verification for the devices’ capability to hold foods at appropriate temperatures, records of calibration and servicing must be maintained. Due to the sensitive nature of the device’s operation and the associated potential for foods to become unsafe if not held at correct temperatures, it is vitally important that required repairs be carried out before the equipment is used again. Providers of external calibration services should be included within the approved supplier program.

Calibration of Scales 
Scales can easily be calibrated internally by using a standard weight reference. When choosing a standard weight reference, a weight similar to the product weight the scale is used to measure is normally advisable. For example, you might use a 100 gram or ounce standard weight reference for a scale used to measure a 110 gram or ounce product. In this example, it would not generally be appropriate to use a 1000 gram or ounce standard weight reference for a scale used to measure a 110 gram or ounce product.

Internal calibration is recommended daily for food businesses manufacturing or selling product by net or gross weight within the retail sector.

A five point calibration of the scales can be done by first zeroing the scale, then placing the weight in turn on each of the four corners of the scale platform and then in the middle. The readout should be the same as the weight at all five points checked. An acceptable degree of accuracy is generally for the business or customer to decide, but plus or minus 1% of the weight of the unit being weighed is normally considered an acceptable tolerance.
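The five point check with the plus-or-minus 1% tolerance can be sketched as follows; the readings shown are illustrative:

```python
# Five-point scale check: four corners plus the centre, against a
# standard weight reference, using the 1% tolerance noted above.

def five_point_check(readings, reference_weight, tolerance_pct=1.0):
    """readings: five values (four corners, then centre)."""
    limit = reference_weight * tolerance_pct / 100.0
    return all(abs(r - reference_weight) <= limit for r in readings)

readings = [99.8, 100.1, 100.0, 99.9, 100.2]  # grams, 100 g reference weight
five_point_check(readings, 100.0)  # all five points within 1%
```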

It is also recommended at least annually, that all scales be calibrated by a certified external service provider. In this regard, it should also be noted that Fair Trading Laws in some countries and regions require that trade scales be externally certified by an accredited Service Provider.  We recommend that you refer to your local Fair Trading Government website for specific requirements.

If your food business supplies foodstuffs manufactured to a customer’s specifications, it is important to consider any specific Calibration of Measuring and Testing Equipment Development requirements in relation to their items.

Calibration of a pH Meter 
It is generally considered best practice for pH meters to be calibrated prior to each session of use. Each pH testing device is slightly different in accuracy, which also changes with the age of the device. Uncalibrated pH testing devices can give inaccurate results, which can potentially cause food safety hazards within products, particularly where acidity is used to control potential microbiological growth within foods. The calibration of pH testing devices is also commonly known as pH electrode calibration, as the parameters set are not device dependent, but electrode dependent.

The calibration procedure for pH testing devices calls for the use of two or three pH calibration buffers. These buffers are commonly liquids with a controlled and known pH. To calibrate the pH testing device, the pH electrode is dipped into the pH calibration buffer and the displayed pH on the device being tested is adjusted to match the known pH of the calibration buffer.

Depending on the type of pH testing device being calibrated, it may either recognize the pH buffer automatically or perform the calibration adjustment automatically. Alternatively, most pH units can also be adjusted manually to ensure they are accurate in their measurements.
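The arithmetic behind a two-point buffer calibration can be sketched as below. Real meters perform this adjustment internally; the pH 7.00 and 4.00 buffers and the electrode readings are illustrative assumptions:

```python
# Two-point pH electrode calibration: derive a slope/offset correction
# from the electrode's readings in two buffers of known pH.

def ph_calibration(read_at_7, read_at_4):
    """Return a function mapping a raw electrode reading to corrected pH."""
    slope = (7.00 - 4.00) / (read_at_7 - read_at_4)
    offset = 7.00 - slope * read_at_7
    return lambda raw: slope * raw + offset

correct = ph_calibration(7.08, 4.12)  # electrode readings in each buffer
correct(7.08)  # maps back to approximately 7.00 after calibration
```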

Chemical Dosing 
Dosing involves the mixing and application ratios and amounts of chemicals applied. The critical factor in ensuring foods will not pose any safety risk to consumers in this context is adherence to science-based mixing and application rates for chemicals. These ratios are usually defined within the chemical manufacturer's instructions for use, in conjunction with best practice, regulatory and industry requirements.

Mixing ratios are the dilution rate for chemicals, which are generally mixed with water for application. These are formulated on scientific data to ensure that the chemical application is not only successful regarding its intention, but also to ensure that any risk for chemical contamination is controlled.

Application ratios define how much of the diluted chemicals are applied within a specific area, or onto a specific crop. These can be justified by the scheduled calibration of chemical application equipment, which may be documented on a Chemical Application Equipment Calibration Log.
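The mixing and application ratio arithmetic can be illustrated as follows. The 1:100 dilution and the per-square-metre rate are assumed example figures, not recommendations; always follow the manufacturer's instructions for use:

```python
# Illustrative dilution and application-rate calculations.

def chemical_required(dilution_ratio, solution_volume_l):
    """Litres of concentrate needed for a given volume of diluted solution.
    dilution_ratio is parts water per part chemical, e.g. 100 for 1:100."""
    return solution_volume_l / (dilution_ratio + 1)

def solution_for_area(application_rate_l_per_m2, area_m2):
    """Litres of diluted solution needed to cover an area."""
    return application_rate_l_per_m2 * area_m2

needed = solution_for_area(0.2, 50.0)         # solution for a 50 square metre area
concentrate = chemical_required(100, needed)  # concentrate for that volume
```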


