Medical Autoclaves: Reducing Risk In Batch Sterilisation


Medical autoclaves and steam sterilisers rely on accurate, precise and reliable temperature measurements to comply with process requirements. New sensor technology automates recalibrations to reduce risk between intervals and provide audit-proof batch certification.



  • automated process verification
  • FDA 21 CFR Part 11 compliance
  • accurate process control
  • reduced uncertainty batch-to-batch
  • cost and time savings




Standard practice in medical autoclave operation is to monitor temperature at the coldest point of the process by placing a sensor in the drain or near the bottom of the steam sterilisation chamber.

As measuring instruments are inevitably subject to ageing-induced drift or mechanical damage, periodic recalibration is necessary to guarantee reliable process monitoring. Should an instrument fail, there is no reliable way to quickly pinpoint the timespan and batch(es) affected between calibration cycles, triggering a lengthy and costly troubleshooting process.

GMP rules do not prescribe specific calibration intervals; the frequency typically ranges from 6 to 12 months and is usually defined by company-specific standard operating procedures (SOPs). However, manual calibration involves process interruption, manual intervention, instrument removal and associated risks such as mechanical damage.

Our customer, a global healthcare company, sought a new approach for its operations at a sterile facility in Germany: replacing the manual task with automatic in-situ recalibration between each batch (i.e. each time goods are loaded/unloaded).

Keeping the instrument in place effectively eliminates the risk of mechanical damage from handling and saves valuable resources while increasing process compliance. Endress+Hauser provided a sample instrument that fulfils the application requirements:

  • traceable calibration documentation
  • FDA compliance
  • DIN EN 285 compliance
  • hygienic design
  • minimum insertion length
  • minimum response time


Our solution

A self-calibration method that uses the Curie temperature (Tc) of a reference material as a built-in fixed-point temperature reference.

This physical principle guarantees that the reference material is not subject to change (i.e. it is a fixed point). By design, the cell inside the sensor tip is also protected against chemical contamination. Because the Tc of the reference material is a physical constant, it can serve as a traceable calibration reference.

A trial phase successfully demonstrated that the self-calibrating instrument performed above expectations. During its one-month installation period and 600 hours of operation, the instrument performed approximately 80 successful in-situ self-calibrations, an average of nearly two batches and two calibrations per day.


Curie temperature

Once the reference material reaches the Tc, it undergoes a phase change associated with a change in its electrical properties (capacitance). The self-calibrating sensor’s electronics unit detects this change automatically and compares the temperature measured by a Pt100 sensor – a resistance temperature detector with a resistance (R) of 100 Ω at 0 °C – with the known Tc (Figure 1).
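The Pt100 comparison step can be sketched as follows. This is an illustrative calculation, not the vendor's firmware: it converts a Pt100 resistance reading to temperature via the standard IEC 60751 Callendar-Van Dusen equation (valid for 0 °C ≤ T ≤ 850 °C) and compares the result with the known Curie temperature. The 118 °C Tc is taken from the article; the resistance value is an assumed example.

```python
import math

R0 = 100.0          # Pt100 nominal resistance at 0 degC (ohm)
A = 3.9083e-3       # IEC 60751 coefficient (1/degC)
B = -5.775e-7       # IEC 60751 coefficient (1/degC^2)

def pt100_to_temp(r: float) -> float:
    """Invert R(T) = R0 * (1 + A*T + B*T^2) for T >= 0 degC."""
    return (-A + math.sqrt(A * A - 4 * B * (1 - r / R0))) / (2 * B)

TC = 118.0  # built-in fixed-point reference of the cell (degC)

# Resistance sampled at the instant the Curie transition is detected
# (illustrative value, close to 118 degC):
r_at_curie = 145.1
measured = pt100_to_temp(r_at_curie)
deviation = measured - TC
print(f"Pt100 reads {measured:.2f} degC at Tc, deviation {deviation:+.3f} K")
```

The deviation between the Pt100 reading and the fixed Tc at the moment of the transition is exactly the quantity the self-calibration records.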

Self-calibration is performed automatically when the process temperature drops below the nominal Tc of the device. A flashing green LED indicates that the self-calibration process is in progress. Once complete, the transmitter saves the calibration results in its built-in memory.
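The trigger-and-store behaviour described above can be sketched as a small state machine: a calibration event is recorded only when the temperature falls through the nominal Tc on cooling, and the result is appended to an on-board log. Class and field names here are illustrative, not the device's actual API.

```python
from datetime import datetime, timezone

T_CURIE = 118.0  # nominal Curie temperature of the reference cell (degC)

class CalibrationLog:
    """Minimal analogue of the transmitter's built-in calibration memory."""

    def __init__(self):
        self.events = []

    def feed(self, t_prev: float, t_now: float, deviation: float) -> bool:
        """Record a self-calibration on a downward crossing of Tc."""
        if t_prev >= T_CURIE > t_now:
            self.events.append({
                "time": datetime.now(timezone.utc).isoformat(),
                "deviation_K": deviation,
                "counter": len(self.events) + 1,
            })
            return True
        return False

log = CalibrationLog()
# Cooling phase after a sterilisation hold at 123 degC (illustrative samples):
samples = [123.0, 121.5, 119.0, 117.2, 110.0]
for prev, now in zip(samples, samples[1:]):
    log.feed(prev, now, deviation=0.05)
print(len(log.events))  # exactly one event: the 119.0 -> 117.2 crossing
```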

This in-line self-calibration makes it possible to continuously and repeatedly monitor changes to the properties of the Pt100 sensor and the electronics unit. Because the in-line calibration is performed under real ambient or process conditions (e.g. heating of the electronics unit), the result is more representative of actual operating behaviour than a sensor calibration performed under laboratory conditions.

The device’s data (process temperature, number of calibrations completed, calibration deviation factor) can be transferred directly to the process control system or to a suitable data manager capable of handling data in accordance with FDA integrity requirements.

A calibration certificate can be created automatically for each self-calibration and precisely assigned to every sterilisation batch. This provides traceable documentation – proof that the temperature sensor was functioning correctly at that particular time – as well as evidence of the sterility of the batch, since self-calibration only completes if the temperature at the sensor also reaches the required sterilisation temperature.

The automatic self-calibration function requires slow temperature changes in the process. As shown in Figure 1, the optimum cooling rate for sensor calibration is between –0.5 K/min and –16.5 K/min. In the steam steriliser, the calibration point of the self-calibrating sensor was 118 °C, very close to the sterilisation temperature of 123 °C (see Figure 2). The automatic calibration was therefore performed within the range of the desired sterilisation process parameters. The temperature rose through the calibration point of 118 °C during heating and crossed it again on its way down during the cooling phase after the sterilisation cycle, triggering the calibration process.
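A quick way to see whether a given cooling profile would trigger a valid calibration is to check the observed rate against the window cited above. This is a minimal sketch with illustrative numbers, not a vendor check.

```python
def cooling_rate_ok(t_start: float, t_end: float, minutes: float) -> bool:
    """True if the cooling rate falls in the optimal -0.5 to -16.5 K/min window."""
    rate = (t_end - t_start) / minutes  # K/min, negative while cooling
    return -16.5 <= rate <= -0.5

# Cooling from the 123 degC hold towards the 118 degC calibration point:
print(cooling_rate_ok(123.0, 115.0, 4.0))   # -2.0 K/min, inside the window
print(cooling_rate_ok(123.0, 122.9, 1.0))   # -0.1 K/min, too slow to qualify
```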



Typically in a steam steriliser, four to six temperature sensors are installed in different locations and for different purposes. For the study, the self-calibrating sensor was installed at the coldest point in the autoclave, next to an existing sensor to establish a second temperature reference (see Figure 3).

During qualification of a steriliser, temperature mapping is usually carried out to determine the worst-case sensor position. That position was determined to be on the chamber floor near the door.


Data analysis

Upon completion of the study, all data were collected and analysed. The probe was calibrated in an accredited calibration lab before and after the study. All 80 performed calibrations were successful, and the sensor accuracy was well within specified limits. The calibration results were more accurate than the class AA Pt100 tolerance, even accounting for the fact that a state-of-the-art digital temperature transmitter typically adds an uncertainty of ±0.1 K.
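For context, the class AA comparison can be made concrete. IEC 60751 defines the class AA tolerance as ±(0.1 + 0.0017·|t|) °C, which at the 118 °C calibration point works out to roughly ±0.30 K. The deviation values below are illustrative, not the study's raw data.

```python
def class_aa_tolerance(t_degc: float) -> float:
    """IEC 60751 class AA tolerance in K at temperature t (degC)."""
    return 0.1 + 0.0017 * abs(t_degc)

T_CAL = 118.0
deviations_K = [0.04, -0.02, 0.06, 0.01, -0.05]  # example self-calibration results

limit = class_aa_tolerance(T_CAL)
within = all(abs(d) <= limit for d in deviations_K)
print(f"class AA limit at {T_CAL} degC: +/-{limit:.3f} K, all within: {within}")
```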

Neither the laboratory calibrations before and after the test, nor the trendline of the automatic calibrations showed any significant sign of wear or drift. Overall, the study was successful and the sensor was found suitable for sterilisation processes.

Methods and risks

To assess the benefits of automatic in-process self-calibration definitively, it is worth looking at the method commonly used today. To check the accuracy of thermometers in hygienic applications, companies often use dry block calibrators for on-site calibration.

However, the manual method carries an often overlooked source of risk: opening the devices, removing the insert, connecting and disconnecting electrical contacts, inserting the thermometer into the calibrator, or transporting the thermometer to the laboratory all increase the likelihood of mechanical damage, such as from impact.

Furthermore, manual calibration procedures are always prone to a measurement uncertainty of approximately ±0.75 K, even when performed to industry standards by highly skilled professionals.

A direct comparison between the two approaches reveals the following: given its far lower measurement uncertainty, an in-process single-point calibration (±0.35 K) provides a more reliable statement of conformity than a manual check performed at three points using a dry block calibrator (±0.75 K), particularly for the critical temperature range around the sterilisation temperature. This is all the more true when calibration happens automatically with every sterilisation cycle rather than manually once a year.

If the critical temperature sensor is working as expected, this leads to more than 1,100 calibrations per year, not counting the manual standard calibration still completed periodically (e.g. once a year) as per SOPs.

Calibration automatically performed with every batch ensures that a damaged thermometer is promptly detected. If the sensor verifies its accuracy and the calibration counter has increased, this indicates that the sterilisation was successful. However, if the thermometer gives incorrect results, a warning message is generated by the transmitter, immediately alerting the user of a problem with the current batch. This batch can subsequently be discarded or repeated, assuming a second cycle is possible.
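The batch-release reasoning above reduces to a simple decision rule, sketched here with illustrative function and field names: a batch is released only if the calibration counter incremented during the cycle (proving the sensor reached, and fell back through, the calibration point) and the transmitter raised no deviation warning.

```python
def batch_release(counter_before: int, counter_after: int,
                  warning_active: bool) -> str:
    """Decide batch disposition from the self-calibration outcome."""
    if warning_active:
        # transmitter flagged an out-of-limit deviation for this cycle
        return "REJECT: sensor deviation out of limits, discard or repeat batch"
    if counter_after <= counter_before:
        # no self-calibration recorded: sterilisation temperature unproven
        return "HOLD: no self-calibration recorded, investigate cycle"
    return "RELEASE: calibration counter incremented, sensor verified"

print(batch_release(79, 80, warning_active=False))
```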

In contrast, with the calibration intervals used in conventional systems (e.g. once a year), a thermometer found faulty during a manual calibration cannot be linked to a single batch. Instead, all batches sterilised since the last calibration event have to be incorporated into the deviation investigation. This results in complex root-cause analyses and, at worst, product recalls, causing considerable expense and damage to the brand.


Valuable data

Self-calibrating thermometers, when connected to a modern process control system or secure data manager, can provide other data in addition to temperature measurement values. Using the HART protocol, it is also possible to collect ‘calibration counter’ and ‘last recorded calibration deviation’ event values. When these values are continuously queried, an alarm can be generated if the calibration deviation exceeds an established limit. The date and time of the calibration can be checked in a connected system because the deviation is marked at the moment when the calibration counter increases by 1.
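The polling pattern described above can be sketched as follows. A real installation would read the 'calibration counter' and 'last recorded calibration deviation' over HART (e.g. via a gateway or data manager); here the readings are a simulated list, and the ±0.35 K alarm limit is the in-process uncertainty figure cited earlier.

```python
def check_calibrations(readings, limit_K=0.35):
    """Return calibration counter values whose deviation exceeds the limit.

    `readings` is a sequence of (calibration_counter, last_deviation_K)
    tuples, as would be polled cyclically from the device. A deviation is
    evaluated only at the moment the counter increases, since that marks
    a freshly recorded calibration.
    """
    alarms = []
    last_counter = None
    for counter, deviation in readings:
        if last_counter is not None and counter > last_counter:
            if abs(deviation) > limit_K:
                alarms.append(counter)
        last_counter = counter
    return alarms

# Simulated polled values: two good calibrations, then one out of limits.
alarms = check_calibrations([(41, 0.05), (42, 0.07), (43, 0.41)])
print(alarms)  # [43]
```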

With this technology, it is possible to generate an online calibration certificate that can be viewed any time on site, in the network or even in a secure cloud.



The study conducted in a medical autoclave demonstrated the successful implementation of a self-calibrating thermometer in sterilisation processes. Overall process control improved, which should be a main goal for any pharmaceutical company.

Cost considerations have also been assessed. For a typical application, the return on investment should be reached after approximately 1.5 years, assuming all temperature sensors on one steriliser are replaced with self-calibrating sensors.

