calibration – inLiteTech (https://inlitetech.com)

WHY CALIBRATION REQUIRED?
https://inlitetech.com/why-calibration/ (Sun, 18 Jul 2021)

You know that calibration is required for a new instrument. We want the instrument to provide an accurate indication or output signal when it is installed. But there is still a chance that instrument error will occur.

Instrument error can occur due to a variety of factors: drift, environment, electrical supply, addition of components to the output loop, process changes, etc. Since a calibration is performed by comparing or applying a known signal to the instrument under test, errors are detected by performing a calibration.

An error is the algebraic difference between the indicated value and the actual value of the measured variable. Typical errors that occur are as given below.

  1. Span error
  2. Zero error
  3. Combined zero & span error
  4. Linearization error

Zero and span errors are corrected by performing a calibration. Most instruments provide a means of adjusting the zero and span of the instrument, along with instructions for performing this adjustment.

The zero adjustment is used to produce a parallel shift of the input-output curve. The span adjustment is used to change the slope of the input-output curve. Linearization error may be corrected if the instrument has a linearization adjustment. If the magnitude of the nonlinear error is unacceptable and it cannot be adjusted, the instrument must be replaced or repaired.
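The zero and span behavior described above can be sketched with a simple linear model. This is an illustrative sketch, not from any instrument library: the transmitter transfer function and error terms below are assumptions based on the 0-to-300 psig = 4-to-20 mA example used later in this feed.

```python
# Sketch of how zero and span errors act on an instrument's input-output curve.
# Assumes a linear 0-300 psig -> 4-20 mA transmitter; names are illustrative.

def instrument_output(x, zero_offset=0.0, span_gain=1.0):
    """Transmitter output in mA for input x in psig.

    zero_offset shifts the whole curve up or down (parallel shift);
    span_gain changes its slope.
    """
    ideal = 4.0 + (x / 300.0) * 16.0          # ideal transfer function
    return zero_offset + span_gain * ideal

# A pure zero error shifts every point by the same amount (parallel shift):
low  = instrument_output(0,   zero_offset=0.5) - instrument_output(0)
high = instrument_output(300, zero_offset=0.5) - instrument_output(300)
assert low == high == 0.5

# A span error grows with input (slope change):
err_low  = instrument_output(0,   span_gain=1.05) - instrument_output(0)
err_high = instrument_output(300, span_gain=1.05) - instrument_output(300)
assert err_high > err_low
```

A zero adjustment cancels the constant `zero_offset` term; a span adjustment corrects the `span_gain` multiplier.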

To detect and correct instrument error, periodic calibrations are performed. Even if a periodic calibration reveals the instrument is perfect and no adjustment is required, we would not have known that unless we performed the calibration. And even if adjustments are not required for several consecutive calibrations, we will still perform the calibration check at the next scheduled due date. Periodic calibrations to specified tolerances using approved procedures are an important element of any quality system.

Why should we calibrate?

  • Testing a new instrument
  • Testing an instrument after it has been repaired or modified
  • Periodic testing of instruments
  • Testing after a specified period of use or time interval has elapsed
  • Prior to and/or after a critical measurement
  • When observations are not accurate or instrument indicators do not match the output of a surrogate instrument
  • After events such as:
    • An instrument has had a shock, vibration, or exposure to adverse conditions, which can put it out of calibration or damage it.
    • Sudden weather changes

Risks Involved in Not Calibrating an Instrument

  • Safety hazards: For instruments that handle perishable products such as food, or thermometers used in safety-sensitive applications, uncalibrated instruments may create potential safety hazards.
  • Wastage: If the instrument is not perfectly calibrated, it might lead to potential wastage of resources and time consumed in the operations, resulting in an overall increase in expenses.
  • Faulty or Questionable Quality: If the instrument is improperly calibrated, the chances of faulty or questionable quality of finished goods arises. Calibration helps maintain the quality in production at different stages, which gets compromised if any discrepancy arises.
  • Fines or litigation: Customers who have incurred damage may return the product for a full refund, which is manageable; but if they pursue litigation over damages, the cost in reputation and restitution payments can be serious.
  • Increased downtime: Poor quality of finished goods is the first indicator of disrepair in your equipment. Regular calibration programs identify warning signs early, allowing you to take action before any further damage is caused.

CALIBRATION CHARACTERISTICS
https://inlitetech.com/calibration-characteristics/ (Fri, 16 Jul 2021)

Calibration Tolerance: All measurements should be made to specified tolerances. The words tolerance and accuracy are often misused. In ISA’s The Automation, Systems, and Instrumentation Dictionary, their definitions are as follows:

Accuracy:

The ratio of the error to the full-scale output, or the ratio of the error to the output, expressed in percent span or percent reading, respectively.

Tolerance:

Permissible deviation from a specified value; may be expressed in measurement units, percent of span, or percent of reading.

As you can see from the definitions, there is a subtle difference between the terms. It is recommended that tolerance, specified in measurement units, be used for the calibration requirements performed at your facility. By specifying an actual value, errors caused by calculating percentages of span or reading are eliminated, and the tolerance is stated directly in the units being measured.
For example, suppose you are assigned to calibrate the previously mentioned 0-to-300 psig transmitter to a calibration tolerance of ±2 psig. The equivalent output tolerance would be:

2 psig ÷ 300 psig × 16 mA = 0.1067 mA

The calculated tolerance is rounded down to 0.10 mA, because rounding up to 0.11 mA would exceed the calculated tolerance. It is recommended that both the ±2 psig and ±0.10 mA tolerances appear on the calibration data sheet when the milliamp output signal is recorded.
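The conversion above is a simple proportion, sketched below. The function name is illustrative, not from any library; the rounding step deliberately rounds down so the stated tolerance never exceeds the calculated one.

```python
# Convert an input-units calibration tolerance to the equivalent output
# tolerance, as in the ±2 psig on 0-300 psig -> 4-20 mA example above.
import math

def output_tolerance_ma(tol_input, input_span, output_span=16.0):
    """Output tolerance = input tolerance / input span * output span."""
    return tol_input / input_span * output_span

tol_ma = output_tolerance_ma(2.0, 300.0)   # 2 / 300 * 16
print(round(tol_ma, 4))                    # 0.1067

# Round *down* to two decimals so the stated tolerance never exceeds
# the calculated one (0.11 mA would exceed it):
stated = math.floor(tol_ma * 100) / 100
print(stated)                              # 0.1  (i.e., ±0.10 mA)
```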

Accuracy ratio:

This term has been used in the past to describe the relationship between the accuracy of the test standard and the accuracy of the instrument under test. It is still used by those who are not performing uncertainty calculations (uncertainty is described below). A good rule of thumb is to maintain a 4:1 accuracy ratio when performing calibrations. This means the standard used must be four times more accurate than the instrument being checked. Therefore, the test equipment (such as a field standard) used to calibrate the process instrument should be four times more accurate than the process instrument, the laboratory standard used to calibrate the field standard should be four times more accurate than the field standard, and so on.
With today’s technology, a 4:1 accuracy ratio is becoming more difficult to achieve. Why is the 4:1 ratio recommended? Maintaining a 4:1 ratio minimizes the effect of the accuracy of the standard on the overall calibration accuracy. If a higher-level standard is later found to be out of tolerance by a factor of two, for example, the calibrations performed using that standard are less likely to be significantly compromised.

Suppose we use our previous example of the test equipment with a tolerance of ±0.25°C and it is found to be 0.5°C out of tolerance during a scheduled calibration. Since we took into consideration an accuracy ratio of 4:1 and assigned a calibration tolerance of ±1°C to the process instrument, it is less likely that our calibration performed using that standard is compromised.

The out-of-tolerance standard still needs to be investigated by reverse traceability of all calibrations performed using the test standard. However, our assurance is high that the process instrument is within tolerance. If we had arbitrarily assigned a calibration tolerance of ±0.25°C to the process instrument, or used test equipment with a calibration tolerance of ±1°C, we would not have the assurance that our process instrument is within calibration tolerance. This leads us to traceability.
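The 4:1 rule of thumb from the preceding paragraphs can be checked with one division. This is a minimal sketch under the values used in the example above; the function name is an assumption, not a standard API.

```python
# Test accuracy ratio: instrument tolerance divided by standard tolerance.
# A ratio of at least 4:1 is the rule of thumb discussed above.

def accuracy_ratio(instrument_tol, standard_tol):
    """Return the test accuracy ratio between instrument and standard."""
    return instrument_tol / standard_tol

# Process instrument at ±1 °C checked against a standard at ±0.25 °C:
ratio = accuracy_ratio(1.0, 0.25)
print(ratio, ratio >= 4.0)          # 4.0 True

# The same standard found out of tolerance at ±0.5 °C halves the ratio,
# which is why the worked example still gives reasonable assurance:
print(accuracy_ratio(1.0, 0.5))     # 2.0
```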

Traceability:

All measurements should be traceable to a nationally or internationally recognized standard. For example, in the United States, the National Institute of Standards and Technology (NIST), formerly the National Bureau of Standards (NBS), maintains the national standards. Traceability is defined by ANSI/NCSL Z540-1-1994 (which replaced MIL-STD-45662A) as “the property of a result of a measurement whereby it can be related to appropriate standards, generally national or international standards, through an unbroken chain of comparisons.” Note that this does not mean a calibration shop needs its standards calibrated directly by NIST. It means that the measurements performed are traceable to NIST through all the levels of standards used, regardless of how many levels exist between the shop and NIST.

Uncertainty:

Uncertainty is a “parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand.” Uncertainty analysis is required for calibration labs conforming to ISO 17025 requirements. Uncertainty analysis is performed to evaluate and identify the factors associated with the calibration equipment and process instrument that affect the calibration accuracy.
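A common element of such an uncertainty analysis is combining independent standard uncertainty components in quadrature (root-sum-square). This is a hedged sketch of that general approach; the component values below are made-up examples, not from the article.

```python
# Combine independent standard uncertainty components by root-sum-square,
# the usual approach behind the uncertainty budgets ISO 17025 labs prepare.
import math

def combined_uncertainty(*components):
    """Root-sum-square of independent standard uncertainty components."""
    return math.sqrt(sum(u ** 2 for u in components))

# e.g. reference standard, resolution, and repeatability contributions (°C):
u_c = combined_uncertainty(0.05, 0.02, 0.03)
print(round(u_c, 4))        # 0.0616

# Expanded uncertainty at ~95 % confidence uses a coverage factor k = 2:
print(round(2 * u_c, 4))    # 0.1233
```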

What is calibration?
https://inlitetech.com/calibration/ (Fri, 16 Jul 2021)

According to ISA’s The Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as “a test during which known values of measurand are applied to the transducer and corresponding output readings are recorded under specified conditions.”

In practice, calibration is a comparison of measuring equipment against a standard instrument of higher accuracy to detect, correlate, adjust, rectify, and document the accuracy of the instrument being compared.

Typically, calibration of an instrument is checked at several points throughout the calibration range of the instrument. The calibration range is defined as “the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values.” The limits are defined by the zero and span values.

The zero value is the lower end of the range. Span is defined as the algebraic difference between the upper and lower range values. The calibration range may differ from the instrument range, which refers to the capability of the instrument. For example, an electronic pressure transmitter may have a nameplate instrument range of 0–750 pounds per square inch, gauge (psig) and an output of 4-to-20 milliamps (mA). However, the engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA. Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig and the zero output value is 4 mA. The input span is 300 psig and the output span is 16 mA. Different terms may be used at your facility. Just be careful not to confuse the range the instrument is capable of with the range for which the instrument has been calibrated.
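The mapping from calibration range to output signal in this example is a straight line. A minimal sketch, assuming the 0-to-300 psig = 4-to-20 mA calibration above; the function name is illustrative.

```python
# Linear mapping from the calibrated input range (0-300 psig) to the
# output signal (4-20 mA), as in the transmitter example above.

def psig_to_ma(pressure, lo=0.0, hi=300.0, out_lo=4.0, out_hi=20.0):
    """Expected output (mA) for a given input pressure (psig)."""
    return out_lo + (pressure - lo) / (hi - lo) * (out_hi - out_lo)

print(psig_to_ma(0))      # 4.0  (zero: lower range value)
print(psig_to_ma(300))    # 20.0 (upper range value)
print(psig_to_ma(150))    # 12.0 (mid-span)

# Input span = 300 psig; output span = 20 - 4 = 16 mA.
```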
