An instrument may be calibrated by comparing its reading with that of a more accurate instrument when both are measuring the same quantity. The more accurate instrument is used as a standard for comparison purposes.
The figure below illustrates the method for calibrating a dc voltmeter.
An appropriate dc voltage is applied from the power supply to the standard instrument and the instrument to be calibrated, which are connected in parallel. The voltage is adjusted in steps over the desired range, and the readings of the two instruments are noted at each step.
A calibration chart should be prepared when calibrating an instrument. An example of such a chart is illustrated by the table below:
Table 1.0: Calibration Chart
| Scale reading (V) | Precise voltage (V) | Correction (V) |
|---|---|---|
| 100 | 104 | +4 |
| 90 | 91 | +1 |
| 81 | 83.5 | +2.5 |
| 70 | 72.5 | +2.5 |
| 60 | 62.1 | +2.1 |
| 50 | 51.8 | +1.8 |
| 40 | 42.3 | +2.3 |
| 30 | 31 | +1 |
| 20 | 19.7 | -0.3 |
| 10 | 9.8 | -0.2 |
| 0 | 0 | 0 |
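The correction column is simply the precise (standard) reading minus the scale reading of the instrument under test. A minimal Python sketch, using the data pairs from Table 1.0, shows how such a chart can be generated:

```python
# Build a calibration chart: correction = precise (standard) reading
# minus the scale reading of the instrument under test.
# The (scale, precise) pairs below are taken from Table 1.0.
readings = [
    (100, 104), (90, 91), (81, 83.5), (70, 72.5), (60, 62.1),
    (50, 51.8), (40, 42.3), (30, 31), (20, 19.7), (10, 9.8), (0, 0),
]

def correction(scale, precise):
    """Correction (in volts) to add to the scale reading to get the true value."""
    return round(precise - scale, 2)

for scale, precise in readings:
    print(f"{scale:>5} V | {precise:>6} V | {correction(scale, precise):+} V")
```

Each printed row reproduces one line of the chart; for example, a scale reading of 100 V against a precise reading of 104 V gives a correction of +4 V.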
When a digital instrument is calibrated and found to be inaccurate, a correction may be made by adjusting a variable resistor within its circuit. For both digital and analog instruments, preparing a calibration chart is useful for determining whether or not the instrument is within its specified accuracy.
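As a hedged sketch of that accuracy check, the snippet below assumes a hypothetical specification of ±3% of full-scale deflection on a 100 V range (neither figure comes from the article); each calibration point passes only if its error does not exceed that limit:

```python
# Check whether each calibrated point is within a specified accuracy.
# ASSUMPTIONS (not from the article): accuracy is specified as a
# percentage of full scale, here +/-3% on a 100 V full-scale range.
FULL_SCALE_V = 100.0   # assumed full-scale range in volts
SPEC_PERCENT = 3.0     # assumed accuracy specification in %

def within_spec(scale, precise, full_scale=FULL_SCALE_V, spec=SPEC_PERCENT):
    """True if the absolute error is no more than spec% of full scale."""
    error = abs(precise - scale)
    return error <= full_scale * spec / 100.0

print(within_spec(100, 104))   # 4 V error exceeds the 3 V limit -> False
print(within_spec(50, 51.8))   # 1.8 V error is within the limit -> True
```

With these assumed figures, the 100 V point of Table 1.0 (error 4 V) would fail, while every other point would pass; against a real specification the same comparison applies with the manufacturer's numbers.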