Question:
What is the Temperature Coefficient specification and how is it applied?
Answer:
Temperature Coefficient is a DMM or meter specification that is added to the general accuracy specification when the measuring instrument is at an ambient temperature outside the normal operating temperature range.
Do not confuse this specification with a resistor's temperature coefficient of resistance (TCR).
The specification is commonly defined as % of reading + % of range per °C; it is sometimes defined simply as % per °C. Either way, for each degree outside the normal operating temperature range, an additional error is added to the normal accuracy specification. In general, the Temperature Coefficient is much smaller than the normal accuracy. The normal operating temperature range is included in your instrument's specifications.
Temperature Coefficient may also be defined relative to the last calibration temperature, typically as TCAL ± 5 °C. This means you apply the temperature coefficient for each degree outside the band extending 5 °C on either side of the calibration temperature.
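As a rough illustration of how these terms combine, here is a minimal Python sketch. The function names, parameter names, and the ppm conventions are our own for illustration, not from any instrument's documentation; consult your instrument's specification for the exact formula it intends.

def excess_degrees(ambient_c, band_low_c, band_high_c):
    # Degrees C by which the ambient temperature falls outside the
    # specified band (0 if the ambient is inside the band).
    if ambient_c > band_high_c:
        return ambient_c - band_high_c
    if ambient_c < band_low_c:
        return band_low_c - ambient_c
    return 0.0

def total_uncertainty_v(reading_v, range_v,
                        acc_ppm_reading, acc_ppm_range,
                        tc_ppm_reading, tc_ppm_range,
                        degrees_outside):
    # Base accuracy plus temperature-coefficient error, in volts.
    # Accuracy terms are in ppm; TC terms are in ppm per degree C.
    base = acc_ppm_reading * 1e-6 * reading_v + acc_ppm_range * 1e-6 * range_v
    tc = degrees_outside * (tc_ppm_reading * 1e-6 * reading_v
                            + tc_ppm_range * 1e-6 * range_v)
    return base + tc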
Example:
Temperature Coefficient = 0.0001%/°C.
Accuracy = 20 ppm of reading + 20 ppm of range.
The 90-day accuracy in measuring a 0.5 V signal on the 1 V range is 20 ppm × 0.5 V + 20 ppm × 1 V = 30 microvolts, i.e. 30 ppm of the 1 V range, or 0.003%.
For this instrument, say the normal operating temperature range is 18 °C to 28 °C. Now consider that the unit is at an ambient temperature of 33 °C. Since this is 5 °C outside the normal operating temperature range, the Temperature Coefficient applies on top of the normal specification.
The Temperature Coefficient is stated as 0.0001% for each °C:
0.0001% = 0.000001 = 1 ppm
5 × 1 ppm = 5 ppm (or 0.0005%)
5 ppm added to the normal specification of 30 ppm gives 35 ppm, or 0.0035%, total uncertainty. On the 1 V range this amounts to 35 microvolts of uncertainty for the 0.5 V signal being measured.
A reported reading in the range of 0.499965 V to 0.500035 V will be within tolerance.
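Using the hypothetical helper sketched above, and treating the plain %/°C figure as applying to the range (as the worked numbers above do), the first example is:

# Example 1: 20 ppm + 20 ppm base accuracy; TC of 1 ppm of range per degree C;
# ambient 33 C against an 18-28 C operating band, so 5 degrees outside.
u = total_uncertainty_v(reading_v=0.5, range_v=1.0,
                        acc_ppm_reading=20, acc_ppm_range=20,
                        tc_ppm_reading=0, tc_ppm_range=1,
                        degrees_outside=excess_degrees(33.0, 18.0, 28.0))
print(u)  # approximately 3.5e-05 V, i.e. 35 microvolts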
Another Example:
Temperature Coefficient = 0.0001% of reading + 0.0001% of range per °C.
Accuracy = 0.0020% of reading + 0.0006% of range.
The 90-day accuracy in measuring a 0.5 V signal on the 1 V range is 0.0020% × 0.5 V + 0.0006% × 1 V = 16 microvolts, i.e. 16 ppm of the 1 V range, or 0.0016%.
For this instrument, say the temperature coefficient is specified relative to TCAL ± 5 °C. The instrument was last calibrated at the factory, which the specifications state is at 23 °C. Say we are currently measuring at 35 °C. This is 12 °C away from TCAL, but per the spec we apply the temperature coefficient only to the 7 °C beyond the TCAL ± 5 °C band.
0.0001% of 0.5 V + 0.0001% of 1 V = 1.5 ppm = 1.5 microvolts per °C.
7 × 1.5 ppm = 10.5 ppm = 10.5 microvolts.
Adding this to the unaltered base accuracy of 16 microvolts gives 26.5 microvolts.
A reported reading in the range of 0.499974 V to 0.500026 V will be within tolerance.
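The second example, with its reading-plus-range temperature coefficient and the TCAL ± 5 °C band, works out the same way with the hypothetical helper:

# Example 2: 0.0020% of reading + 0.0006% of range base accuracy (20 ppm + 6 ppm);
# TC of 1 ppm of reading + 1 ppm of range per degree C;
# ambient 35 C against TCAL = 23 C, so 7 degrees outside TCAL +/- 5 C.
u = total_uncertainty_v(reading_v=0.5, range_v=1.0,
                        acc_ppm_reading=20, acc_ppm_range=6,
                        tc_ppm_reading=1, tc_ppm_range=1,
                        degrees_outside=excess_degrees(35.0, 23.0 - 5.0, 23.0 + 5.0))
print(u)  # approximately 2.65e-05 V, i.e. 26.5 microvolts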
This FAQ applies to:
Product series: Keithley 2000 Series 6.5-digit multimeter with scanning; Keithley DMM6500 6.5-digit graphical display multimeter; Keithley DMM7510 digital multimeter
FAQ number: 71941