Pressure gauges – accuracy explained

What does pressure gauge accuracy mean?

Pressure gauge accuracy is the difference (error) between the true value and the reading indicated on the pressure gauge, expressed as a percentage of the full-scale span.

For a mechanical pressure gauge, accuracy is defined as a percentage of the full-scale range. While requirements differ from one industry to another, the following are general guidelines:

  • Test gauges and standards: 0.10% to 0.25% full-scale accuracy
  • Critical processes: 0.5% full-scale accuracy
  • General industrial processes: 1.0% full-scale accuracy
  • Less critical commercial uses: 1.6% or 2.0% full-scale accuracy

Refer to BS EN, ASME B40.100 or the DIN specifications for more information on accuracy requirements.

The accuracy class of a gauge is shown on the dial as Acc, Class, Cl or Kl (e.g. Cl 1.0 for 1% of the full-scale value). For example, a gauge with a 0-100 bar scale and an accuracy class of 1.0 is accurate to within ±1 bar across its full range. Other common accuracy classes are shown below:

Accuracy class    Error margin (percentage of full-scale span)
0.1               ±0.1%
0.25              ±0.25%
0.6               ±0.6%
1                 ±1%
1.6               ±1.6%
2.5               ±2.5%
4                 ±4%
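
The error band implied by a class figure is straightforward to compute: multiply the full-scale span by the class percentage. The short Python sketch below illustrates this; the function name error_band and its arguments are illustrative only and not taken from any standard or library.

def error_band(accuracy_class: float, full_scale_span: float) -> float:
    """Return the permissible error (+/-) in the gauge's pressure units.

    accuracy_class  -- class figure marked on the dial, e.g. 1.0 for Cl 1.0
    full_scale_span -- difference between the maximum and minimum scale values
    """
    return full_scale_span * accuracy_class / 100.0

# Example: a 0-100 bar gauge of class 1.0 may read up to +/-1 bar from the true pressure.
print(error_band(accuracy_class=1.0, full_scale_span=100.0))  # -> 1.0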

The resolution of the gauge must be adequate to display its accuracy. If the dial graduations cannot resolve the advertised accuracy, the user should treat the usable accuracy as limited to the resolution shown on the dial.
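
To make the resolution caveat concrete, here is a minimal sketch assuming the rule of thumb that the usable error band can be no finer than the smallest dial graduation; the helper name usable_error is an assumption for this example.

def usable_error(accuracy_class: float, full_scale_span: float,
                 smallest_graduation: float) -> float:
    """Return the error band a reader can actually rely on.

    If the dial graduations are coarser than the accuracy band, the readable
    error is limited by the graduation interval instead.
    """
    accuracy_band = full_scale_span * accuracy_class / 100.0
    return max(accuracy_band, smallest_graduation)

# A 0-100 bar, class 0.25 gauge (+/-0.25 bar) graduated every 1 bar cannot
# display 0.25 bar steps, so treat readings as good to +/-1 bar.
print(usable_error(0.25, 100.0, 1.0))  # -> 1.0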

It is important to match the specifications of a gauge to its intended application.  Installing a gauge with inadequate accuracy can lead to flawed measurement data, whilst installing a gauge with excessively high accuracy increases the cost to purchase, calibrate and maintain that gauge.
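
One way to apply this matching in practice is sketched below: given the measurement tolerance an application requires and the gauge's full-scale span, choose the coarsest (and typically cheapest) standard class whose error band still meets the tolerance. The class list mirrors the table above; the function name and selection rule are assumptions for illustration.

STANDARD_CLASSES = [4.0, 2.5, 1.6, 1.0, 0.6, 0.25, 0.1]  # coarsest to finest

def select_class(required_tolerance: float, full_scale_span: float) -> float:
    """Return the coarsest class whose +/- error band fits within the tolerance."""
    for accuracy_class in STANDARD_CLASSES:
        if full_scale_span * accuracy_class / 100.0 <= required_tolerance:
            return accuracy_class
    raise ValueError("No standard accuracy class meets the required tolerance")

# Example: a 0-160 bar gauge where readings must be within +/-2 bar.
# Class 1.6 gives +/-2.56 bar (too coarse); class 1.0 gives +/-1.6 bar, so it is selected.
print(select_class(required_tolerance=2.0, full_scale_span=160.0))  # -> 1.0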

For more information, please contact sales@brannan.co.uk
