
DP Flowmeter Terminology

Status
Not open for further replies.

sheiko (Chemical), May 7, 2007
Hello,

Could you please help me clarify the differences between the following terms:

- range
- span / calibrated span
- full scale
- max DP


"We don't believe things because they are true, things are true because we believe them."
 
"Range" generally refers to the "calibrated range" of a device. For example a dP meter will have a range like 0-100 inHg. The instrument will probably give you meaningful data up to some much higher number, but that data is outside the calibrated range so the instrument has to call any number over 100 inHg "100 inHg".

"Span" and "calibrated span" are the same as "range" and "calibrated range".

"Full scale" doesn't really have a meaning, it is often used as a synonym for "range", but sometimes it refers to the maximum value that an instrument can exhibit (for example the dP meter calibrated 0-100 inHg above might have the ability to be calibrated 0-1000 inHg).

"Max dP" is a nameplate value that is the largest calibrated range the instrument can have. Most pressure instruments have an uncertainty based on calibrated range. So if I have a 0-1000 inHg (nameplate) instrument that is +/-0.5% and I calibrate it to 0-1000 inHg then every reading is +/-5.0 inHg. If I take that same exact instrument and calibrate it to 0-100 inHg then every reading is +/-0.5 inHg.

David
 
"Accuracy" is the term I have most heartburn over in all of this. Technically it is made up of both "uncertainty" and "repeatability", but how you combine them into "accuracy" is the subject of much murkiness.

I've never seen an instrument (especially not a turbine meter) whose uncertainty was presented as a percent of reading. That would mean that at the very bottom of the flow range (where the meters are least effective) the "accuracy" of a +/-1% uncertainty device would be reported as something like 0.1 gpm +/-0.001 gpm. At the top of the range (where the meters tend to be decent) the same meter might be 500 gpm +/-5 gpm.

I've never bought an instrument that was that hard to pin down. The turbine meters that I've purchased came with a calibration certificate that showed uncertainty as a percent of calibrated range, and repeatability through multiple flow tests. The absolute value of uncertainty did not change from near zero to near 100% of flow.
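The contrast between the two conventions is easy to show numerically. A minimal sketch (function names are illustrative), assuming a 500 gpm full scale and a +/-1% spec under each convention:

```python
# "% of reading" shrinks the absolute error band at low flow;
# "% of calibrated range" keeps it constant across the whole range.

def err_pct_of_reading(flow, pct):
    """Absolute error band when the spec is % of reading."""
    return flow * pct / 100.0

def err_pct_of_range(full_scale, pct):
    """Absolute error band when the spec is % of calibrated range."""
    return full_scale * pct / 100.0

for flow in (0.1, 100.0, 500.0):
    print(flow, err_pct_of_reading(flow, 1.0), err_pct_of_range(500.0, 1.0))
# e.g. at 0.1 gpm: +/-0.001 gpm (% of reading) vs +/-5 gpm (% of range)
```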

David
 
Try here
Turbine meters have a linear range within which the "accuracy" (uncertainty) is a % of reading. They also usually show a minimum repeatable flow rate. This is before we get into "smart" amplifiers, which provide linearisation: instead of applying a single meter factor, they plot meter factor against frequency.
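That linearisation amounts to interpolating the meter factor from a calibration table rather than applying one constant. A rough sketch of the idea, with an entirely made-up calibration table:

```python
# Calibration points: (pulse frequency in Hz, meter factor in pulses/gal).
# Values below are illustrative only, not from any real calibration.
CAL_TABLE = [(50.0, 101.2), (200.0, 100.4), (500.0, 100.0), (900.0, 100.1)]

def meter_factor(freq_hz):
    """Linearly interpolate the meter factor at a given pulse frequency."""
    pts = sorted(CAL_TABLE)
    if freq_hz <= pts[0][0]:
        return pts[0][1]
    if freq_hz >= pts[-1][0]:
        return pts[-1][1]
    for (f0, k0), (f1, k1) in zip(pts, pts[1:]):
        if f0 <= freq_hz <= f1:
            return k0 + (k1 - k0) * (freq_hz - f0) / (f1 - f0)

def flow_gpm(freq_hz):
    # frequency (pulses/s) / meter factor (pulses/gal) * 60 -> gal/min
    return freq_hz / meter_factor(freq_hz) * 60.0

print(meter_factor(350.0))  # interpolated between 100.4 and 100.0
```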
VA meters, on the other hand, are commonly quoted as % FS. So 1% FS means that if full scale is 100 l/min, the error throughout the range is 1 l/min. At 10 l/min you have a 10% error.
Higher-accuracy versions may quote both a % FS and a % of reading error to be used in combination.
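The VA-meter numbers above work out as follows. A small sketch (names are illustrative), including the combined % FS + % of reading form that the higher-grade specs use:

```python
# A 1% FS spec on a 100 l/min scale is a fixed +/-1 l/min band,
# so the relative error grows as flow drops.

def va_error(flow, full_scale, pct_fs, pct_reading=0.0):
    """Absolute error band for a spec quoted as % FS (+ optional % of reading)."""
    return full_scale * pct_fs / 100.0 + flow * pct_reading / 100.0

fs = 100.0  # l/min full scale
print(va_error(100.0, fs, 1.0) / 100.0 * 100)  # 1% relative error at full scale
print(va_error(10.0, fs, 1.0) / 10.0 * 100)    # 10% relative error at 10 l/min
```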

Then again, you have to beware of conventions. The convention for viscosity sensors is to quote % of calibrated range. Since most have only a single calibrated range, and a rudimentary calibration at that (because they usually operate as controllers with a single set-point viscosity), there is no need for better. So the convention, when applied to more sophisticated sensors, rather obscures the true performance, which is % of reading.
There is a big difference between the two.


JMW
 
Interesting, the graph says % of reading, but shows a constant uncertainty over an increasing flow rate. I think the smart boys at Cameron are outsmarting themselves. The meter uncertainty is only constant when it is based on range instead of reading.

David
 
This is a common representation of meter accuracy for turbine meters.

PD meters exhibit a similar characteristic curve. The "accuracy" or linearity declared is usually on the order of 0.25% of reading. Some get to 0.1% of reading. (The multi-piston meter is one of the most accurate of all PD meters, possibly of any meter.)

Incidentally, the Cameron Turbines were originally designed by Milt November and manufactured by ITT Barton. These are a pretty good meter.

Vortex meters also quote a linear range and an "accuracy" of similar magnitude. The early Eastech vortex meter was only claimed to be accurate to 1% of reading, but this was not simply a function of the linear range of the meter (the same meter factor over the meter range); it was also a function of manufacturing tolerance, since all vortex meters of the same size and duty had the same theoretical meter factor applied and were simply tested to see if they came within the specification.

JMW
 
I used some of the early vortex meters. Rosemount was flogging them then, positioning the meter as a gas meter, but they didn't have any way to correct for flowing pressure. So if you used them, you had to grab the output (which was in gpm) and convert it to ACF, then calculate a density and convert the flow to SCF. After all that fun and games, the meter was rarely within 3% of an AGA-3 meter.

I've been seeing them recently on produced water dump lines, and while they're pretty prone to sand cutting, if you can keep the sand out they're giving folks +/-2-3%, which in produced water is about an order of magnitude better than anyone else is doing.

David
 