3098 Technical Manual Performance Optimisation
Appendix A
Performance Optimisation
A1. Introduction
The 3098 Specific Gravity Meter uses a vibrating element density sensor that is located within a pressure regulating
system. The arrangement is such that the density output signal can be directly related to the specific gravity or relative
density of the gas.
Operation of the meter involves charging a reference chamber to a defined pressure and then calibrating the output signal
by using gas samples of known relative density. In order to reduce the effect of systematic errors associated with the
density sensor and the non-ideal behaviour of gases, a number of procedures must be carefully followed. The procedures
listed in this document should form the basis for more specific and clearly defined user procedures. Reference should
also be made to the calibration details given in Section 2 of the manual.
A1.1. The Density Sensor
The vibrating cylinder density sensor measures gas density with very high resolution and accuracy. Its
two major potential error sources are its temperature coefficient and a gas composition influence arising from the
velocity of sound in the gas.
The effect of the sensor temperature coefficient is directly related to the operating density and hence the operating
pressure. If the operating pressure is doubled, the effect is halved.
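This inverse relationship between operating pressure and the sensor temperature-coefficient effect can be sketched numerically. The scale factor `k` below is a hypothetical placeholder for illustration only, not a figure from this manual:

```python
def temp_coeff_error(pressure_bar, k=1.0):
    """Relative error due to the sensor temperature coefficient,
    assumed inversely proportional to operating pressure.
    k is an illustrative scale factor, not a manufacturer value."""
    return k / pressure_bar

# Doubling the operating pressure halves the effect:
e_low = temp_coeff_error(2.0)
e_high = temp_coeff_error(4.0)
print(e_high == e_low / 2)  # True
```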
The gas composition influence is substantially related to the relative density of the gas and not its operating condition. In
consequence, this effect is substantially eliminated by the calibration procedure. However, best results are achieved if the
calibration gases are of similar type to that of the sample gases.
A1.2. The non-ideal behaviour of gases
This behaviour affects the operation of the measurement system because the density measured at the operating
condition depends not only on the relative density of the gas but also on its compressibility factor. The consequences of
this characteristic are as follows:
a. If the operating temperature changes, so will the value of the compressibility factor, and this would be seen as an
instrument temperature coefficient. However, if the reference chamber contains a similar gas, the Z factor
(compressibility) changes are self-cancelling and hence no resultant effect materialises. For this reason, if a low
system temperature coefficient is required, it is important that the reference chamber gas is similar to the
measurement gas. Operating at low reference chamber pressure will also reduce this effect.
b. Any compressibility factor differences between the calibration gases and the sample gas will be seen as measurement
offsets. In consequence, it is important that the calibration gases closely represent the major constituents of the
sample gas, or that the calibration procedure makes allowance for any such offsets. Since compressibility factors are
related to operating pressure, it follows that this offset is minimised when operating at low reference chamber
pressures.
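The self-cancelling Z effect in item (a) can be seen in a minimal numeric sketch using the real-gas law. The gas properties and Z(T) values below are assumed, illustrative numbers, not figures from this manual:

```python
R = 8.314  # universal gas constant, J/(mol*K)

def density(p_pa, molar_mass_kg, z, t_k):
    """Real-gas density: rho = p * M / (Z * R * T)."""
    return p_pa * molar_mass_kg / (z * R * t_k)

def chamber_pressure(rho_ref, molar_mass_kg, z, t_k):
    """Pressure exerted by the sealed, constant-density reference
    charge: p = Z * rho * R * T / M. The regulating system applies
    this pressure to the sample gas."""
    return z * rho_ref * R * t_k / molar_mass_kg

M = 0.016      # kg/mol, methane-like gas (assumed)
RHO_REF = 1.5  # kg/m3, fixed by the reference chamber charge

for t_k, z in ((288.15, 0.9980), (308.15, 0.9984)):  # assumed Z(T)
    p = chamber_pressure(RHO_REF, M, z, t_k)
    rho_meas = density(p, M, z, t_k)
    # With a similar gas in chamber and sample, Z appears in both the
    # regulated pressure and the measured density, so it cancels and
    # the measured density tracks RHO_REF at both temperatures.
    print(f"T={t_k} K: measured density {rho_meas:.6f} kg/m3")
```

If the sample gas Z(T) differed from the chamber gas Z(T), the cancellation would be incomplete and the residual would appear as a temperature coefficient, as described above.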
A1.3. Selection of reference chamber pressure
The reference chamber pressure must always be above the vent pressure to ensure sample gas flow. If venting to
atmospheric pressure, this means that the reference chamber pressure should be above 1.2 bar absolute and below the
maximum of 7 bar absolute. The actual pressure should be selected to give minimum measurement errors due to
temperature changes and the calibration method. To summarise:
To minimise density sensor temperature coefficient - use high pressure
To minimise Z changes with temperature - use low pressure
To minimise Z effect on calibration - use low pressure
To minimise errors in readout electronics - use high pressure
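Since some error terms shrink with pressure and others grow with it, an intermediate optimum exists. As an illustration only (the weights `a` and `b` are hypothetical, not manual figures), a combined error of the form a/p + b*p has its analytic minimum at p = sqrt(a/b):

```python
import math

def total_error(p, a=1.0, b=0.05):
    """Illustrative combined error model: terms that fall with
    pressure (sensor temperature coefficient, readout electronics,
    ~a/p) plus terms that rise with it (Z-factor effects, ~b*p).
    a and b are hypothetical weights chosen for demonstration."""
    return a / p + b * p

# Scan the permitted range, 1.2 to 7 bar absolute, for the minimum.
candidates = [1.2 + 0.1 * i for i in range(59)]
best = min(candidates, key=total_error)
print(best, math.sqrt(1.0 / 0.05))  # scanned vs analytic optimum
```

In practice the real weights depend on the gas, the calibration method, and the installation, so the chosen reference chamber pressure should follow the guidance above rather than any single formula.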