Air enters the compressor of a test facility at 80 °F and 14.5 psia. It is then compressed isentropically to 500 psia and transported to the test cell. At the cell inlet, some of the gas is drawn off through a tube for humidity measurements. The tube is maintained at 200 °F all the way into the test cell, where the sensor is located. Is that line temperature sufficient to ensure a successful measurement of the cell-entrance humidity?


1 Answer


Answer:

The temperature difference of about 457.98 K between the compressed-air temperature and the sampling-tube temperature is very large, which can degrade the accuracy of the humidity measurement through effects such as error magnification and sensitivity to rapid changes.

Step-by-step explanation:

For isentropic compression, we have;


\frac{p_1}{p_2} = \left[\frac{T_1}{T_2}\right]^{\frac{\gamma}{\gamma - 1}}

Where:

p₁ = Initial pressure = 14.5 psia

p₂ = Final pressure = 500 psia

T₁ = Initial temperature = 80 °F = 299.8167 K

T₂ = Final temperature (Required)

Tube temperature = 200 °F = 366.4833 K

γ = The ratio of the specific heats of the gas, cp/cv = 1.4 for air.
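
The °F-to-kelvin conversions above can be checked with a minimal Python sketch; the helper name fahrenheit_to_kelvin is just for illustration:

```python
def fahrenheit_to_kelvin(t_f):
    """Convert a temperature in degrees Fahrenheit to kelvin."""
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

T1 = fahrenheit_to_kelvin(80.0)       # initial air temperature, ~299.8167 K
T_tube = fahrenheit_to_kelvin(200.0)  # tube temperature, ~366.4833 K

print(f"T1 = {T1:.4f} K, T_tube = {T_tube:.4f} K")
```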

Plugging in the values we have;


\frac{14.5}{500} = \left[\frac{299.8167}{T_2}\right]^{\frac{1.4}{1.4 - 1}}


\frac{299.8167}{T_2} = \left(\frac{14.5}{500}\right)^{\frac{1.4 - 1}{1.4}} \approx 0.364

T₂ = 824.46 K ≈ 824 K

Therefore, the temperature difference between the compressed-air temperature and the tube temperature, 824.46 K − 366.48 K ≈ 457.98 K, is very large. This can degrade the accuracy of the humidity measurement through effects such as error magnification and sensitivity to rapid changes, so the 200 °F line temperature by itself does not ensure a successful measurement of the cell-entrance humidity.
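
For reference, the whole calculation can be reproduced with a short Python sketch, assuming air behaves as an ideal gas with constant γ = 1.4:

```python
# Isentropic relation: T2 = T1 * (p2 / p1) ** ((gamma - 1) / gamma)

gamma = 1.4                   # ratio of specific heats for air
p1, p2 = 14.5, 500.0          # psia; only the pressure ratio matters

T1 = (80.0 - 32.0) * 5.0 / 9.0 + 273.15       # 80 °F  -> ~299.82 K
T_tube = (200.0 - 32.0) * 5.0 / 9.0 + 273.15  # 200 °F -> ~366.48 K

T2 = T1 * (p2 / p1) ** ((gamma - 1.0) / gamma)  # ~824.46 K
delta_T = T2 - T_tube                           # ~457.98 K

print(f"T2      = {T2:.2f} K")
print(f"delta_T = {delta_T:.2f} K")
```

Running this reproduces the ≈458 K difference quoted above.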
