A heat pump has a coefficient of performance that is 50% of the theoretical maximum. It maintains a house at 20°C, which leaks energy of 0.6 kW per degree temperature difference to the ambient. For a maximum of 1.0 kW power input find the minimum outside temperature for which the heat pump is a sufficient heat source.

1 Answer


To solve this problem we need the concept of the coefficient of performance (COP) and the Carnot cycle.

According to the statement, to keep the house at 20 °C the heat pump must supply heat at the same rate at which it leaks out, 0.6 kW per degree of temperature difference. That is,


\dot{Q}_H = \dot{Q}_{leak} = 0.6\,(T_H - T_L) = \beta\,\dot{W}

where

\beta = coefficient of performance (COP)

\dot{W} = power input

\dot{Q} = rate of heat exchange

The coefficient of performance of a heat pump is defined as


\beta = \frac{\dot{Q}_H}{\dot{W}} = \frac{\dot{Q}_H}{\dot{Q}_H - \dot{Q}_L}

Since the pump operates at 50% of the theoretical (Carnot) maximum,


\beta = 0.5\,\beta_{Carnot} = 0.5\,\frac{T_H}{T_H - T_L}

Substituting into the first equation, with the maximum power input \dot{W} = 1.0 kW:


0.6\,(T_H - T_L) = 0.5\,\frac{T_H}{T_H - T_L}\,\dot{W}


(T_H - T_L)^2 = \frac{0.5}{0.6}\,T_H \cdot 1.0 = \frac{0.5}{0.6}\times 293.15 = 244.3


T_H - T_L = 15.63\ \text{K}


T_L = 20 - 15.63 = 4.4\,^{\circ}\text{C}

Therefore, the minimum outside temperature for which the heat pump is a sufficient heat source is approximately 4.4 °C.
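The arithmetic above can be checked with a short script (a sketch; the variable names are my own):

```python
import math

# Given values from the problem statement
T_H = 20.0 + 273.15   # house temperature, K
k = 0.6               # heat leak coefficient, kW per kelvin of temperature difference
W = 1.0               # maximum power input, kW

# Energy balance: k*(T_H - T_L) = 0.5 * (T_H / (T_H - T_L)) * W
# Rearranged:     (T_H - T_L)**2 = 0.5 * W * T_H / k
dT = math.sqrt(0.5 * W * T_H / k)   # required temperature difference, K
T_L_celsius = 20.0 - dT             # minimum outside temperature, °C

print(f"Temperature difference: {dT:.2f} K")
print(f"Minimum outside temperature: {T_L_celsius:.1f} °C")
```

Running this reproduces the temperature difference of about 15.63 K and the minimum outside temperature of about 4.4 °C found above.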

answered by Kris Rice (6.6k points)