An infrared heater for a sauna has a surface area of 0.050 m² and an emissivity of 0.84. What temperature must it run at if the required power is 360 W? Neglect the temperature of the environment.

a) 425 K

b) 500 K

c) 600 K

d) 700 K

asked by User Zanzu (8.2k points)

1 Answer

Final answer:

To find the temperature at which an infrared heater must run, use the Stefan-Boltzmann law. Given the required power (360 W), surface area (0.050 m²), and emissivity (0.84), the temperature works out to roughly 620 K, so the closest answer choice is (c) 600 K.

Step-by-step explanation:

The temperature at which the infrared heater must run can be found from the Stefan-Boltzmann law, which relates the power radiated by a surface to its absolute temperature. The law is P = ε·σ·A·T⁴, where P is the radiated power, ε is the emissivity, σ is the Stefan-Boltzmann constant (5.67 × 10⁻⁸ W/(m²·K⁴)), A is the surface area, and T is the absolute temperature in kelvin.
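
For clarity, here is the same law rearranged for temperature (same symbols as above; this is only the algebraic step, no new physics):

```latex
P = \varepsilon \sigma A T^{4}
\quad\Longrightarrow\quad
T = \left( \frac{P}{\varepsilon \sigma A} \right)^{1/4}
```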

Plugging in the known values: 360 W = 0.84 · (5.67 × 10⁻⁸ W/(m²·K⁴)) · 0.050 m² · T⁴. Solving for T gives T = (360 / (0.84 × 5.67 × 10⁻⁸ × 0.050))^(1/4) ≈ 623 K, so the closest answer choice is (c) 600 K.
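
As a quick numerical check of the arithmetic above (a minimal sketch; the variable names are just illustrative):

```python
# Stefan-Boltzmann law solved for temperature: T = (P / (eps * sigma * A)) ** (1/4)
SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W/(m^2 K^4)

emissivity = 0.84     # given emissivity
area = 0.050          # surface area in m^2
power = 360.0         # required radiated power in W

temperature = (power / (emissivity * SIGMA * area)) ** 0.25
print(f"T = {temperature:.0f} K")   # about 623 K, closest to choice (c) 600 K
```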

answered by User Nikhil Jadhav (7.9k points)