If the mean temperature of the surface of the Earth increased from 287 K to 290 K, by what factor would the radiated energy from the Earth increase?

Hint: You will need your calculator. However, you do not need the actual surface area of the Earth or the value of the Stefan-Boltzmann constant. Calculate the ratio of the new, higher radiated power to the old, lower radiated power. For example, if the Earth radiated 50% more at the higher temperature than at the lower temperature, the answer would be 1.5 (1.5 times as much power radiated). If the temperature doubled, the answer would be 16, because radiated power depends on the absolute temperature raised to the 4th power, and 2⁴ = 16.

1 Answer


Final answer:

If the Earth's temperature increases from 287 K to 290 K, the radiated energy from the Earth increases by a factor of approximately 1.04 (about a 4.2% increase), as calculated using the Stefan-Boltzmann law.

Step-by-step explanation:

When the mean temperature of the surface of the Earth increases, the amount of energy it radiates also increases. According to the Stefan-Boltzmann law, the power radiated by a black body is proportional to the fourth power of its absolute temperature (T). If the Earth's temperature increases from 287 K to 290 K, we can find the factor by which the radiated energy will increase by taking each temperature to the fourth power and then finding the ratio of the two.

The original power radiated is proportional to (287 K)⁴ and the new power to (290 K)⁴. Using a calculator:

Original power ∝ (287)⁴ = 6,784,652,161

New power ∝ (290)⁴ = 7,072,810,000

The ratio of new power to old power is 7,072,810,000 / 6,784,652,161 ≈ 1.0425. Therefore, if the Earth's surface temperature rises from 287 K to 290 K, the amount of energy radiated increases by a factor of about 1.04, meaning roughly a 4.2% increase.
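As a quick check, the ratio can be computed directly. Since the surface area and the Stefan-Boltzmann constant cancel in the ratio, only the two temperatures are needed (a minimal sketch; the variable names are illustrative):

```python
# Stefan-Boltzmann law: radiated power P is proportional to T**4,
# so the ratio of powers depends only on the two absolute temperatures.
T_old = 287.0  # original mean surface temperature, in kelvin
T_new = 290.0  # increased mean surface temperature, in kelvin

ratio = (T_new / T_old) ** 4
print(f"Radiated power increases by a factor of {ratio:.4f}")  # ~1.0425
```

Working with the ratio avoids the very large intermediate values of T⁴ entirely.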
