A random star has a surface temperature of 7300 K and a radius 100 times that of the Sun. Calculate the luminosity of the star compared to the Sun. Estimate what its apparent magnitude would be if it were only ∼ 1 pc away, like the nearest stars. Compare this to other night-sky objects.


1 Answer


Answer:

L \approx 2.55\times10^{4}\,L_\odot, \qquad \frac{F'}{F} \approx 5.07\times10^{-12}

Step-by-step explanation:

Given data:

surface temperature T = 7300 K

radius R = 100\,R_\odot

We know that

L = \sigma A T^4

where

\sigma = 5.67\times10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}} is the Stefan–Boltzmann constant,

A = 4\pi R^2 is the area of the star's radiating surface, and

T is the surface temperature.
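
As a side check (a sketch, assuming R_\odot \approx 6.95\times10^{8}\ \mathrm{m}, a value the problem does not give), the star's absolute luminosity comes out to

L = 5.67\times10^{-8} \times 4\pi\,(6.95\times10^{10})^2 \times (7300)^4 \approx 9.8\times10^{30}\ \mathrm{W}

or about 2.5\times10^{4} times the Sun's 3.8\times10^{26}\ \mathrm{W}, matching the ratio computed below.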

Taking the ratio to the Sun, the constants cancel:

\frac{L}{L_\odot} = \left(\frac{R}{R_\odot}\right)^2 \left(\frac{T}{T_\odot}\right)^4

With T = 7300 K and T_\odot = 5778 K:

\frac{L}{L_\odot} = 100^2 \times \left(\frac{7300}{5778}\right)^4 \approx 2.55\times10^{4}

L \approx 2.55\times10^{4}\,L_\odot
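
For a quick numerical check of this ratio, here is a minimal Python sketch using only the values already quoted above:

```python
# Check L/L_sun = (R/R_sun)^2 * (T/T_sun)^4 with the values used above
T_star, T_sun = 7300.0, 5778.0   # surface temperatures in kelvin
radius_ratio = 100.0             # R_star / R_sun (given)

lum_ratio = radius_ratio**2 * (T_star / T_sun)**4
print(f"L/L_sun = {lum_ratio:.3e}")   # prints L/L_sun = 2.548e+04
```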

At a distance r = 1\ \mathrm{pc} = 3.086\times10^{16}\ \mathrm{m}, it is the received flux (not the luminosity) that decreases as we move away from the star, following the inverse-square law:

F' = \frac{L}{4\pi r^2} = \frac{L}{4\pi R^2}\left(\frac{R}{r}\right)^2 = F\left(\frac{R}{r}\right)^2

where F = L/(4\pi R^2) is the flux at the star's surface. With R = 100\,R_\odot = 6.95\times10^{10}\ \mathrm{m}:

\frac{F'}{F} = \left(\frac{6.95\times10^{10}}{3.086\times10^{16}}\right)^2 \approx 5.07\times10^{-12}
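
The question also asks for the apparent magnitude at r ≈ 1 pc. A rough estimate, assuming the Sun's absolute visual magnitude M_\odot \approx 4.83 (a standard value, not given in the problem):

M = M_\odot - 2.5\log_{10}\!\left(\frac{L}{L_\odot}\right) = 4.83 - 2.5\log_{10}(2.55\times10^{4}) \approx -6.2

m = M + 5\log_{10}\!\left(\frac{d}{10\ \mathrm{pc}}\right) = -6.2 + 5\log_{10}(0.1) \approx -11.2

Compared to other night-sky objects: at m ≈ -11.2 the star would outshine every star and planet (Sirius is about -1.5, Venus at its brightest about -4.6) and would fall only a little short of the full Moon (about -12.7), i.e. roughly as bright as a gibbous Moon.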
