By what percentage would an earth-orbiting satellite's average orbital radius need to be decreased in order to reduce its orbital period by a factor of one-half?


1 Answer


The period of a satellite that is orbiting the earth is given by,


T=2\pi\sqrt{\frac{r^3}{GM}}

Where G is the gravitational constant, M is the mass of the earth, and r is the radius of the satellite's orbit.
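
As a quick numeric illustration of this formula (a minimal sketch; the 7000 km orbital radius below is just an assumed example, and G and M are the standard textbook values), the period of a low orbit works out to roughly 97 minutes:

import math

G = 6.674e-11   # gravitational constant, N m^2 / kg^2
M = 5.972e24    # mass of the earth, kg
r = 7.0e6       # assumed example orbital radius, m (about 7000 km)

# Period of a circular orbit: T = 2*pi*sqrt(r^3 / (G*M))
T = 2 * math.pi * math.sqrt(r**3 / (G * M))
print(f"T = {T:.0f} s  (~{T / 60:.1f} minutes)")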

Let us assume that the period of the satellite is decreased to half its value.

Then the new period is given by,


\begin{gathered} T_n=\frac{T}{2} \\ =\frac{2\pi}{2}\sqrt{\frac{r^3}{GM}} \end{gathered}

On simplifying the above equation,


\begin{gathered} T_n=2\pi\sqrt{\frac{r^3}{4GM}} \\ =2\pi\sqrt{\frac{\left(\frac{r}{\sqrt[3]{4}}\right)^3}{GM}} \\ =2\pi\sqrt{\frac{r_n^3}{GM}} \end{gathered}
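
As a sanity check of this step (nothing beyond the ratio is needed, since the starting radius cancels out), scaling the radius by 1/∛4 does halve the period:

scale = 1 / 4 ** (1 / 3)    # r_n / r = 1 / cube root of 4 ≈ 0.63
# Kepler's third law: the period scales as r^(3/2)
period_ratio = scale ** 1.5
print(round(scale, 3))          # ≈ 0.63
print(round(period_ratio, 3))   # ≈ 0.5, i.e. the period is halved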

Where r_n is the decreased radius of the orbit of the satellite.

The value of r_n is,


\begin{gathered} r_n=\frac{r}{\sqrt[3]{4}} \\ =0.63r \end{gathered}

Thus, for the period of the satellite to be reduced to one-half of its initial value, its orbital radius must be reduced to 0.63 times its initial value.

To calculate the percentage decrease of the radius,


\begin{gathered} P=\frac{r-r_n}{r}\times100 \\ =\frac{r-0.63r}{r}\times100 \\ =(1-0.63)\times100 \\ =37\% \end{gathered}

Therefore, to reduce the orbital period of a satellite by a factor of one-half, its orbital radius should be decreased by approximately 37%.
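
A short check of the final arithmetic (only ratios are involved, so no physical constants are needed):

r_ratio = 0.5 ** (2 / 3)           # r_n / r = (1/2)^(2/3)
percent_decrease = (1 - r_ratio) * 100
print(f"{r_ratio:.3f}")            # ≈ 0.630
print(f"{percent_decrease:.1f}%")  # ≈ 37.0%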
