Star A has apparent magnitude 1 and star B has apparent magnitude 14. How many times brighter is star A than star B (in apparent brightness)?


1 Answer


Final answer:

Star A, with an apparent magnitude of 1, is about 158,000 times brighter in apparent brightness than star B, with an apparent magnitude of 14. The ratio is found by raising 100^(1/5) ≈ 2.512 to the power of the magnitude difference: 2.512^13 = 100^(13/5) ≈ 158,489.

Step-by-step explanation:

According to the magnitude system formalized in the nineteenth century (Pogson's scale), a difference of 5 magnitudes in apparent brightness corresponds to a brightness ratio of exactly 100:1. The difference between star A (apparent magnitude 1) and star B (apparent magnitude 14) is 13 magnitudes. Each single magnitude of difference therefore corresponds to a brightness ratio of 100^(1/5) ≈ 2.512 (often loosely rounded to 2.5). To find how many times brighter star A is than star B, we raise 2.512 to the thirteenth power, which is the same as computing 100^(13/5).
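Written as a worked equation, this is just the standard Pogson relation restated from the paragraph above:

```latex
m_B - m_A = 2.5 \log_{10}\!\left(\frac{b_A}{b_B}\right)
\quad\Longrightarrow\quad
\frac{b_A}{b_B} = 100^{(m_B - m_A)/5} = 100^{13/5} = 10^{5.2} \approx 158{,}489
```

Note that the exact factor 2.5 in Pogson's formula is the coefficient of the logarithm, not the per-magnitude brightness ratio, which is 100^(1/5) ≈ 2.512; conflating the two is what leads to wrong results like 2.5^13.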

The calculation is 100^(13/5) = 10^(5.2) ≈ 158,489. Therefore, star A is about 158,000 times brighter in apparent brightness than star B.
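As a quick check, here is a minimal Python sketch of the same arithmetic (the magnitude values come from the question; the variable names are just illustrative):

```python
# Brightness ratio from an apparent-magnitude difference (Pogson scale):
# a difference of 5 magnitudes corresponds to a ratio of exactly 100.
m_a = 1    # apparent magnitude of star A
m_b = 14   # apparent magnitude of star B

delta_m = m_b - m_a           # magnitude difference: 13
ratio = 100 ** (delta_m / 5)  # equivalently 10 ** (delta_m / 2.5)

print(f"Star A is about {ratio:,.0f} times brighter than star B")
# prints: Star A is about 158,489 times brighter than star B
```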
