229k views
21 votes
When scientists want to measure the brightness of a star while taking its distance from the observer into account, what are they going to measure?
- Absolute brightness
- Apparent brightness
- Absolute magnitude
- Apparent magnitude

by User Mhinton, 3.4k points

2 Answers

8 votes

Answer:

apparent brightness

Step-by-step explanation:

That's it, though I'm not sure.

by User Susana Mar Flores, 3.7k points
4 votes

Answer:

[Image: Wide-field view of the sky around Alpha Centauri, the closest star system to the Solar System, from the Digitized Sky Survey 2; the star looks large only because of light scattering in the telescope optics and the photographic emulsion. Image credit: ESO/Digitized Sky Survey 2]

A glance at the night sky above Earth shows that some stars appear much brighter than others. However, the brightness of a star depends on its composition and on how far it is from Earth.

Astronomers define star brightness in terms of apparent magnitude — how bright the star appears from Earth — and absolute magnitude — how bright the star appears at a standard distance of 32.6 light-years, or 10 parsecs. (A light-year is the distance light travels in one year — about 6 trillion miles, or 10 trillion kilometers.) Astronomers also measure luminosity — the amount of energy (light) that a star emits from its surface.
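To make the distance dependence concrete: absolute magnitude follows from apparent magnitude and distance via the standard distance-modulus relation, M = m - 5 * log10(d / 10 pc). Below is a minimal Python sketch of that conversion; the function and variable names are illustrative, not from the article.

```python
import math

def absolute_magnitude(apparent_mag, distance_parsecs):
    """Distance modulus: M = m - 5 * log10(d / 10 pc).
    A star placed at exactly 10 parsecs has M equal to its apparent magnitude."""
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)

# Sanity check: at the standard distance of 10 parsecs (32.6 light-years),
# apparent and absolute magnitude coincide.
print(absolute_magnitude(5.0, 10))  # -> 5.0
```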

Measuring star brightness is an ancient idea, but today astronomers use far more precise tools to make the measurement.

From Greek to modern times

More than 2,000 years ago, the Greek astronomer Hipparchus was the first to catalog stars according to their brightness, according to Dave Rothstein, who contributed to Cornell University's "Ask An Astronomer" website in 2003.

"Basically, he looked at the stars in the sky and classified them by how bright they appear — the brightest stars were 'magnitude 1,' the next brightest were 'magnitude 2,' etc., down to 'magnitude 6,' which were the faintest stars he could see," Rothstein wrote.

Human eyes, however, are not very discerning. Large differences in brightness actually appear much smaller on this scale, Rothstein said. Light-sensitive charge-coupled devices (CCDs) inside digital cameras measure the amount of light coming from stars and can provide a more precise measure of brightness.

Using this scale, astronomers now define a difference of five magnitudes as a brightness ratio of exactly 100. Vega was used as the reference star for the scale: initially it was assigned a magnitude of 0, but more precise instruments later shifted that value slightly, to about 0.03.
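Because five magnitudes are defined as a factor of 100 in brightness, the brightness ratio for any magnitude difference is 100^(difference/5), i.e. about 2.512 per magnitude. Here is a short illustrative Python sketch (the function name is hypothetical):

```python
def brightness_ratio(fainter_mag, brighter_mag):
    """Flux ratio implied by a magnitude difference: 5 magnitudes = factor 100,
    so each single magnitude is a factor of 100 ** (1 / 5), roughly 2.512."""
    return 100 ** ((fainter_mag - brighter_mag) / 5)

# Hipparchus's full range: a magnitude-1 star vs. a magnitude-6 star.
print(brightness_ratio(6, 1))  # -> 100.0
```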

[Image: Orion, the brightest and most beautiful of the winter constellations; some of its stars, including Betelgeuse and Rigel, are among the brightest in the sky. Image credit: Starry Night Software]

Apparent magnitude vs. absolute magnitude

When taking Earth as the reference point, however, the magnitude scale fails to account for the true differences in brightness between stars. The apparent brightness, or apparent magnitude, depends on the location of the observer: different observers will come up with different measurements, depending on their distance from the star. Stars that are closer to Earth, but fainter, can appear brighter than far more luminous stars that are farther away.

"It is the 'true' brightness — with the distance dependence factored out — that is of most interest to us as astronomers," stated an online course on astronomy from the University of Tennessee.

"Therefore, it is useful to establish a convention whereby we can compare two stars on the same footing, without variations in brightness due to differing distances complicating the issue."

Step-by-step explanation:

Hope it helps!

by User Stephen J Barker, 3.8k points