Additional reading from www.astronomynotes.com
The flux (or apparent brightness) of a light source is given in units like those listed on the previous page (joules per second per square meter). In this set of units, or in any equivalent set, the more light we receive from an object, the larger its measured flux. However, astronomers still use a system of measuring stellar brightness called the magnitude system, which was introduced by the ancient Greek astronomer Hipparchus. Hipparchus grouped the brightest stars and called them first magnitude, labeled slightly fainter stars second magnitude, and listed the faintest stars the eye could see as sixth magnitude. Notice that the magnitude system is therefore backwards: the brighter a star is, the smaller its magnitude.
Our eyes can detect about a factor of 100 difference in brightness among stars, so a 1st magnitude star is about 100 times brighter than a 6th magnitude star. We have preserved this relationship in the modern magnitude scale, so for every 5 magnitudes of difference in the brightness of two objects, the objects differ by a factor of 100 in apparent brightness (flux). If object A is 10 magnitudes fainter than object B, it is (100 x 100) or 10,000 times fainter. If object A is 15 magnitudes fainter than object B, it is (100 x 100 x 100) or 1,000,000 times fainter.
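The pattern generalizes: a difference of Δm magnitudes corresponds to a flux ratio of 100 raised to the power Δm/5 (roughly a factor of 2.512 per magnitude). Here is a minimal sketch of that arithmetic; the function name is just illustrative and not from the text:

```python
def flux_ratio(delta_mag):
    """Flux (apparent brightness) ratio for a magnitude difference delta_mag.

    A difference of 5 magnitudes is defined as exactly a factor of 100 in flux,
    so a difference of delta_mag magnitudes is a factor of 100 ** (delta_mag / 5).
    """
    return 100 ** (delta_mag / 5)

print(flux_ratio(5))   # 100        (1st magnitude vs. 6th magnitude)
print(flux_ratio(10))  # 10,000     (100 x 100)
print(flux_ratio(15))  # 1,000,000  (100 x 100 x 100)
```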
Remember that an object’s apparent brightness depends on its distance from us, so the magnitude of a star also depends on its distance: the closer a star is to us, the brighter it appears and the smaller its apparent magnitude. That is, the apparent magnitude of a star is its magnitude as measured from Earth. To compare stars fairly, astronomers also use the system of absolute magnitudes, which classifies stars by how they would appear if they were all at the same distance. If we know the distance to a star, we can calculate what its apparent magnitude would be at a distance of 10 pc; we call that value the absolute magnitude of the star. In this system (see the short sketch after this list):
- If a star is precisely 10 pc away from us, its apparent magnitude will be the same as its absolute magnitude.
- If the star is closer to us than 10 pc, it will appear brighter than if it were at 10 pc, so its apparent magnitude will be smaller than its absolute magnitude.
- If the star is more distant than 10 pc, it will appear fainter than if it were at 10 pc, so its apparent magnitude will be larger than its absolute magnitude.
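These three cases all follow from the standard relation between apparent magnitude m, absolute magnitude M, and distance d in parsecs: m - M = 5 log10(d / 10). A minimal sketch of that conversion, assuming only this textbook relation (the function name is illustrative):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in parsecs.

    Rearranges the distance modulus  m - M = 5 * log10(d / 10 pc).
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10)

print(absolute_magnitude(2.0, 10))   #  2.0: at exactly 10 pc, m equals M
print(absolute_magnitude(2.0, 5))    # ~3.5: closer than 10 pc, so m is smaller than M
print(absolute_magnitude(2.0, 100))  # -3.0: farther than 10 pc, so m is larger than M
```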
The apparent magnitude of a star corresponds to its flux, or apparent brightness. The absolute magnitude of a star corresponds to its luminosity, since it measures the star's brightness at a fixed distance, which you can then convert into the amount of energy the star emits at its surface.
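To make the link to luminosity concrete: differences in absolute magnitude translate into luminosity ratios in exactly the same way that differences in apparent magnitude translate into flux ratios. As an illustration only, using the Sun's absolute visual magnitude of roughly +4.8 as an assumed reference value:

```python
def luminosity_solar_units(abs_mag, sun_abs_mag=4.8):
    """Luminosity relative to the Sun, from absolute magnitudes.

    Every 5 magnitudes of difference in absolute magnitude is a factor of 100
    in luminosity. The reference value ~4.8 is the Sun's absolute visual
    magnitude (an assumed, approximate constant).
    """
    return 100 ** ((sun_abs_mag - abs_mag) / 5)

print(luminosity_solar_units(4.8))   # ~1: same absolute magnitude as the Sun
print(luminosity_solar_units(-0.2))  # ~100: 5 magnitudes brighter than the Sun
```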
Because the magnitude system is backwards (a brighter object has a smaller magnitude), it can be confusing. For this reason, we will not use magnitudes in this course, and I would even recommend not using them in your own courses. Instead, I will continue to use the apparent brightness or flux of an object to mean the measurement we make of its brightness on Earth, and the luminosity of an object to mean the intrinsic amount of energy it emits. However, you should be aware of the magnitude system, because you are likely to see it used in most astronomy publications you read during this course.
Want to learn more?
If you have a strong desire to learn the magnitude system for your own benefit, I recommend the discussions at the following locations:
- Cornell's "Curious About Astronomy" site
- Windows to the Universe